The Hive table import and export function does not support encrypted tables, HBase external tables, Hudi tables, views, or materialized views.
For details, see Non-Encrypted Transmission or Encrypted Transmission. Set the parameters in the properties.properties file. The following uses SpoolDir Source+File Channel+Kafka Sink as an example. Run the following command on the node where the Flume client is installed.
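As one illustration of such a properties.properties, here is a minimal sketch of a SpoolDir Source + File Channel + Kafka Sink flow; the agent name "client", all directory paths, the topic name, and the broker address are assumptions for illustration, not values from this document:

```properties
# Agent component names (the agent name "client" is an assumption).
client.sources = spool_src
client.channels = file_ch
client.sinks = kafka_sink

# SpoolDir Source: reads files dropped into a spool directory.
client.sources.spool_src.type = spooldir
client.sources.spool_src.spoolDir = /var/log/flume-spool
client.sources.spool_src.channels = file_ch

# File Channel: buffers events on local disk.
client.channels.file_ch.type = file
client.channels.file_ch.checkpointDir = /srv/flume/checkpoint
client.channels.file_ch.dataDirs = /srv/flume/data

# Kafka Sink: writes events to a Kafka topic.
client.sinks.kafka_sink.type = org.apache.flume.sink.kafka.KafkaSink
client.sinks.kafka_sink.kafka.topic = flume_events
client.sinks.kafka_sink.kafka.bootstrap.servers = broker-1:9092
client.sinks.kafka_sink.channel = file_ch
```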
authentication.password: Encrypted password of the user for accessing the Loader service when keytab file authentication is not used in security mode. In keytab login mode, this parameter does not need to be set.
/encrypt_tool <unencrypted password>
Use the obtained encrypted password as the value of authentication.password.
NOTE: If the unencrypted password contains special characters, the special characters must be escaped.
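Because the unencrypted password is passed as a command-line argument, shell-special characters in it must be quoted or escaped. A minimal sketch (the password value is invented; single quotes keep the shell from interpreting $, !, and #):

```shell
# Invented example password containing shell-special characters ($, !, #).
# Single quotes pass the value through literally, with no expansion.
password='Pa$$w0rd!#'
printf '%s\n' "$password"

# It would be supplied to the tool the same way, for example:
# /encrypt_tool 'Pa$$w0rd!#'
```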
ClickHouse cannot connect to encrypted HDFS directories.
StormSubmitter.submitTopology(args[0], conf, builder.createTopology());
}
The target file path of Storm cannot be in an SM4 encrypted HDFS partition.
Running the Application and Viewing Results
Export the local JAR package. For details, see Packaging IntelliJ IDEA Code.
The password is encrypted in the background. Log in to each MetaStore background node and check whether the local directory /opt/Bigdata/tmp exists. If it exists, go to 4.
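The existence check can be sketched as a small helper (the function name check_dir is ours; the path comes from this document):

```shell
# Print "exists" if the given local directory is present, "missing" otherwise.
check_dir() {
  if [ -d "$1" ]; then
    echo "exists"
  else
    echo "missing"
  fi
}

# On a MetaStore background node, check the directory named in the document:
check_dir /opt/Bigdata/tmp
```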
When data is remotely backed up to HDFS, HDFS encrypted directories are not supported. To back up data to OBS, ensure that the current cluster has been connected to OBS and that you have permission to access OBS.
Flink supports authentication and encrypted transmission. To use these functions, you need to install the Flink client and configure security authentication.
For details about how to use the encryption mode, see Configuring an Encrypted Flume Data Collection Task. The configuration applies to scenarios where only the server is configured, for example, Spooldir Source+File Channel+HBase Sink.
Encrypted directories cannot be backed up or restored.
Prerequisites
To back up data to a remote HDFS, the following conditions must be met:
A standby cluster for backing up data has been created, and its authentication mode is the same as that of the active cluster.
else echo "security.ssl.encrypt.enabled is true. Enter the encrypted values of security.ssl.key-password, security.ssl.keystore-password, and security.ssl.truststore-password in flink-conf.yaml."
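The else branch above implies a preceding check of security.ssl.encrypt.enabled in flink-conf.yaml; a hedged sketch of such a check (the function wrapper and messages are ours, the key names come from this document):

```shell
# Print the appropriate reminder depending on whether SSL encryption is
# enabled in the given flink-conf.yaml.
check_ssl_encrypt() {
  if grep -q '^security\.ssl\.encrypt\.enabled: *true' "$1"; then
    echo "enter encrypted values for security.ssl.key-password, security.ssl.keystore-password, and security.ssl.truststore-password"
  else
    echo "ssl encryption is not enabled"
  fi
}
```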
NOTE: Encrypted directories cannot be backed up or restored.
Hive: table-level user data.
IoTDB: IoTDB service data (RemoteHDFS).
ClickHouse: table-level user data (RemoteHDFS).
Doris: Doris service data. This function is available for MRS 3.3.1 and later.