To ensure data transmission security, this channel is encrypted using SSL. The nettyconnector.ssl.enabled parameter controls whether SSL encryption is enabled; encryption takes effect only when nettyconnector.ssl.enabled is set to true.
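As a minimal sketch, assuming this parameter is set in the Flink client's flink-conf.yaml (key: value style), enabling the encryption would look like this:
# Enable SSL encryption for the Netty connector channel
nettyconnector.ssl.enabled: true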
The Hive table import and export function does not support importing or exporting encrypted tables, HBase external tables, Hudi tables, views, or materialized views.
In the keytab login mode, this parameter does not need to be set.
authentication.password
Encrypted password of the user for accessing the Loader service if keytab file authentication is not used in security mode.
For details, see Non-Encrypted Transmission or Encrypted Transmission. Set the parameters in the properties.properties file. The following uses SpoolDir Source+File Channel+Kafka Sink as an example. Run the following command on the node where the Flume client is installed.
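The command itself is not reproduced in this excerpt. As a hedged illustration of the SpoolDir Source + File Channel + Kafka Sink settings that properties.properties would carry, a client-side configuration could be structured as follows; the agent name "client", directory paths, topic name, and broker address are placeholder assumptions, not values from this document:
client.sources = spool_src
client.channels = file_ch
client.sinks = kafka_sink

# SpoolDir source: reads files dropped into a local directory
client.sources.spool_src.type = spooldir
client.sources.spool_src.spoolDir = /var/log/flume-input
client.sources.spool_src.channels = file_ch

# File channel: buffers events on local disk
client.channels.file_ch.type = file
client.channels.file_ch.checkpointDir = /srv/flume/checkpoint
client.channels.file_ch.dataDirs = /srv/flume/data

# Kafka sink: writes events to a Kafka topic
client.sinks.kafka_sink.type = org.apache.flume.sink.kafka.KafkaSink
client.sinks.kafka_sink.kafka.topic = flume_topic
client.sinks.kafka_sink.kafka.bootstrap.servers = broker-1:9092
client.sinks.kafka_sink.channel = file_ch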
If the Syslog protocol is not encrypted, data may be stolen.
Prerequisites
The ECS corresponding to the server must be in the same VPC as the Master node of the MRS cluster, and the Master node must be able to access the IP address and specified port of the server.
After being encrypted, the password is saved in password.property.
Please input key password:
Please Confirm password:
The password.property file generated on the node you have logged in to is available only for the current cluster and cannot be used for other clusters.
/encrypt_tool Unencrypted password
The obtained encrypted password is used as the value of authentication.password.
NOTE: If a non-encrypted password contains special characters, the special characters must be escaped.
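As a hedged illustration of the escaping note (the tool path is abbreviated exactly as above, and the sample password is made up), a password containing shell metacharacters can be single-quoted or backslash-escaped on a bash command line:
/encrypt_tool 'Admin@123$'      # single quotes keep bash from expanding "$"
/encrypt_tool Admin@123\$       # or escape each special character individually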
ClickHouse cannot connect to encrypted HDFS directories.
StormSubmitter.submitTopology(args[0], conf, builder.createTopology());
}
The target file path of Storm cannot be in an SM4-encrypted HDFS partition.
Running the Application and Viewing Results
Export the local JAR package. For details, see Packaging IntelliJ IDEA Code.
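For context around the submitTopology call in the fragment above, the following is a minimal, self-contained sketch (not code from this document) of building and submitting a topology with the standard org.apache.storm API; the spout, bolt, and component names are illustrative placeholders.
import java.util.Map;
import org.apache.storm.Config;
import org.apache.storm.StormSubmitter;
import org.apache.storm.spout.SpoutOutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.TopologyBuilder;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.topology.base.BaseRichSpout;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;
import org.apache.storm.utils.Utils;

public class DemoTopology {
    // Emits one fixed sentence per second; stands in for a real data source.
    public static class SentenceSpout extends BaseRichSpout {
        private SpoutOutputCollector collector;
        @Override
        public void open(Map conf, TopologyContext context, SpoutOutputCollector collector) {
            this.collector = collector;
        }
        @Override
        public void nextTuple() {
            Utils.sleep(1000);
            collector.emit(new Values("hello storm"));
        }
        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("sentence"));
        }
    }

    // Prints every tuple it receives; stands in for real processing logic.
    public static class PrintBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            System.out.println(tuple.getStringByField("sentence"));
        }
        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            // No downstream output.
        }
    }

    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("spout", new SentenceSpout(), 1);
        builder.setBolt("print", new PrintBolt(), 1).shuffleGrouping("spout");

        Config conf = new Config();
        conf.setNumWorkers(1);

        // args[0] carries the topology name, as in the fragment above.
        StormSubmitter.submitTopology(args[0], conf, builder.createTopology());
    }
}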
When data is remotely backed up to HDFS, HDFS encrypted directories are not supported. To back up data to OBS, ensure that the current cluster has been connected to OBS and that you have permission to access OBS.
The password is encrypted in the background. Log in to each MetaStore background node and check whether the local directory /opt/Bigdata/tmp exists. If it exists, go to 4.
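For example, the existence check can be done with a standard shell command on each node (the path is the one given above):
ls -ld /opt/Bigdata/tmp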
For details about how to use the encryption mode, see Configuring an Encrypted Flume Data Collection Task. The configuration applies to scenarios where only the server is configured, for example, Spooldir Source+File Channel+HBase Sink.
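As a hedged sketch of such a server-only layout (the agent name "server", directory paths, table name, and column family below are placeholder assumptions, not values from this document), the Spooldir Source + File Channel + HBase Sink part of the server configuration could look like this:
server.sources = spool_src
server.channels = file_ch
server.sinks = hbase_sink

# Spooldir source reading a local directory
server.sources.spool_src.type = spooldir
server.sources.spool_src.spoolDir = /var/log/flume-input
server.sources.spool_src.channels = file_ch

# File channel for on-disk buffering
server.channels.file_ch.type = file
server.channels.file_ch.checkpointDir = /srv/flume/checkpoint
server.channels.file_ch.dataDirs = /srv/flume/data

# HBase sink writing into one table and column family
server.sinks.hbase_sink.type = hbase
server.sinks.hbase_sink.table = flume_events
server.sinks.hbase_sink.columnFamily = cf
server.sinks.hbase_sink.channel = file_ch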
Flink supports authentication and encrypted transmission. To use these functions, you need to install the Flink client and configure security authentication.
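For illustration only, a hedged sketch of the Kerberos-related entries in the client's flink-conf.yaml might look as follows; the keytab path and principal are placeholders, not values from this document:
# Kerberos login used by the Flink client
security.kerberos.login.keytab: /opt/client/user.keytab
security.kerberos.login.principal: developuser
security.kerberos.login.use-ticket-cache: false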
Encrypted directories cannot be backed up or restored.
Prerequisites
To back up data to a remote HDFS, the following conditions must be met:
A standby cluster for backing up data has been created.
The authentication mode must be the same as that of the active cluster.