The value must be an encrypted password.
SSL encrypted transmission is supported between the components of a Flink cluster, for example, between the Flink client and the JobManager, between the JobManager and TaskManagers, and between TaskManagers.
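For reference, internal SSL can be enabled in open-source Flink through flink-conf.yaml; the keys below are the open-source settings, and the keystore/truststore paths and passwords are placeholder assumptions (an MRS cluster may manage these values through its own configuration interface).

# Internal connections (JobManager <-> TaskManager, TaskManager <-> TaskManager)
security.ssl.internal.enabled: true
security.ssl.internal.keystore: /opt/flink/conf/flink.keystore        # placeholder path
security.ssl.internal.keystore-password: <keystore-password>
security.ssl.internal.key-password: <key-password>
security.ssl.internal.truststore: /opt/flink/conf/flink.truststore    # placeholder path
security.ssl.internal.truststore-password: <truststore-password>
# Client-facing (REST) endpoint, that is, Flink client <-> JobManager
security.ssl.rest.enabled: true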
Hardening LDAP. After a cluster is installed, LDAP is hardened as follows: in the LDAP configuration file, the password of the administrator account is encrypted using SHA.
At the prompt "Please input sftp/ftp server password:", enter the password; it is then encrypted and saved in the configuration file. Check the configuration result: if the following information is displayed, the configuration is successful.
For more information about disk encryption, see Managing Encrypted EVS Disks. Keys used by encrypted system disks are provided by Key Management Service (KMS) in Data Encryption Workshop (DEW). You do not need to build and maintain the key management infrastructure.
The default value is privacy, indicating encrypted transmission. The value authentication indicates that transmission is not encrypted. For clusters with Kerberos authentication enabled (security mode), mutual trust between clusters needs to be configured.
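The values privacy and authentication are SASL protection levels used by Hadoop RPC. Assuming the parameter described here is hadoop.rpc.protection (the snippet does not name it, so this is an assumption), it would be set in core-site.xml as follows:

<!-- SASL protection level for Hadoop RPC:
     privacy = authentication + integrity + encryption; authentication = authentication only, no encryption. -->
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>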
The OBS path does not support files or programs encrypted by KMS. The path must end with .sql; the extension is case-insensitive.
Setting spark.io.encryption.enabled=false disables writing encrypted data to local disks during shuffle, thereby improving shuffle efficiency. Setting spark.shuffle.service.enabled=true starts the shuffle service and enhances the stability of task shuffles.
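A minimal sketch of how these two options could be set in spark-defaults.conf (or passed with --conf on spark-submit); whether disabling shuffle encryption is acceptable depends on the security requirements of the workload.

# Do not encrypt shuffle data written to local disks (faster shuffle; shuffle files are stored unencrypted).
spark.io.encryption.enabled false
# Use the external shuffle service so shuffle files remain available if an executor is lost.
spark.shuffle.service.enabled true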
org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' WITH SERDEPROPERTIES ('column.encode.indices'='2,3', 'column.encode.classname'='org.apache.hadoop.hive.serde2.SMS4Rewriter') STORED AS TEXTFILE; This statement creates the table encode_test and specifies that the columns at indices 2 and 3 are encrypted with the configured SMS4Rewriter class.
The columns to be encrypted and the encryption algorithm can be specified when a Hive table is created. When data is inserted into the table using the INSERT statement, the related columns are encrypted.
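For reference, a complete form of the statement quoted above might look like the following sketch. The table name, SerDe, and SERDEPROPERTIES are taken from the fragment above; the column names and types are placeholder assumptions.

-- Sketch: the columns at indices 2 and 3 (phone and address in this example) are stored encrypted.
CREATE TABLE encode_test (id INT, name STRING, phone STRING, address STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
WITH SERDEPROPERTIES (
  'column.encode.indices'='2,3',
  'column.encode.classname'='org.apache.hadoop.hive.serde2.SMS4Rewriter')
STORED AS TEXTFILE;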
For details about how to use the encryption mode, see Configuring an Encrypted Flume Data Collection Task. The configuration applies to scenarios where only Flume is configured, for example, Spooldir Source + Memory Channel + Kafka Sink.
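For orientation, a plain (non-encrypted) Spooldir Source + Memory Channel + Kafka Sink pipeline can be described in Flume's properties format as sketched below; the agent name, spool directory, broker address, and topic are placeholder assumptions, and the encrypted variant adds the SSL parameters described in the referenced task.

# Flume agent sketch: Spooldir Source + Memory Channel + Kafka Sink (placeholder names and addresses).
a1.sources = r1
a1.channels = c1
a1.sinks = k1
# Read files dropped into a spool directory.
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /var/log/flume-spool
a1.sources.r1.channels = c1
# Buffer events in memory.
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000
a1.channels.c1.transactionCapacity = 1000
# Write events to a Kafka topic.
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.bootstrap.servers = kafka-broker:9092
a1.sinks.k1.kafka.topic = flume_test
a1.sinks.k1.channel = c1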
For details, see Encrypted Transmission. API description (RegisterServer API): RegisterServerHandler stores information such as the IP address, port number, and concurrency of NettySink for the connection with NettySource.
The Hive database import and export function does not support importing or exporting encrypted tables, HBase external tables, Hudi tables, views, or materialized views.
For details about how to use the encryption mode, see Configuring an Encrypted Flume Data Collection Task. Prerequisites: the Flume client has been installed; the cluster and the Flume service have been installed; the network environment of the cluster is secure.
For details about how to use the encryption mode, see Configuring an Encrypted Flume Data Collection Task. The configuration applies to scenarios where only Flume is configured, for example, Spooldir Source + Memory Channel + HDFS Sink.
For details about how to use the encryption mode, see Configuring an Encrypted Flume Data Collection Task. The configuration applies to scenarios where only Flume is configured, for example, Taildir Source + Memory Channel + HDFS Sink.