LDAP Hardening. After a cluster is installed, LDAP is hardened as follows: in the LDAP configuration file, the password of the administrator account is encrypted using SHA.
The parameter must be modified for both the ConfigNode and IoTDBServer roles. iotdb_server_kerberos_qop controls encrypted data transmission for each IoTDBServer instance in the cluster.
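A minimal sketch of the change, assuming the parameter accepts standard SASL QOP values and is set identically for both roles; the value auth-conf shown here is an assumption, not confirmed by the source:

# Hypothetical value; verify the allowed values for your cluster version.
# Set identically for BOTH the ConfigNode and IoTDBServer roles.
iotdb_server_kerberos_qop = auth-conf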
Security features that require user configuration, such as authentication and SSL encrypted transmission, may affect performance. As a big data computing and analysis platform, Flink does not detect sensitive information.
The value must be an encrypted password.
SSL encrypted transmission is supported between the components of a Flink cluster, for example, between the Flink client and JobManager, between the JobManager and TaskManager, and between TaskManagers.
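As one possible illustration, internal SSL between Flink components can be enabled in flink-conf.yaml. This is a minimal sketch using standard Apache Flink options; the keystore paths and passwords are placeholders:

# Encrypt internal connectivity (JobManager/TaskManager RPC, blob service, data plane)
security.ssl.internal.enabled: true
security.ssl.internal.keystore: /path/to/internal.keystore
security.ssl.internal.keystore-password: <keystore_password>
security.ssl.internal.key-password: <key_password>
security.ssl.internal.truststore: /path/to/internal.truststore
security.ssl.internal.truststore-password: <truststore_password>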
For more information about disk encryption, see Managing Encrypted EVS Disks. Keys used by encrypted system disks are provided by Key Management Service (KMS) in Data Encryption Workshop (DEW). You do not need to build and maintain the key management infrastructure.
The password is encrypted and saved in the configuration file. When the prompt "Please input sftp/ftp server password:" is displayed, enter the password. Check the configuration result: if the following information is displayed, the configuration is successful.
The default value is privacy, indicating encrypted transmission. The value authentication indicates that transmission is not encrypted. For clusters with Kerberos authentication enabled (security mode), mutual trust between clusters needs to be configured.
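If the parameter in question is the Hadoop QOP setting hadoop.rpc.protection (an assumption; the snippet does not name the parameter), the corresponding core-site.xml entry would look like this:

<property>
  <name>hadoop.rpc.protection</name>
  <!-- privacy = encrypted transmission; authentication = no encryption -->
  <value>privacy</value>
</property>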
The OBS path does not support files or programs encrypted by KMS. The path must end with .sql; the extension is case-insensitive.
Setting spark.io.encryption.enabled=false disables encryption of shuffle data written to disk, thereby improving shuffle efficiency. Setting spark.shuffle.service.enabled=true starts the shuffle service and enhances task shuffle stability.
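Expressed in spark-defaults.conf, the two settings from this snippet look as follows; whether trading shuffle encryption for performance is acceptable depends on your security requirements:

# Do not encrypt shuffle data spilled to local disk (faster, less secure)
spark.io.encryption.enabled false
# Run the external shuffle service to improve shuffle stability
spark.shuffle.service.enabled true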
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' WITH SERDEPROPERTIES ('column.encode.indices'='2,3', 'column.encode.classname'='org.apache.hadoop.hive.serde2.SMS4Rewriter') STORED AS TEXTFILE; The statement creates the table encode_test and specifies that columns 2 and 3 will be encrypted with the SMS4Rewriter algorithm class.
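For context, a complete statement of this shape could look like the following; the table name encode_test and the SERDEPROPERTIES come from the snippet above, while the column list (id, name, phone, address) is an illustrative assumption:

CREATE TABLE encode_test (id INT, name STRING, phone STRING, address STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
WITH SERDEPROPERTIES (
  'column.encode.indices'='2,3',
  'column.encode.classname'='org.apache.hadoop.hive.serde2.SMS4Rewriter')
STORED AS TEXTFILE;
-- Columns at indices 2 and 3 (here: phone, address) are encrypted on write.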
For details about how to use the encryption mode, see Configuring an Encrypted Flume Data Collection Task. The configuration applies to scenarios where only Flume is configured, for example, Spooldir Source+Memory Channel+Kafka Sink.
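A minimal sketch of such a Flume agent (Spooldir Source + Memory Channel + Kafka Sink), assuming a generic Apache Flume properties file; the agent name, paths, topic, and broker address are placeholders, and the encryption-specific options from the referenced task are not shown:

client.sources = s1
client.channels = c1
client.sinks = k1

client.sources.s1.type = spooldir
client.sources.s1.spoolDir = /var/log/flume-spool
client.sources.s1.channels = c1

client.channels.c1.type = memory
client.channels.c1.capacity = 10000

client.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
client.sinks.k1.kafka.topic = topic1
client.sinks.k1.kafka.bootstrap.servers = kafka-broker:9092
client.sinks.k1.channel = c1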
Files or programs encrypted by KMS cannot be imported. An empty folder cannot be imported. Directory and file names can contain letters, digits, hyphens (-), and underscores (_), but cannot contain the following special characters: ;|&>,<'$*?
The columns to be encrypted and the encryption algorithm can be specified when a Hive table is created. When data is inserted into the table using the INSERT statement, the related columns are encrypted.
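A brief usage sketch against the hypothetical encode_test table above; the inserted values are illustrative:

INSERT INTO TABLE encode_test VALUES (1, 'alice', '13012345678', 'city-a');
-- On insert, the values for the configured columns (indices 2 and 3) are stored encrypted.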
The Hive database import and export function does not support importing or exporting encrypted tables, HBase external tables, Hudi tables, views, or materialized views.