Typical Scenario: Exporting Data from HDFS or OBS to a Relational Database Scenario This section describes how to use Loader to export data from HDFS or OBS to a relational database. Prerequisites You have obtained the service username and password for creating a Loader job.
ALM-50229 Doris FE Failed to Connect to OBS Alarm Description The system checks whether the connection between the Doris FE nodes and OBS is available every 30 seconds. This alarm is generated when the connection status code is not 0.
Possible Causes Connection to the OBS server fails. The specified OBS file system does not exist. The user AK/SK information is invalid. The local OBS configuration cannot be obtained. Procedure Log in to the OBS server and check whether the OBS server can be properly accessed.
Description of Hive Table Location (Either an OBS or HDFS Path) Question Can Hive tables be stored in either OBS or HDFS? Answer Yes. The location of a common Hive table stored on OBS can also be set to an HDFS path.
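As a minimal sketch of the answer above, assuming a reachable HiveServer2 instance and a hypothetical bucket name (none of these values come from the source), a table location on OBS can be specified at creation time:

beeline -u "jdbc:hive2://hiveserver-host:10000/" -e "
CREATE TABLE demo_obs_table (id INT, name STRING)
STORED AS TEXTFILE
LOCATION 'obs://mybucket/warehouse/demo_obs_table';"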
Using Loader to Export Data from HDFS or OBS to a Relational Database Scenario This section describes how to use Loader to export data from HDFS or OBS to a relational database. Prerequisites You have obtained the service username and password for creating a Loader job.
Using Loader to Export Data from HDFS or OBS to an SFTP Server Scenario This section describes how to use Loader to export data from HDFS or OBS to an SFTP server. Prerequisites You have obtained the service username and password for creating a Loader job.
How Do I Read Encrypted OBS Data When Running an MRS Job? In MRS 1.9.x, encrypted data in OBS file systems can be used to run jobs, and the encrypted job running results can be stored in OBS file systems. Currently, the data can be accessed only through the OBS protocol.
How Do I Access OBS Using an MRS Client Installed Outside a Cluster?
…OBS file system, the read and write performance may be affected.
How Do I Connect an MRS Cluster Client to OBS Using an AK/SK Pair? In MRS 1.9.2 or later, you can connect MRS clusters to OBS using obs://. Currently, the supported components are Hadoop, Hive, Spark, Presto, and Flink; HBase cannot use obs:// to interconnect with OBS.
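As an illustrative sketch, the Hadoop OBS connector reads the AK/SK pair and endpoint from the fs.obs.access.key, fs.obs.secret.key, and fs.obs.endpoint properties, which can be passed on the command line; the bucket, endpoint, and credential values below are placeholders, not values from the source:

# List an OBS bucket from an MRS client using an AK/SK pair (placeholder values).
hadoop fs \
  -Dfs.obs.access.key=YOUR_AK \
  -Dfs.obs.secret.key=YOUR_SK \
  -Dfs.obs.endpoint=obs.example-region.myhuaweicloud.com \
  -ls obs://mybucket/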
How Do I Migrate Data from OBS/S3 to ClickHouse? Question How do I migrate data from OBS/S3 to ClickHouse?
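One possible approach, sketched here on the assumption that ClickHouse's built-in s3() table function is used against the S3-compatible interface of OBS; the endpoint, bucket, credentials, source file, and target table below are all placeholders:

# Pull a CSV object from OBS/S3 into an existing ClickHouse table (placeholder values).
clickhouse-client --query "
  INSERT INTO default.target_table
  SELECT * FROM s3(
    'https://mybucket.obs.example-region.myhuaweicloud.com/data/part-00000.csv',
    'YOUR_AK', 'YOUR_SK', 'CSVWithNames')"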
What Should I Do If Data Fails to Be Synchronized to a Hive Table on OBS Using hive-table? Question What should I do if data fails to be synchronized to a Hive table on OBS using hive-table? Answer Change the -hive-table option to -hcatalog-table.
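A hedged sketch of the corrected Sqoop call follows; apart from the -hcatalog-table option named in the answer, every value (connection URL, credentials, database and table names) is a placeholder:

sqoop import \
  --connect jdbc:mysql://db-host:3306/sourcedb \
  --username dbuser \
  --password dbpass \
  --table source_table \
  --hcatalog-database default \
  --hcatalog-table target_table
# --hcatalog-table is used here in place of --hive-table, per the answer above.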
Why Do I Fail to Create a Table in the Specified Location on OBS After Logging In to spark-beeline? Question In an ECS/BMS image cluster interconnected with OBS, after logging in to spark-beeline, an error is reported when a table is created with a specified location on OBS.
Answer: When a user submits a job that needs to read and write OBS, the job submission program adds a temporary access key (AK) and secret key (SK) for accessing OBS by default. The temporary AK and SK have an expiration time.
What Should I Do If the Error Message "requestId=XXX" Is Displayed When a Spark Job Accesses OBS? Symptom Error message "requestId=4971883851071737250" is displayed when a Spark job accesses OBS.
Failed to Use Sqoop to Read MySQL Data and Write Parquet Files to OBS Issue An error is reported when Sqoop reads MySQL data and writes the data to OBS in Parquet format. However, the data can be successfully written to OBS if the Parquet format is not specified.
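For reference, a sketch of the failing pattern described above: Sqoop importing MySQL data to an OBS path in Parquet format. All values below are placeholders, not values from the source:

sqoop import \
  --connect jdbc:mysql://db-host:3306/sourcedb \
  --username dbuser \
  --password dbpass \
  --table source_table \
  --target-dir obs://mybucket/output/source_table \
  --as-parquetfile
# Per the symptom above, omitting --as-parquetfile lets the same job write to OBS successfully.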
You have permissions to access the HDFS or OBS directories and data involved in job execution. You have permissions to access the HBase tables or Phoenix tables used during job execution.
Why Does an OBS Quickly Deleted Directory Not Take Effect After Being Added to the Customized Hive Configuration? The error message "java.lang.OutOfMemoryError: Java heap space." is displayed during Hive SQL execution.