You have been granted permission to access the HDFS or OBS directories and data involved in job execution. You have been granted permission to access the HBase tables or Phoenix tables that are used during job execution.
You have been granted permission to access the HDFS or OBS directories and data involved in job execution. You have obtained the username and password of the relational database.
Using Loader to Import Data from an SFTP Server to HDFS or OBS. Scenario: Use Loader to import data from an SFTP server to HDFS or OBS. Prerequisites: You have obtained the service username and password for creating a Loader job.
Log in to the OBS console.
Relationships Between MemArtsCC and Other Components OBS OBS provides a new InputStream: OBSMemArtsCCInputStream. This InputStream reads data from the MemArtsCC cluster deployed on the compute side to reduce OBS server pressure and improve data read performance.
You can access, manage, and use OBS data on the MRS console and OBS client. You can also import OBS data to the HDFS system of a cluster for processing.
Add the OBS endpoint configuration item fs.obs.endpoint and set Value to the endpoint of OBS. For details, see Endpoints. For MRS 3.x or later, a file path on OBS can start with obs://.
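As an illustration, the endpoint can also be supplied as a client-side configuration item in core-site.xml; the region endpoint below is a placeholder, not a value from this document:

```xml
<!-- Hypothetical endpoint value; replace with the endpoint of your OBS region -->
<property>
  <name>fs.obs.endpoint</name>
  <value>obs.example-region.myhuaweicloud.com</value>
</property>
```

With the endpoint configured, an OBS path such as obs://&lt;bucket_name&gt;/&lt;dir&gt;/ can then be used wherever an HDFS path is accepted.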
Symptom When Spark SQL is used to access Hive partitioned tables stored in OBS, the access speed is slow and a large number of OBS query APIs are called.
Question: Can Hive tables be stored in OBS or HDFS? Answer: Yes. The location of a Hive table can be set to either an OBS path or an HDFS path. Within the same Hive service, you can create some tables stored in OBS and others stored in HDFS.
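As a sketch of this, two tables in the same Hive service can point at different storage systems via their LOCATION clause; the table names, columns, and paths below are hypothetical placeholders:

```sql
-- Hypothetical table stored in OBS
CREATE TABLE sales_obs (id INT, amount DOUBLE)
LOCATION 'obs://my-bucket/warehouse/sales_obs';

-- Hypothetical table stored in HDFS within the same Hive service
CREATE TABLE sales_hdfs (id INT, amount DOUBLE)
LOCATION 'hdfs://hacluster/user/hive/warehouse/sales_hdfs';
```

The storage system is chosen per table, so mixed OBS/HDFS layouts are possible within one warehouse.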
HDFS Client Failed to Delete Overlong Directories Symptom When a user runs the hadoop fs -rm -r -f obs://<obs_path> command to delete an OBS directory with an overlong path name, the following error message is displayed: 2022-02-28 17:12:45,605 INFO internal.RestStorageService: OkHttp
Based on the credential, OBS determines whether the current user has access permission. Figure 2: Relationships between Guardian and other components. Parent topic: Components
Currently, only MRS 3.3.0-LTS or later supports interconnection with OBS through Guardian. Create an agency with OBS access permissions so that Guardian can connect to OBS. Then enable the connection between Guardian and OBS and set the required parameters.
Run the following command to mount a directory of an OBS bucket to the /obs directory of Alluxio: alluxio fs mount /obs obs://<OBS_BUCKET>/<OBS_DIRECTORY>/ Parent topic: Using Alluxio
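For example, the mount can be verified by listing the mount point afterwards; the bucket and directory names below are placeholders, and the commands assume a node where the Alluxio client is installed:

```shell
# Hypothetical bucket/directory; mount the OBS path into the Alluxio namespace
alluxio fs mount /obs obs://my-bucket/data/
# List the mount point to confirm the OBS objects are visible through Alluxio
alluxio fs ls /obs
```

Once mounted, applications can read the OBS data through the Alluxio path /obs without referencing the bucket directly.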
An Error Is Reported When DistCP Is Used to Copy an Empty Folder Symptom When a user runs the following distcp commands on the MRS client, empty folders cannot be copied from HDFS to OBS: hadoop distcp -Dfs.obs.endpoint=xxx -Dfs.obs.access.key=xxx -Dfs.obs.secret.key=xxx -update hdfs
Answer Impala data is stored in HDFS or OBS and does not need to be stored on local disks. Data only needs to be overflowed to disks (specified by --scratch_dirs) if memory space is not enough for service queries running on Impalad instances. Disk hot swapping is not supported.
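For illustration, the spill directories are supplied through the --scratch_dirs startup flag of the impalad instance; the local paths below are hypothetical placeholders:

```shell
# Hypothetical local disk paths used only for query spill when memory is insufficient
--scratch_dirs=/data1/impala/scratch,/data2/impala/scratch
```

Because table data itself lives in HDFS or OBS, these directories hold only temporary spill files and can be placed on ordinary local disks.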
Application Development Overview Impala provides fast, interactive SQL queries directly on your Apache Hadoop data stored in HDFS, HBase, or Object Storage Service (OBS).