https://{endpoint}/v2/{project_id}/clusters/{cluster_id}/sql-execution/{sql_id}

Example Response

Status code: 200

Querying the SQL execution result is successful.

{
  "id" : "20190909_011820_00151_xxxxx",
  "statement" : "show tables",
  "status" : "FINISHED",
  "result_location" : "obs
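As a rough illustration of calling this interface, the Python sketch below polls the SQL execution result until it reaches the FINISHED status shown in the example response. The endpoint, project ID, cluster ID, SQL ID, and token values are placeholders, and the X-Auth-Token header follows the usual Huawei Cloud API convention rather than anything stated in this excerpt.

import time

import requests

# All values are placeholders; substitute your endpoint, project ID,
# cluster ID, SQL execution ID, and IAM token before running.
ENDPOINT = "https://{endpoint}"
PROJECT_ID = "{project_id}"
CLUSTER_ID = "{cluster_id}"
SQL_ID = "{sql_id}"
TOKEN = "{iam_token}"

url = (f"{ENDPOINT}/v2/{PROJECT_ID}/clusters/{CLUSTER_ID}"
       f"/sql-execution/{SQL_ID}")
headers = {"X-Auth-Token": TOKEN}  # common Huawei Cloud auth header (assumption)

# Poll until the statement reaches the FINISHED status shown in the
# example response above, or until we give up.
for _ in range(60):
    resp = requests.get(url, headers=headers, timeout=30)
    resp.raise_for_status()
    body = resp.json()
    if body.get("status") == "FINISHED":
        print("Result location:", body.get("result_location"))
        break
    time.sleep(5)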
When OBS is connected in a decoupled storage-compute scenario, task-level fault tolerance is supported, but intermediate data is still written to disk in the HDFS temporary directory.
If you choose to save job logs to OBS or HDFS, the system compresses the logs and saves them to the corresponding path after the job is executed. Therefore, for a job of this type, the job status remains Running after the job execution itself is complete, until the logs have been saved.
Similar to the example provided in method 1, set Action to Allow and add outbound rules whose destinations are, respectively, the address used when Secure Communications is enabled, the NTP server address, the OBS server address, the OpenStack address, and the DNS server address.
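To make the rule set concrete, here is a minimal Python sketch that enumerates these outbound rules as data. The destination CIDRs are placeholders, and apply_rule is a hypothetical hook standing in for whatever VPC SDK call or console action you actually use; only the rule shape (Allow plus a destination) comes from the document.

# Placeholder destinations; replace with the real addresses for your region.
OUTBOUND_RULES = [
    {"destination": "198.51.100.1/32", "purpose": "Secure Communications address"},
    {"destination": "198.51.100.2/32", "purpose": "NTP server"},
    {"destination": "198.51.100.3/32", "purpose": "OBS server"},
    {"destination": "198.51.100.4/32", "purpose": "OpenStack address"},
    {"destination": "198.51.100.5/32", "purpose": "DNS server"},
]

def apply_rule(rule: dict) -> None:
    """Hypothetical hook: create one Allow outbound security group rule."""
    # Replace this body with a call to your VPC SDK or a console action.
    print(f"Allow outbound to {rule['destination']} ({rule['purpose']})")

for rule in OUTBOUND_RULES:
    apply_rule(rule)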
OBS Bucket Check: Based on your requirements, determine whether to select I confirm that I only need to view logs and data verification results on MgC Agent and do not need to upload them to OBS. Then create a task for the destination by referring to 9.b to 9.c.
Set this parameter to an OBS bucket path or a local VM path. OBS bucket path: Enter the script path, for example, the path of the public sample script provided by MRS, such as s3a://bootstrap/presto/presto-install.sh.
MRS 2.1.0.8 Patch Description

Basic Information

Table 1 Basic information
Patch Version: MRS 2.1.0.8
Release Date: 2020-08-04

Resolved Issues

List of resolved issues in MRS 2.1.0.8:
MRS Manager
- The problem that the ECS API traffic is limited when OBS is accessed through an agency has been solved.
- Multiple users can log in to MRS Manager at the same time.
- Full-link monitoring is supported.
Set this parameter to an OBS bucket path or a local VM path. OBS bucket path: Enter the script path manually, for example, s3a://XXX/scale.sh. Local VM path: Enter the script path. The path must start with a slash (/) and end with .sh.
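Since this path rule applies to both the bootstrap and scaling scripts above, a small hedged Python sketch of the stated constraints may help. The function name and any acceptance rules beyond what the text states (an s3a:// URL for OBS paths; a leading slash and .sh suffix for local paths) are my own assumptions.

def is_valid_script_path(path: str) -> bool:
    """Check a script path against the rules quoted in the document.

    OBS bucket path: an s3a:// URL, e.g. s3a://XXX/scale.sh.
    Local VM path: must start with "/" and end with ".sh".
    """
    if path.startswith("s3a://"):
        # Assumption: OBS paths also point at .sh scripts.
        return path.endswith(".sh")
    return path.startswith("/") and path.endswith(".sh")

assert is_valid_script_path("s3a://XXX/scale.sh")
assert is_valid_script_path("/opt/scripts/scale.sh")
assert not is_valid_script_path("scale.sh")  # no leading slash, not an OBS URL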
It provides a data abstraction layer for computing frameworks including Apache Spark, Presto, MapReduce, and Apache Hive, so that upper-layer computing applications can access persistent storage systems including HDFS and OBS through unified client APIs and a global namespace.
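To illustrate the unified client API, here is a minimal PySpark sketch that reads the same way from HDFS and OBS: whether you address the stores directly as below or through the abstraction layer's global namespace, the calling code is identical. The paths, the bucket name, and the assumption that the cluster's Hadoop configuration already maps each scheme to the right store are placeholders rather than anything this excerpt specifies.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("unified-storage-demo").getOrCreate()

# The same DataFrame API works regardless of which store backs the path,
# as long as the cluster's Hadoop configuration wires up each scheme.
hdfs_df = spark.read.text("hdfs:///tmp/sample/input.txt")     # placeholder path
obs_df = spark.read.text("s3a://my-bucket/sample/input.txt")  # placeholder bucket

print(hdfs_df.count(), obs_df.count())
spark.stop()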
Accessing MRS Storm with JDBC: The program uses a Storm topology to insert data into a table.
storm-kafka-examples: Interaction between Storm and Kafka in MRS. The program uses a Storm topology to send data to Kafka and display the data.
storm-obs-examples: Interaction between Storm and OBS
- Log Path: Path of the OBS bucket for storing job logs, for example, obs://test/dataarts-log/
- Job Description: Description of the job.

For more information about DataArts Studio job parameters and constraints, see Creating a DataArts Studio Job. Go to the job orchestration page.
The save path of the CompiledPlan can be an HDFS path or an OBS path. In this example, the HDFS path is used. How to Use: Change the value of table.exec.resource.default-parallelism of an operator in the CompiledPlan. Example: Develop a FlinkServer SQL job.
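A CompiledPlan is serialized as JSON, so one hedged way to adjust an operator's parallelism is to edit that JSON before resubmitting the job. In the Python sketch below, the file path, the node-matching condition, and the assumption that each node can carry a per-node configuration map holding table.exec.resource.default-parallelism are illustrative guesses, not a documented schema; adapt them to the layout of your actual plan file.

import json

PLAN_PATH = "compiled_plan.json"  # placeholder: the plan saved to HDFS/OBS
TARGET_NODE_ID = 2                # placeholder: the operator to retune

with open(PLAN_PATH, encoding="utf-8") as f:
    plan = json.load(f)

# Assumption: the plan has a top-level "nodes" list and each node may carry
# a "configuration" map; adjust to match your actual CompiledPlan layout.
for node in plan.get("nodes", []):
    if node.get("id") == TARGET_NODE_ID:
        node.setdefault("configuration", {})[
            "table.exec.resource.default-parallelism"] = "4"

with open(PLAN_PATH, "w", encoding="utf-8") as f:
    json.dump(plan, f, indent=2)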
Huawei Cloud OBS is an object storage service that features high availability and low cost.
You have the permission to access the HDFS or OBS directories, HBase tables, and data involved in job execution. You have obtained the username and password used by an external data source (SFTP server or relational database).
Related Information: For details about how to manage ClickHouse permissions; import data from RDS for MySQL, OBS, HDFS, and GaussDB(DWS) to ClickHouse tables; manage multiple ClickHouse tenants; and access ClickHouse through ELB, see Using ClickHouse.
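As one hedged illustration of loading data into a ClickHouse table, the sketch below uses the third-party clickhouse-driver package, which this excerpt does not mention; the host, port, credentials, and table definition are all placeholders.

from clickhouse_driver import Client  # third-party package: clickhouse-driver

# Placeholder connection details; use your cluster's ClickHouse endpoint.
client = Client(host="clickhouse-node", port=9000,
                user="default", password="")

client.execute("CREATE DATABASE IF NOT EXISTS demo")
client.execute(
    "CREATE TABLE IF NOT EXISTS demo.events "
    "(id UInt32, name String) ENGINE = MergeTree ORDER BY id"
)
# Batch insert: the driver sends the rows as typed columns.
client.execute(
    "INSERT INTO demo.events (id, name) VALUES",
    [(1, "alpha"), (2, "beta")],
)
print(client.execute("SELECT count() FROM demo.events"))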
If you cannot directly access the client node to upload files over the local network, upload the JAR package or source data to OBS, import it to HDFS on the Files tab of the MRS console, and then run the hdfs dfs -get command on the HDFS client to download it to the client node.
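For the final download step, here is a minimal Python sketch that shells out to the hdfs client using the document's own command; the HDFS and local paths are placeholders, and it assumes the HDFS client is installed and authenticated on the node where this runs.

import subprocess

HDFS_PATH = "/tmp/imported/app.jar"  # placeholder: path chosen on the Files tab
LOCAL_PATH = "/opt/client/app.jar"   # placeholder: destination on the client node

# Equivalent to running: hdfs dfs -get /tmp/imported/app.jar /opt/client/app.jar
subprocess.run(["hdfs", "dfs", "-get", HDFS_PATH, LOCAL_PATH], check=True)
print(f"Downloaded {HDFS_PATH} to {LOCAL_PATH}")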