What Should I Do If CDM Fails to Migrate Data from OBS to DLI?
Symptom: CDM fails to migrate data from OBS to DLI.
Solution: Dirty data writing is configured, but no dirty data exists. Decrease the number of concurrent tasks to avoid this issue.
How Do I Use CDM to Export MySQL Data to an SQL File and Upload the File to an OBS Bucket?
Symptom: How do I use CDM to export MySQL data to an SQL file and upload the file to an OBS bucket?
Solution: CDM does not support this operation.
Figure 2 Selecting a connector type
Select Object Storage Service (OBS) and click Next to configure parameters for the OBS link.
Name: Enter a custom link name, for example, obslink.
OBS Server and Port: Enter the address and port of the actual OBS service.
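Before saving the link, it can help to confirm that the OBS server address and port you plan to enter are reachable from your network. The snippet below is a generic connectivity check, not part of CDM; the endpoint shown is a placeholder.

# Generic connectivity check (not part of CDM): verify that the OBS endpoint
# and port intended for the link are reachable. The endpoint is a placeholder.
import socket

OBS_SERVER = "obs.example-region.myhuaweicloud.com"  # placeholder endpoint
OBS_PORT = 443

try:
    with socket.create_connection((OBS_SERVER, OBS_PORT), timeout=5):
        print(f"{OBS_SERVER}:{OBS_PORT} is reachable")
except OSError as exc:
    print(f"Cannot reach {OBS_SERVER}:{OBS_PORT}: {exc}")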
TextFile format: UNCOMPRESSED
OBS Path: Path for storing OBS files. You can enter the #{source_Topic_name} built-in variable so that data from different source topics is written to different paths. An example path is obs://bucket/dir/test.db/prefix_#{source_Topic_name}_suffix/.
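To make the effect of the built-in variable concrete, here is a minimal Python sketch of how such a path template would resolve per source topic. The resolve_path helper and the topic names are hypothetical, used only for illustration; CDM performs this substitution internally.

# Hypothetical illustration of how the #{source_Topic_name} built-in variable
# expands an OBS path template for each source topic.
TEMPLATE = "obs://bucket/dir/test.db/prefix_#{source_Topic_name}_suffix/"

def resolve_path(template: str, topic_name: str) -> str:
    """Replace the built-in variable with the actual topic name."""
    return template.replace("#{source_Topic_name}", topic_name)

for topic in ("orders", "payments"):
    print(resolve_path(TEMPLATE, topic))
# obs://bucket/dir/test.db/prefix_orders_suffix/
# obs://bucket/dir/test.db/prefix_payments_suffix/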
Optimizing Destination Parameters
Optimization of data writing to OBS: If automatic combination is enabled, disable it; otherwise, increase the concurrency first.
What Can I Do If Error Message "Unable to execute the SQL statement" Is Displayed When I Import Data from OBS to SQL Server?
Symptom: When CDM is used to import data from OBS to SQL Server, the job fails to be executed and the error message "Unable to execute the SQL statement" is displayed.
Prerequisites
When creating an OBS foreign table on DLI, the data storage format of the OBS table must meet the following requirements:
When you use the DataSource syntax to create an OBS table, the ORC, Parquet, JSON, CSV, Carbon, and Avro formats are supported.
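As an illustration only, a DataSource-style table over data stored in OBS generally takes the form below, shown here as a Spark SQL statement issued from PySpark. The table name, columns, bucket, and path are made up, and the exact DDL accepted by DLI should be verified against the DLI SQL syntax reference.

# Illustrative sketch: DataSource-style CREATE TABLE over an OBS path, using
# CSV as one of the supported formats. Names and paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dli-obs-table-example").getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS sample_csv_table (id INT, name STRING)
    USING csv
    OPTIONS (path 'obs://my-bucket/dir/sample_csv/')
""")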
The following is the EL expression for checking whether an OBS directory (a path ending with a slash) exists: #{OBSUtil.isExistOBSPath("obs://test/jobs/")}
The following is the EL expression for checking whether an OBS file exists: #{OBSUtil.isExistOBSPath("obs://test/jobs/job.log")}
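Outside of the job EL context, a similar existence check can be done programmatically. The sketch below is an illustration using the OBS Python SDK (esdk-obs-python); the endpoint, credentials, bucket name, and object key are placeholders, and the response handling is a minimal sketch rather than production error handling.

# Illustrative sketch (not a CDM/DLF EL expression): check whether an OBS
# object exists via the OBS Python SDK. All values below are placeholders.
from obs import ObsClient

client = ObsClient(
    access_key_id="YOUR_AK",
    secret_access_key="YOUR_SK",
    server="https://obs.example-region.myhuaweicloud.com",
)

resp = client.getObjectMetadata("test", "jobs/job.log")
exists = resp.status < 300  # 2xx means the object exists; 404 means it does not
print("exists:", exists)
client.close()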
Check whether the OBS bucket is encrypted. Log in to the OBS management console and click the bucket name to go to the Overview page. Ensure that default encryption is disabled for the OBS bucket. If default encryption is enabled, click Default Encryption and disable it.
Constraints
This function depends on the OBS service. Only OBS buckets are supported as the OBS path; parallel file systems are not supported.
Importing a Job
Function
This API is used to import one or more job files from OBS to DLF. Before using this API, store the job files in an OBS bucket.
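A hedged sketch of what such a call could look like from Python follows. The endpoint, URL path, request body fields, and token handling are placeholders and must be taken from the DLF API reference, not from this example.

# Hypothetical sketch only: the endpoint path, body fields, and token are
# placeholders; consult the DLF API reference for the actual definition of
# the import-job API.
import requests

ENDPOINT = "https://dlf.example-region.myhuaweicloud.com"  # placeholder
PROJECT_ID = "your-project-id"                             # placeholder
TOKEN = "your-iam-token"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/v1/{PROJECT_ID}/jobs/import",  # placeholder path
    headers={"X-Auth-Token": TOKEN, "Content-Type": "application/json"},
    json={"path": "obs://my-bucket/jobs/jobs.zip"},  # job files stored in OBS
    timeout=30,
)
print(resp.status_code, resp.text)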
Figure 3 Configuring properties for an MRS Spark Python node
Parameter descriptions:
--master yarn --deploy-mode cluster obs://obs-tongji/python/wordcount.py obs://obs-tongji/python/in.txt obs://obs-tongji/python/out
Specifically, obs://obs-tongji/python/wordcount.py is the OBS path of the job script, obs://obs-tongji/python/in.txt is the input file, and obs://obs-tongji/python/out is the output directory.
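For context, a minimal word-count script compatible with the arguments above (an input file and an output directory) might look like the following. This is an illustrative sketch, not necessarily the exact wordcount.py referenced in the figure.

# Illustrative sketch of a wordcount.py that takes an input file and an
# output directory as arguments, matching the node parameters above.
import sys
from pyspark import SparkContext

if __name__ == "__main__":
    input_path, output_path = sys.argv[1], sys.argv[2]
    sc = SparkContext(appName="wordcount")
    counts = (
        sc.textFile(input_path)
        .flatMap(lambda line: line.split())
        .map(lambda word: (word, 1))
        .reduceByKey(lambda a, b: a + b)
    )
    counts.saveAsTextFile(output_path)
    sc.stop()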
Resources can be imported from OBS or a local path.
Metadata Source (mandatory): Two types of metadata sources are available:
Existing file: Select an existing XML metadata file from an OBS bucket.
New: Generate an XML metadata file in an OBS bucket based on the vertex tables and edge tables in MRS Hive.
Figure 1 Message displayed
Possible Causes
Logs of data development jobs are stored in OBS buckets. This message is displayed if the user group to which you belong does not have the OBS operation permission, or no OBS log file is available.
The system automatically populates the Metadata field with the OBS directory where the generated metadata schema is located.
Prerequisites
This function depends on OBS buckets. For details about how to configure an OBS bucket, see Configuring an OBS Bucket.
Script Execution History
Log in to the DataArts Studio console by following the instructions in Accessing the DataArts Studio Instance Console.