You can select an existing OBS bucket. The selected OBS bucket is globally configured in the current workspace. If you do not set this parameter, job logs of DataArts Factory are stored in the OBS bucket named dlf-log-{projectId} by default.
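The default bucket naming rule above can be sketched as a small helper. This is a hypothetical illustration of the dlf-log-{projectId} pattern, not a Huawei SDK function; the function name and sample project ID are assumptions.

```python
def default_log_bucket(project_id: str) -> str:
    """Hypothetical helper: builds the default DataArts Factory log
    bucket name described above, i.e. dlf-log-{projectId}."""
    return f"dlf-log-{project_id}"

# A project whose ID is "0a1b2c3d" would use bucket "dlf-log-0a1b2c3d".
print(default_log_bucket("0a1b2c3d"))
```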
The default path is obs://dlf-log-....../. Select "I confirm that OBS bucket obs://dlf-log-....../ will be created and used to store DLF job logs only". To change the log path, go to the workspace management page on the DataArts Studio console.
Write Dirty Data: Specify this parameter if data that fails to be processed or filtered out during job execution needs to be written to OBS for future viewing. Before writing dirty data, create an OBS link.
The supported source data types are DLI, OBS, and MySQL. When the source data type is DLI, the supported destination data types are DWS, GES, CSS, OBS, and DLI. When the source data type is MySQL, the supported destination data type is MySQL.
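The source-to-destination rules above can be captured in a small lookup table; a minimal sketch follows. Note that this excerpt does not list destinations for an OBS source, so that entry is deliberately omitted, and the helper name is an assumption.

```python
# Supported destination types per source type, as stated in the text above.
# Destinations for an OBS source are not given in this excerpt.
SUPPORTED_DESTINATIONS = {
    "DLI": {"DWS", "GES", "CSS", "OBS", "DLI"},
    "MySQL": {"MySQL"},
}

def is_supported(source: str, destination: str) -> bool:
    # True when the (source, destination) pair is listed as supported.
    return destination in SUPPORTED_DESTINATIONS.get(source, set())

print(is_supported("DLI", "DWS"))    # True
print(is_supported("MySQL", "OBS"))  # False
```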
OBS and HDFS are supported. HDFS supports only MRS Spark, MRS Flink Job, and MRS MapReduce nodes.
File Path (mandatory): Select an OBS file path when Resource Location is set to OBS, or select an MRS cluster name when Resource Location is set to HDFS.
Use CDM to migrate data from DES to Object Storage Service (OBS). Use CDM to migrate data from OBS to MRS. The operations on CDM are the same as those described in Using CDM to Migrate Data of the Last Month.
Importing a Solution: This option is available only when the OBS service is available. If OBS is unavailable, data can be imported from the local PC instead.
Topic Name (mandatory): Notification topic.
OBS Bucket (mandatory): OBS bucket for storing notification records.
Notification (mandatory): Whether to enable the notification function. The function is enabled by default.
Click OK. DataArts Factory sends notifications through SMN. Using SMN may incur fees.
table or in the partition directory of an OBS partition table.
You can enter the JAR package name or the corresponding OBS path in the following format: obs://<bucket name>/<folder name>/<package name>.
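A minimal sketch of assembling and splitting a path in that format, using plain string handling (the helper names and sample values are hypothetical, not Huawei SDK calls):

```python
def build_obs_jar_path(bucket: str, folder: str, package: str) -> str:
    # Assemble obs://<bucket name>/<folder name>/<package name>.
    return f"obs://{bucket}/{folder}/{package}"

def parse_obs_path(path: str) -> tuple[str, str]:
    # Split an obs:// URI into (bucket, object key); reject other schemes.
    prefix = "obs://"
    if not path.startswith(prefix):
        raise ValueError(f"not an OBS path: {path}")
    bucket, _, key = path[len(prefix):].partition("/")
    return bucket, key

p = build_obs_jar_path("my-bucket", "jars", "app-1.0.jar")
print(p)                  # obs://my-bucket/jars/app-1.0.jar
print(parse_obs_path(p))  # ('my-bucket', 'jars/app-1.0.jar')
```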
How Do I Use CDM to Export MySQL Data to an SQL File and Upload the File to an OBS Bucket? What Should I Do If CDM Fails to Migrate Data from OBS to DLI? What Should I Do If a CDM Connector Reports the Error "Configuration Item [linkConfig.iamAuth] Does Not Exist"?
Simplified Migration of Trade Data to the Cloud and Analysis
Scenario
Analysis Process
Using CDM to Upload Data to OBS
Analyzing Data
OBS Bucket: Select an OBS bucket. Database Path Config: The database path is dynamically combined based on the OBS bucket name. The suffix can be automatically matched with built-in keywords. For example, when you enter {{, all matching options are automatically displayed.
1. OBS path: OBS; 2. DLI package: DLIResources.
jobClass (optional, String): Main class name.
The migration depends on the OBS service. Plan OBS buckets and folders in advance. DataArts Studio data migration depends on the backup, import, and export capabilities of each module. You can choose to migrate the data of the module you want.
Create a job to migrate data from OBS to DWS. For details, see Migrating Data from OBS to DWS. Develop data, including creating DWS SQL scripts and a job.
Uploading Files to JupyterLab
Uploading a File from a Local Path to JupyterLab
Cloning GitHub Open-Source Repository Files to JupyterLab
Uploading OBS Files to JupyterLab
Parent topic: Notebook Development
Destination Job Parameters
To a Relational Database
To OBS
To HDFS
To Hive
To HBase/CloudTable
To DDS
To Elasticsearch/Cloud Search Service
To DLI
To DIS
Parent Topic: Public Data Structures
Path Yes OBS path where the data is stored. This parameter is available only when Data Location is set to OBS. If no OBS path or OBS bucket is available, the system automatically creates an OBS directory.