Solutions
Method 1: Create a bucket named dlf-log-{projectID} in OBS and grant the scheduling user permission to operate on it. Note that only OBS buckets are supported for this path; parallel file systems are not.
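As a sketch of the naming convention above, the hypothetical helper below builds the log bucket name from a project ID and checks it against OBS's general bucket-name length rule (3 to 63 lowercase characters); the helper itself is not part of the product:

```python
def dlf_log_bucket_name(project_id: str) -> str:
    """Build the scheduler log bucket name in the dlf-log-{projectID} pattern.

    OBS bucket names must be lowercase and 3-63 characters long,
    so the project ID is lowercased and the total length is checked.
    """
    name = f"dlf-log-{project_id.lower()}"
    if not 3 <= len(name) <= 63:
        raise ValueError(f"bucket name length out of range: {name}")
    return name
```

Calling `dlf_log_bucket_name("0ABC123")` yields `dlf-log-0abc123`, ready to be created in OBS and granted to the scheduling user.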
Figure 1 Download Center

Setting the default OBS path: the workspace administrator can set the default OBS path for dumps in the current workspace. In the left navigation pane of the DataArts Factory console, choose Download Center, click Set Default OBS Path, and set the default OBS path.
Related topics:
- Migrating Data from OBS to CSS
- Migrating Data from OBS to DLI
- Migrating Data from MRS HDFS to OBS
- Migrating the Entire Elasticsearch Database to CSS
Figure 2 Exporting jobs

Importing Jobs
This function is available only if the OBS service is available. If OBS is unavailable, jobs can be imported from the local PC. The maximum size of a job file imported from OBS is 10 MB.
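Before importing from OBS, it can be useful to verify an exported job file stays under the 10 MB limit. A minimal local sketch (the helper name is hypothetical, not a product API):

```python
import os

# 10 MB limit for job files imported from OBS, per the documentation.
MAX_JOB_FILE_BYTES = 10 * 1024 * 1024


def can_import_job_file(path: str) -> bool:
    """Return True if the exported job file is within the import size limit."""
    return os.path.getsize(path) <= MAX_JOB_FILE_BYTES
```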
Parameter | Description | Example Value
OBS endpoint | Endpoint of OBS | obs.ap-southeast-1.myhuaweicloud.com
OBS bucket | OBS bucket that stores the CDH cluster's historical data from one month ago | cdm
AK/SK | AK and SK for accessing OBS | -
MRS Manager IP | IP address of MRS Manager | 192.168.3.11
KERBEROS
Parameter | Description | Example Value
Account | Username for logging in to MRS Manager | cdm
Password | Password for logging in to MRS Manager | -
OBS storage support | Whether OBS storage is supported. If the Hudi table data is stored in OBS, enable this function. | -
Status Code | Error Code | Error Message | Description | Solution
| | | The OBS file fails to be accessed. | Check the OBS file.
400 | DLF.1006 | The job node is empty. | The job node is empty. | Check the node.
400 | DLF.1242 | The OBS bucket does not exist. | The OBS bucket does not exist. |
The OBS path where the resource file is located is obs://dlf-test/hadoop-mapreduce-examples-2.4.1.jar. The JAR packages and properties files that the resource's main JAR package depends on are obs://dlf-test/depend1.jar and obs://dlf-test/depend2.jar. The description is test.
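Paths like the ones above all follow the obs://{bucket}/{object-key} form. A small hypothetical helper for splitting such a path into its bucket and object key (not a product API, just an illustration of the path structure):

```python
def split_obs_path(obs_path: str) -> tuple[str, str]:
    """Split an obs:// path into (bucket, object key)."""
    if not obs_path.startswith("obs://"):
        raise ValueError(f"not an OBS path: {obs_path}")
    # Everything up to the first "/" is the bucket; the rest is the key.
    bucket, _, key = obs_path[len("obs://"):].partition("/")
    return bucket, key
```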
Program parameter: -c org.apache.flink.streaming.examples.wordcount.WordCount
Flink job resource package: wordcount
Input data path: obs://dlf-test/lkj_test/input/word.txt
Output data path: obs://dlf-test/lkj_test/output.txt
Specifically: obs://dlf-test/lkj_test/input/word.txt
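For intuition about what this job computes, here is a simplified local equivalent of a word count over the input file's lines. It uses plain whitespace tokenization, which is a simplification of the Flink WordCount example's tokenizer:

```python
from collections import Counter


def word_count(lines):
    """Count lowercase, whitespace-separated words across the given lines."""
    counts = Counter()
    for line in lines:
        counts.update(line.lower().split())
    return dict(counts)
```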
Yes
Parameter | Description | Example Value
OBS Link of List File | Select an existing OBS link. | obs_link
OBS Bucket of entries files | Name of the OBS bucket that stores the text file | obs-cdm
Path/Directory of entries files | Custom OBS directories that store the text file |
Delete OBS, and OBS Manager nodes require the following permissions:
obs:bucket:GetBucketLocation
obs:bucket:ListBucketVersions
obs:object:GetObject
obs:bucket:CreateBucket
obs:bucket:DeleteBucket
obs:object:DeleteObject
obs:object:PutObject
obs:bucket:ListAllMyBuckets
obs:bucket:ListBucket
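A custom policy granting these actions could be assembled as a JSON document. The sketch below assumes Huawei Cloud IAM's custom-policy JSON layout (Version 1.1 with Effect/Action statements); the document itself does not show the policy syntax, so treat the structure as an assumption to verify against the IAM reference:

```python
import json

# The OBS actions listed above for the OBS job nodes.
OBS_NODE_ACTIONS = [
    "obs:bucket:GetBucketLocation",
    "obs:bucket:ListBucketVersions",
    "obs:object:GetObject",
    "obs:bucket:CreateBucket",
    "obs:bucket:DeleteBucket",
    "obs:object:DeleteObject",
    "obs:object:PutObject",
    "obs:bucket:ListAllMyBuckets",
    "obs:bucket:ListBucket",
]


def obs_node_policy() -> str:
    """Assemble a custom-policy document allowing the OBS node actions."""
    policy = {
        "Version": "1.1",  # assumed IAM custom-policy version
        "Statement": [{"Effect": "Allow", "Action": OBS_NODE_ACTIONS}],
    }
    return json.dumps(policy, indent=2)
```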
Function: This API is used to import one or more connection files from OBS to the Data Development module. Before using this API, store the connection files in OBS buckets.
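As a sketch of preparing such an import call, the helper below validates that each connection file is in OBS and builds a JSON request body. The "paths" field name and body structure are assumptions for illustration, not the documented request schema; consult the DataArts Studio API reference for the real one:

```python
import json


def build_connection_import_body(obs_paths):
    """Build an illustrative JSON body listing connection files stored in OBS.

    Raises ValueError if any path is not an obs:// location, since the
    API requires connection files to be stored in OBS buckets first.
    """
    for p in obs_paths:
        if not p.startswith("obs://"):
            raise ValueError(f"connection files must be stored in OBS: {p}")
    return json.dumps({"paths": obs_paths})  # field name is an assumption
```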
Delete OBS
OBS Manager
Open/Close Resource
Data Quality Monitor
Subjob
For Each
SMN
Dummy
Day
OBS Link for Writing Backups: Link used to back up jobs to OBS buckets.
You can obtain the OBS bucket endpoint as follows: go to the OBS console and click the bucket name to go to its details page.
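The example endpoint shown earlier (obs.ap-southeast-1.myhuaweicloud.com) follows a regional pattern. A hypothetical helper, assuming that pattern holds for the target region (verify against the bucket's details page, since endpoints can differ):

```python
def obs_endpoint(region: str) -> str:
    """Build a regional OBS endpoint following the obs.{region} pattern."""
    return f"obs.{region}.myhuaweicloud.com"
```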
obs:object:GetObject
obs:object:PutObject
obs:object:DeleteObject
obs:bucket:GetBucketLocation
obs:bucket:ListAllMyBuckets
obs:bucket:ListBucket
obs:bucket:ListBucketVersions
obs:bucket:CreateBucket
obs:bucket:DeleteBucket
Run the following OBS job nodes: Create OBS, Delete OBS,
It is used to configure OBS files.
No
Parameter | Description | Example Value
OBS Link | Select an OBS link. | obs_link
OBS Bucket | Select an OBS bucket. | obs_test
Config File | Select the OBS configuration file. | /obs/config.csv
Max. Poll Records | (Optional) Maximum number of records per poll | 100
Max.
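As an illustration of how a CSV config file and the poll limit could interact, the sketch below reads at most Max. Poll Records rows from CSV text. The function is hypothetical and local-only, not part of the product:

```python
import csv
import io


def poll_records(csv_text: str, max_poll_records: int = 100):
    """Read at most max_poll_records rows from CSV text.

    zip() with a bounded range stops reading once the limit is reached,
    mirroring a per-poll record cap.
    """
    reader = csv.reader(io.StringIO(csv_text))
    return [row for _, row in zip(range(max_poll_records), reader)]
```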
In the Browse OBS File dialog box, select an OBS folder.
Figure 1 Managing backup
Daily Backup starts at 00:00 every day to back up all jobs, scripts, resources, and environment variables of the previous day.
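For example, a hypothetical helper for building a per-day key prefix under the selected OBS folder, assuming backups run at 00:00 and cover the previous day's assets (the actual backup layout in OBS is not specified here):

```python
from datetime import date, timedelta


def backup_prefix(obs_folder: str, run_day: date) -> str:
    """Build a date-based key prefix for a daily backup run.

    The 00:00 run on run_day backs up the previous day's assets,
    so the prefix is named after run_day - 1.
    """
    backed_up = run_day - timedelta(days=1)
    return f"{obs_folder.rstrip('/')}/{backed_up:%Y-%m-%d}/"
```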