Listing Registered OUs Enabled with Governance Policies
Function: This API is used to list registered OUs for which governance policies have been enabled.
URI: GET https://{endpoint}/v1/managed-organization/managed-organizational-units
Table 1 lists the query parameters (Parameter, Mandatory, Type).
Querying Operations on Registered OUs and Enrolled Accounts
Function: This API is used to query operations on registered OUs and enrolled accounts in RGC.
URI: GET https://{endpoint}/v1/managed-organization
Table 1 lists the query parameters, including account_id (not mandatory).
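Both endpoints above follow the same pattern: an HTTPS GET against {endpoint}, optionally with query parameters. A minimal sketch of building such a request URL, assuming hypothetical limit/marker pagination parameters (the endpoint hostname and parameter names are illustrative placeholders, not values confirmed by this document):

```python
# Build the GET URL for the "list registered OUs with governance policies"
# API. limit/marker are assumed pagination parameters for illustration;
# check the API reference for the names your service version supports.
import urllib.parse


def build_list_managed_ous_url(endpoint, limit=None, marker=None):
    url = f"https://{endpoint}/v1/managed-organization/managed-organizational-units"
    # Keep only the query parameters the caller actually supplied.
    params = {k: v for k, v in {"limit": limit, "marker": marker}.items()
              if v is not None}
    if params:
        url += "?" + urllib.parse.urlencode(params)
    return url


print(build_list_managed_ous_url("rgc.example.com", limit=10))
```

The actual call would then send this URL with an authentication header (for example an X-Auth-Token obtained from IAM), which is omitted here.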
Configuring the Distributed Cache to Execute MapReduce Jobs
Scenarios: This section applies to MRS 3.x or later. Distributed caching is useful in the following scenarios:
Rolling upgrade: During the upgrade, applications must keep the text content (JAR file or configuration file) unchanged
What OSs Does the Agent Support?
The following table lists the OSs that are proven to be compatible with the Agent. More OSs will be supported soon. The listed systems are created using public images or images provided by Huawei Cloud Image Management Service (IMS). Using an unverified
Jobs in a Pipeline Stage Cannot Be Selected for Configuration
Symptom: On the Task Orchestration page, jobs in a pipeline stage cannot be selected for configuration.
Root Cause: Always Run is set to Yes for the stage.
Solution: Log in to the Huawei Cloud console. Click in the upper left
Migrating Archived Data in Alibaba Cloud OSS
Alibaba Cloud OSS provides the ossutil tool to restore archived objects with specified prefixes. ossutil is compatible with Windows, Linux, and macOS. Download and install the required version. This section uses Windows as an example.
Procedure
Can I Create Jobs in Batches?
Symptom: Can I create CDM jobs in batches?
Solution: Yes. CDM supports batch job creation through its batch import function. You can create jobs in batches as follows:
1. Create a job manually.
2. Export the job and save its JSON file to a local PC.
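The export-then-import workflow lends itself to scripting: export one job as a template, clone and rename it inside the JSON file, and import the result. A minimal sketch, assuming a hypothetical export layout with a top-level "jobs" list and a "name" field per job (the real CDM export format may differ; inspect an actual exported file first):

```python
# Clone a single exported job once per target table, giving each clone a
# unique name. The "jobs"/"name" JSON structure is an assumption for
# illustration only.
import copy
import json


def clone_jobs(exported_json, table_names):
    doc = json.loads(exported_json)
    template = doc["jobs"][0]
    clones = []
    for table in table_names:
        job = copy.deepcopy(template)
        job["name"] = f"{template['name']}_{table}"  # job names must be unique
        clones.append(job)
    doc["jobs"] = clones
    return json.dumps(doc, indent=2)


exported = json.dumps({"jobs": [{"name": "mysql2obs"}]})
print(clone_jobs(exported, ["orders", "users"]))
```

The rewritten file is then imported back through the batch import function to create all jobs at once.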
Can I Schedule Jobs in Batches?
Symptom: Can I schedule CDM jobs in batches?
Solution: Yes. Access the DataArts Factory module of the DataArts Studio service. In the navigation pane of the DataArts Factory homepage, choose Data Development > Develop Job to create a job. Drag multiple
DataArts Migration (Real-Time Jobs)
Overview
How Do I Troubleshoot a Network Disconnection Between the Data Source and Resource Group?
Which Ports Must Be Allowed by the Data Source Security Group So That DataArts Migration Can Access the Data Source?
How Do I Configure a Spark Periodic
Changing the Password for an OMS Database Access User
This section describes how to regularly change the password for an OMS database access user to enhance system O&M security.
Impact on the System: The OMS service needs to be restarted for the new password to take effect. The service
DataArts Migration (CDM Jobs)
Overview
Notes and Constraints
Supported Data Sources
Creating and Managing a CDM Cluster
Creating a Link in a CDM Cluster
Creating a Job in a CDM Cluster
Using Macro Variables of Date and Time
Improving Migration Performance
Key Operation Guide
Tutorials
DataArts Migration (Offline Jobs)
Overview of Offline Jobs
Notes and Constraints
Supported Data Sources
Check Before Use
Enabling Network Connectivity
Creating an Offline Processing Migration Job
Configuring an Offline Processing Migration Job
Configuring Source Job Parameters
Configuring
DataArts Migration (Real-Time Jobs)
Overview of Real-Time Jobs
Notes and Constraints
Supported Data Sources
Authorizing the Use of Real-Time Data Migration
Check Before Use
Enabling Network Communications
Creating a Real-Time Migration Job
Configuring a Real-Time Migration Job
Tutorials
How Do I Handle Failed Jobs?
Context: After a backup job fails, a backup whose status is Error is generated, and a message is displayed in Job Status on the Backup Jobs tab page. Click the question mark next to the message to view details. After a restoration job fails, a message is
How Many Jobs Can Be Created in DLF?
Each user can create a maximum of 1,000 jobs by default.
What Is the Common File Path for Training Jobs?
The path to the training environment and the code directory in the container are generally obtained from the environment variable ${MA_JOB_DIR}, whose value is /home/ma-user/modelarts/user-job-dir.
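In training code, the directory can be read defensively from the environment, falling back to the documented default path when ${MA_JOB_DIR} is unset. A minimal sketch:

```python
# Resolve the training code directory from MA_JOB_DIR, with the documented
# default as a fallback. Passing the environment mapping in makes the
# helper easy to test.
import os


def resolve_job_dir(env=None):
    if env is None:
        env = os.environ
    return env.get("MA_JOB_DIR", "/home/ma-user/modelarts/user-job-dir")


print(resolve_job_dir({}))  # prints the default path when the variable is unset
```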
What Is TensorBoard Used for in Model Visualization Jobs?
Visualization jobs are powered by TensorBoard. For details about TensorBoard functions, see the TensorBoard official website.
Can I Synchronize Jobs to Other Clusters?
Symptom: Can I synchronize jobs to other CDM clusters?
Solution: CDM does not support direct job migration across clusters. However, you can use the batch job export and import functions to migrate jobs across clusters indirectly, as follows
Where Are the Execution Logs of Spark Jobs Stored?
Logs of unfinished Spark jobs are stored in the /srv/BigData/hadoop/data1/nm/containerlogs/ directory on the Core node.
Logs of finished Spark jobs are stored in the /tmp/logs/Username/logs directory in HDFS.
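The finished-job path above is parameterized by the submitting user. A small sketch that builds it (the username is a placeholder for the actual job submitter):

```python
# Build the HDFS aggregated-log directory for a finished Spark job,
# following the /tmp/logs/<username>/logs pattern described above.
def finished_job_log_dir(username):
    return f"/tmp/logs/{username}/logs"


print(finished_job_log_dir("sparkuser"))  # /tmp/logs/sparkuser/logs
```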