Can I Create Jobs in Batches? Symptom Can I create CDM jobs in batches? Solution CDM supports batch job creation using the batch import function. You can create jobs in batches as follows: Create a job manually. Export the job and save the job's JSON file to a local PC. Edit the JSON file to define the additional jobs, and then import the file back to CDM to create the jobs in batches.
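A minimal sketch of the editing step, assuming the exported file contains a "jobs" array whose entries carry a "name" field; the field names, file names, and table list below are placeholders, so check them against your own exported JSON before importing:

import copy
import json

# Hypothetical sketch: turn one exported CDM job definition into several,
# varying only the job name, so the resulting file can be imported in a batch.
with open("exported_job.json", encoding="utf-8") as f:
    template = json.load(f)

base_job = template["jobs"][0]                 # the manually created job
batch = {"jobs": []}

for table in ["orders", "users", "events"]:    # placeholder job variants
    job = copy.deepcopy(base_job)
    job["name"] = f"migrate_{table}"           # each job needs a unique name
    batch["jobs"].append(job)

with open("batch_jobs.json", "w", encoding="utf-8") as f:
    json.dump(batch, f, ensure_ascii=False, indent=2)

In practice you would also adjust the source and destination parameters of each copy (for example, the table name) before importing the file through the batch import function.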
Can I Schedule Jobs in Batches? Symptom Can I schedule CDM jobs in batches? Solution Yes. Access the DataArts Factory module of the DataArts Studio service. In the navigation pane of the DataArts Factory homepage, choose Data Development > Develop Job to create a job. Drag multiple CDM Job nodes to the canvas, orchestrate them, and configure the scheduling settings.
Training Jobs Created in a Dedicated Resource Pool No Cloud Storage Name or Mount Path Displayed on the Page for Creating a Training Job
Managing Scientific Computing Model Training Jobs In the training job list, a job creator can edit, start, clone (copy a training job), retry (retrain a job), and delete a job. Log in to ModelArts Studio and access a workspace. In the navigation pane, choose Model Development > Model Training.
Migrating Servers Whose OSs Are Not Supported by SMS Background Migrating Servers Running Unsupported Linux OSs Migrating Servers Running Unsupported Windows OSs
What OSs Does the Agent Support? The following table lists OSs that are proven to be compatible with the Agent. More OSs will be supported soon. The listed OSs are created from public images or images provided by Huawei Cloud Image Management Service (IMS). Using an unverified OS may cause compatibility problems during migration.
Listing Registered OUs Enabled with Governance Policies Function This API is used to list registered OUs that governance policies have been enabled for. URI GET https://{endpoint}/v1/managed-organization/managed-organizational-units Table 1 describes the query parameters (Parameter, Mandatory, Type, Description).
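A minimal sketch of calling the URI above, assuming IAM token authentication via the X-Auth-Token header, which most Huawei Cloud APIs use; the endpoint value and token are placeholders, and the available query parameters should be taken from the full Table 1 in the API reference:

import requests

# Placeholder endpoint and token; replace with your RGC endpoint and a valid IAM token.
ENDPOINT = "<rgc-endpoint>"
TOKEN = "<IAM token>"

url = f"https://{ENDPOINT}/v1/managed-organization/managed-organizational-units"
resp = requests.get(url, headers={"X-Auth-Token": TOKEN})
resp.raise_for_status()   # fail fast on 4xx/5xx responses
print(resp.json())        # registered OUs with governance policies enabled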
Querying Operations on Registered OUs and Enrolled Accounts Function This API is used to query operations on registered OUs and enrolled accounts in RGC. URI GET https://{endpoint}/v1/managed-organization Table 1 describes the query parameters (Parameter, Mandatory, Type, Description), including the optional account_id parameter.
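Under the same assumptions as the previous sketch (placeholder endpoint and token, X-Auth-Token authentication), the optional account_id parameter from Table 1 can be passed as a query string to narrow the results:

import requests

ENDPOINT = "<rgc-endpoint>"   # placeholder
TOKEN = "<IAM token>"         # placeholder

resp = requests.get(
    f"https://{ENDPOINT}/v1/managed-organization",
    headers={"X-Auth-Token": TOKEN},
    params={"account_id": "<enrolled account ID>"},  # optional filter
)
resp.raise_for_status()
print(resp.json())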
Failed to Run Jobs Related to the sftp-connector Connector Symptom Jobs related to the sftp-connector connector fail to be executed, and errors such as "Failed to obtain the SFTP stream. xxx (Failed to send channel request.)" and "subsystem request failed on channel 0" are displayed.
Configuring the Distributed Cache to Execute MapReduce Jobs Scenarios This section applies to MRS 3.x or later. Distributed caching is useful in the following scenarios: Rolling upgrade During the upgrade, applications must keep the file content (JAR files or configuration files) unchanged.
Supported OSs, Terminals, and Applications Supported OSs Applications deployed on Windows Server 2016 and Windows Server 2019 are supported. Windows Server 2016 Datacenter edition (Chinese) Windows Server 2016 Datacenter edition (English) Windows Server 2019 Datacenter edition (Chinese)
Changing the Password for an OMS Database Access User This section describes how to regularly change the password for an OMS database access user to enhance system O&M security. Impact on the System The OMS service needs to be restarted for the new password to take effect, and the service is unavailable during the restart.
Migrating Archived Data in Alibaba Cloud OSS Alibaba Cloud OSS provides the ossutil tool to restore archived objects with specified prefixes. ossutil is compatible with Windows, Linux, and macOS. Download and install the required version. This section uses Windows as an example. Procedure
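As a rough illustration only, not the documented procedure, the restore could be scripted around ossutil; the command name, the -r (recursive) flag, and the bucket/prefix values below are assumptions to verify against the ossutil version you installed:

import subprocess

bucket = "my-archive-bucket"   # placeholder bucket
prefix = "backups/2023/"       # placeholder prefix of the archived objects

# Invoke ossutil to restore every archived object under the prefix.
subprocess.run(
    ["ossutil", "restore", f"oss://{bucket}/{prefix}", "-r"],
    check=True,                # raise if ossutil exits with an error
)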
DataArts Migration (Real-Time Jobs) Overview of Real-Time Jobs Notes and Constraints Supported Data Sources Authorizing the Use of Real-Time Data Migration Check Before Use Enabling Network Communications Creating a Real-Time Migration Job Configuring a Real-Time Migration Job Tutorials
DataArts Migration (Offline Jobs) Overview of Offline Jobs Notes and Constraints Supported Data Sources Check Before Use Enabling Network Connectivity Creating an Offline Processing Migration Job Configuring an Offline Processing Migration Job Configuring Source Job Parameters Configuring Destination Job Parameters
DataArts Migration (CDM Jobs) Overview Notes and Constraints Supported Data Sources Creating and Managing a CDM Cluster Creating a Link in a CDM Cluster Creating a Job in a CDM Cluster Using Macro Variables of Date and Time Improving Migration Performance Key Operation Guide Tutorials
Where Are the Execution Logs of Spark Jobs Stored? Logs of unfinished Spark jobs are stored in the /srv/BigData/hadoop/data1/nm/containerlogs/ directory on the Core node. Logs of finished Spark jobs are stored in the /tmp/logs/Username/logs directory in HDFS.
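A small sketch for locating the finished-job logs mentioned above, assuming it is run on a cluster node with a configured HDFS client; the username is a placeholder:

import subprocess

username = "sparkuser"   # placeholder: the user who submitted the jobs

# List the aggregated logs of finished Spark jobs in HDFS.
subprocess.run(["hdfs", "dfs", "-ls", f"/tmp/logs/{username}/logs"], check=True)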
How Do I Purchase and Renew VBS? Pay per Use By default, you are charged based on the service duration, which is settled on the hour, with no minimum fee. After registering a cloud service account, top up the account and then you can use VBS. Yearly/Monthly
Does VBS Support Cross-Region Backup and Restoration? No. Currently, VBS supports backup and restoration only within a region, not across regions.