Object Storage Migration Service (OMS) The Organizations service provides Service Control Policies (SCPs) to set access control policies. SCPs do not actually grant any permissions to a principal. They only set the permissions boundary for the principal. When SCPs are attached to
Extreme SSD V2 Disks (OBT) Extreme SSD V2 disks deliver ultra-high IOPS, ultra-high throughput, and ultra-low latency. They are designed for latency-sensitive mission-critical applications and provide sustained IOPS performance. You can buy Extreme SSD V2 disks of a given capacity
Configuring Autoscaling for a DB Instance (OBT) Scenarios As your workloads grow, the used storage of a yearly/monthly DB instance may exceed the initially purchased storage. Additional storage will be billed on a pay-per-use basis. To better adapt to this change, TaurusDB provides
Replicating and Rebuilding a Synchronization Task (OBT) Scenarios You can rebuild a synchronization task for a standard HTAP instance. You can also replicate an existing synchronization task from another standard HTAP instance. Constraints Replicating and rebuilding a synchronization
Cold and Hot Data Separation (OBT) What Is Cold and Hot Data Separation? Constraints Configuring a Cold Table
Binding an EIP to a Proxy Instance (OBT) After a proxy instance is created, you can bind an EIP to it. Later, you can also unbind the EIP from the proxy instance as required. Constraints This function is in the OBT phase. To use it, submit a service ticket. Billing Traffic generated
Will Subsequent Jobs Be Affected If a Job Fails to Be Executed During Scheduling of Dependent Jobs? What Should I Do? Possible Causes One of the jobs that depend on each other fails during scheduling. Solution The subsequent jobs may be suspended, continued, or canceled, depending
What Types of Spark Jobs Can Be Submitted in a Cluster? Question: What Types of Spark Jobs Can Be Submitted in a Cluster? Answer: MRS clusters support Spark jobs submitted in Spark, Spark Script, or Spark SQL mode. Parent topic: Job Management
How Do I Back Up CDM Jobs? Symptom How do I back up CDM jobs? Solution You can use the batch export function of CDM to save all job scripts to a local PC. Then, you can create a cluster and import the jobs again when necessary. Parent topic: Functions
Querying the exe Object List of Jobs (Deprecated) Function This API is used to query the exe object list of all jobs. This API is incompatible with Sahara. URI Format GET /v1.1/{project_id}/job-exes Parameter description Table 1 URI parameter Parameter Mandatory Description project_id
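The URI format above (GET /v1.1/{project_id}/job-exes) can be sketched as a minimal client call. The path comes from the doc; the endpoint host and the X-Auth-Token header are assumptions based on common Huawei Cloud API conventions, not confirmed by this snippet.

```python
import json
import urllib.request

def build_job_exes_url(endpoint: str, project_id: str) -> str:
    """Build the request URL for GET /v1.1/{project_id}/job-exes."""
    return f"{endpoint}/v1.1/{project_id}/job-exes"

def list_job_exes(endpoint: str, project_id: str, token: str) -> dict:
    """Query the exe object list of all jobs (deprecated API).

    The X-Auth-Token header is an assumed authentication mechanism.
    """
    req = urllib.request.Request(
        build_job_exes_url(endpoint, project_id),
        headers={"X-Auth-Token": token},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Since the API is deprecated and incompatible with Sahara, this sketch is for illustration only.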
On What Linux OSs Can I Install the Agent? To use database audit, you need to install its agent on database nodes or application nodes. The database audit agent can be installed on a 64-bit Linux OS. Table 1 provides more details. Table 1 Supported Linux OS versions System Name System
How Do I Manage Jobs Running on DLI? To manage a large number of DLI jobs, you can use the following methods: Manage jobs by group. Group tens of thousands of jobs by type and run each group on a queue. Create IAM users. Alternatively, create IAM users to execute different types of
Using Spark Jobs to Access Data Sources of Datasource Connections Overview Connecting to CSS Connecting to GaussDB(DWS) Connecting to HBase Connecting to OpenTSDB Connecting to RDS Connecting to Redis Connecting to Mongo Parent topic: Spark Jar Jobs
ALM-50401 Number of JobServer Jobs Waiting to Be Executed Exceeds the Threshold Alarm Description The system checks the number of jobs submitted to JobServer every 30 seconds. This alarm is generated when the number of jobs to be executed exceeds 800. Alarm Attributes Alarm ID Alarm
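The alarm condition described above (a check every 30 seconds, alarm when pending jobs exceed 800) can be expressed as a small sketch. The function and constant names here are hypothetical, not actual MRS internals.

```python
# Values taken from the ALM-50401 description above.
CHECK_INTERVAL_SECONDS = 30   # how often the queue length is sampled
PENDING_JOB_THRESHOLD = 800   # alarm fires when pending jobs exceed this

def should_raise_alm_50401(pending_jobs: int) -> bool:
    """Return True if the number of JobServer jobs waiting to be
    executed exceeds the documented threshold."""
    return pending_jobs > PENDING_JOB_THRESHOLD
```

Note the condition is strictly greater than: 800 pending jobs does not trigger the alarm, 801 does.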
How Do I Configure Notifications for All Jobs? Symptom How do I configure notifications for all jobs? Solution On the DataArts Studio console, locate a workspace and click DataArts Factory. In the navigation pane on the left, choose Monitoring > Job Monitoring. Then click the Batch Jobs
Migrating Data to TaurusDB Enterprise Edition (OBT) You can easily migrate data from RDS for MySQL instances to pay-per-use TaurusDB Enterprise Edition instances on the console. There is no need to create a DRS task. IP addresses remain unchanged and data can be transparently migrated
Enabling Binlog Pull for a Proxy Instance (OBT) Scenarios You can enable binlog pull for a proxy instance and then use the proxy address to pull binlogs from the primary node or read replicas. Constraints This function is in the OBT phase. To use it, submit a service ticket. This
What OSs Can Be Used by SAP Products on HUAWEI CLOUD? HUAWEI CLOUD provides the SAP on Cloud solution with dedicated OSs, for example, SUSE Linux Enterprise Server (SLES) 12 SP1 for SAP or later. Parent topic: Purchase
What Is the Number of Concurrent Jobs for Different CDM Cluster Versions? CDM migrates data through data migration jobs. It works in the following way: When data migration jobs are submitted, CDM splits each job into multiple tasks based on the Concurrent Extractors parameter in the