Cross-AZ ECS restoration: Backup data is uploaded to OBS for disaster recovery at the AZ level.
Deployment Scheme
Figure 2 shows the deployment scheme.
Figure 2 Application consistency backup deployment scheme
Constraints
An ECS can be associated with only one backup policy.
You can use shared KMS keys to encrypt secrets and key pairs in DEW, and create encryption tasks for instances in Relational Database Service (RDS), Document Database Service (DDS), and Object Storage Service (OBS). For more information, see Sharing Overview.
https://{endpoint}/v2/{project_id}/clusters/{cluster_id}/sql-execution/{sql_id}

Example Response
Status code: 200
The SQL execution result is queried successfully.
{
  "id" : "20190909_011820_00151_xxxxx",
  "statement" : "show tables",
  "status" : "FINISHED",
  "result_location" : "obs
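The status field in the response can be polled until the query reaches a terminal state. Below is a minimal Node.js sketch, assuming the response body is JSON shaped as shown above; the terminal state names other than FINISHED, the helper names, and the injected fetchJson function are illustrative assumptions, not part of the documented API:

```javascript
// Sketch only: poll the SQL execution result API until the query finishes.
// FAILED/CANCELLED are assumed terminal states; only FINISHED appears above.
const TERMINAL_STATES = new Set(['FINISHED', 'FAILED', 'CANCELLED']);

function buildResultUrl(endpoint, projectId, clusterId, sqlId) {
  // Mirrors the documented URL template.
  return `https://${endpoint}/v2/${projectId}/clusters/${clusterId}/sql-execution/${sqlId}`;
}

async function waitForSql(fetchJson, url, intervalMs = 1000) {
  // fetchJson is injected (e.g. a wrapper around fetch or the SDK)
  // so the sketch can be exercised without a live cluster.
  for (;;) {
    const body = await fetchJson(url);
    if (TERMINAL_STATES.has(body.status)) {
      return body;
    }
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
}
```

A caller would typically inspect result_location in the returned body to fetch the query output from OBS.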
Importing a Connection (to Be Taken Offline)
This API is used to import one or more connection files from OBS to the Data Development module.
When you enter the point in time that you want to restore the DB instance to, RDS downloads the most recent full backup file from OBS to the DB instance. Incremental backups are then replayed to bring the DB instance to the specified point in time.
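The selection logic described above can be sketched as follows. This is an illustration of the mechanism, not an RDS API; the backup object shape (type and time fields) is an assumption made for the example:

```javascript
// Illustrative sketch: given a restore target time, pick the most recent
// full backup taken at or before it, then the incremental backups between
// that full backup and the target, in replay order.
function planPointInTimeRestore(backups, targetTime) {
  const fulls = backups
    .filter(b => b.type === 'full' && b.time <= targetTime)
    .sort((a, b) => b.time - a.time);
  if (fulls.length === 0) {
    throw new Error('no full backup exists at or before the target time');
  }
  const base = fulls[0];
  const incrementals = backups
    .filter(b => b.type === 'incremental' && b.time > base.time && b.time <= targetTime)
    .sort((a, b) => a.time - b.time);
  return { base, incrementals };
}
```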
PARTITION_NAME from information_schema.partitions where table_schema=\"$database\" and table_name=\"$table\" and PARTITION_ORDINAL_POSITION = $partition_order;")
$conn -e"CALL dbms_schs.make_io_transfer(\"start\", \"${database}\", \"${table}\", \"${partition_name}\", \"\", \"obs
Not Using Hard-coded Credentials During Development
If you want to develop an algorithm and publish it to the production environment in ModelArts Standard Notebook, you should check the password, AK/SK, database connection, OBS connection, and SWR connection information used in the code.
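One common way to avoid hard-coding credentials is to read them from environment variables at run time. A minimal sketch; the variable names OBS_ACCESS_KEY_ID and OBS_SECRET_ACCESS_KEY are illustrative, not names mandated by ModelArts or OBS:

```javascript
// Sketch: load the OBS access key pair from the environment instead of
// embedding it in source code. The env variable names are placeholders.
function loadObsCredentials(env = process.env) {
  const ak = env.OBS_ACCESS_KEY_ID;
  const sk = env.OBS_SECRET_ACCESS_KEY;
  if (!ak || !sk) {
    // Fail fast so a missing configuration is caught before any API call.
    throw new Error('OBS credentials are not configured in the environment');
  }
  return { access_key_id: ak, secret_access_key: sk };
}
```

The same pattern applies to database passwords and SWR connection information: keep the values in the runtime environment (or a secret store such as DEW) and look them up by name.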
obsClient.appendObject({
  Bucket : 'bucketname',
  Key : 'objectname',
  Position : result.InterfaceResult.NextPosition,
  Body : 'Hello OBS Again'
}, function(err, result2) {
  if (err) {
    console.error('Error-->' + err);
  }
});
        console.log('NoncurrentVersionExpiration[NoncurrentDays]-->' + result.InterfaceResult.Rules[i]['NoncurrentVersionExpiration']['NoncurrentDays']);
      }
    }
  }
});
To handle the error codes possibly returned during the operation, see OBS
Figure 1 Exporting logs in CSV format
Figure 2 Exporting logs in TXT format
(Optional) Click Configure Dumps to dump the searched logs to a log file in an OBS bucket as a one-off operation. For details, see Adding One-Off Dumps.
If this parameter is selected, the createDataKey and decryptDatakey operations on DEW will not be transferred to OBS/LTS.
NOTE: For details about DEW audit operations, see Operations supported by CTS.
Deselect: On the transfer configuration page, enable Transfer to LTS.
Public services, such as Elastic Cloud Server (ECS), Elastic Volume Service (EVS), Object Storage Service (OBS), Virtual Private Cloud (VPC), Elastic IP (EIP), and Image Management Service (IMS), are shared within the same region.
Figure 1 Cluster Status
For storage-compute decoupled clusters, OBS usage details are displayed in the cluster information.
Alarms
In the Alarms area, you can view all uncleared alarms of the current cluster and the alarms generated in the last seven days.
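The alarm view described above (all uncleared alarms, plus any alarm generated in the last seven days) can be sketched as a simple filter; the alarm object shape below is assumed for illustration only:

```javascript
// Illustrative sketch of the Alarms-area filter: an alarm is shown if it is
// still uncleared, or if it was generated within the last seven days.
const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000;

function visibleAlarms(alarms, now = Date.now()) {
  return alarms.filter(a => !a.cleared || now - a.generatedAt <= SEVEN_DAYS_MS);
}
```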
{
  "id": "b153f6b8-9335-46a1-913e-c2d8f966d4b2",
  "name": "CustomJdk",
  "project_id": "578ac30b81034b89a7255b3af26db9c9",
  "deploy_mode": "virtualmachine",
  "type": "Java",
  "version": "1.0.0",
  "spec": {
    "os": null,
    "sdk": null,
    "parameters": {
      "jdk_url": "obs
Table 3 Process of building a Pangu NLP model dataset
Procedure | Step | Description | Reference
Importing data to the Pangu platform | Creating an import task | Import data stored in OBS into the platform for centralized management, facilitating subsequent processing or publishing. |
source_id}

Example Responses
Status code: 200
Operation successful.
{
  "id" : "90e0b962-c6c1-438c-ba8a-3024fe592bda",
  "name" : "first-source",
  "label" : "first-source",
  "description" : "first event source",
  "provider_type" : "CUSTOM",
  "event_types" : [ {
    "name" : "OBS
with Jenkins to Automatically Build and Perform Rolling Upgrade on Components Deployed on ServiceStage
After the code is developed, you need to package the code into an image package or a JAR package on Jenkins before each rollout, and upload the image package to SWR or the JAR package to OBS
Additionally, you can use the private DNS servers to directly access the private IP addresses of cloud services such as OBS and SMN. Compared with access through the Internet, this approach offers higher performance and lower latency.