Modifying the OMS Service Configuration Based on the security requirements of the user environment, you can modify the Kerberos and LDAP configurations in the OMS on FusionInsight Manager. This section applies only to MRS 3.x or later. System Impacts After the OMS service configuration
Creating Operators for Alerts and Jobs Scenarios You can use a stored procedure to create an operator (notification recipient) for use with alerts and jobs. Prerequisites An RDS for SQL Server DB instance has been connected. Connect to the DB instance through the SQL Server client
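As a rough, unofficial sketch of the stored-procedure approach described above: SQL Server exposes the `msdb.dbo.sp_add_operator` system stored procedure for creating a notification operator. The helper below only builds the T-SQL statement; the operator name and e-mail address are made-up placeholders, and the exact parameters you need should be checked against the SQL Server reference.

```python
# Hypothetical sketch: build the T-SQL that creates a notification
# operator via the msdb.dbo.sp_add_operator system stored procedure.
# The name and e-mail address below are placeholders, not real values.
def build_add_operator_sql(name: str, email: str) -> str:
    """Return an EXEC statement for sp_add_operator."""
    return (
        "EXEC msdb.dbo.sp_add_operator "
        f"@name = N'{name}', "
        "@enabled = 1, "
        f"@email_address = N'{email}';"
    )

sql = build_add_operator_sql("dba_oncall", "oncall@example.com")
print(sql)
```

You would then execute this statement through your SQL Server client connection to the RDS DB instance.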
Managing Jobs Using Tags Scenarios Tag Management Service (TMS) enables you to use tags to manage custom jobs. TMS works with other cloud services to enable tag management. TMS manages tags globally, and other cloud services use these tags to manage their specific tasks. You can manage
Managing Binary SCA Jobs Scenarios This section describes how to search for, delete, or stop a binary SCA job. Prerequisites You have obtained a username and password for logging in to the management console. A job has been added. Checking a Job Log in to the CodeArts Governance console
Flink OpenSource SQL Jobs Reading Data from Kafka and Writing Data to RDS Reading Data from Kafka and Writing Data to GaussDB(DWS) Reading Data from Kafka and Writing Data to Elasticsearch Reading Data from MySQL CDC and Writing Data to GaussDB(DWS) Reading Data from PostgreSQL CDC
Querying Supported Image OSs Function This interface is used to query the list of compatible ECS OSs in the current region. Huawei Cloud has stopped providing Windows images. This interface will no longer be used to query Windows images. URI GET /v1/cloudimages/os_version Table 1
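A minimal sketch of preparing this GET request with the Python standard library only. The endpoint host and the token value are placeholders (assumptions), to be replaced with your region's IMS endpoint and a valid IAM token; only the `/v1/cloudimages/os_version` path comes from the snippet above.

```python
import urllib.request

# Sketch: prepare GET /v1/cloudimages/os_version. The endpoint and token
# below are placeholders; substitute your region endpoint and IAM token.
def build_os_version_request(endpoint: str, token: str) -> urllib.request.Request:
    """Prepare the image-OS query request with the auth header set."""
    url = f"{endpoint}/v1/cloudimages/os_version"
    return urllib.request.Request(
        url, headers={"X-Auth-Token": token}, method="GET"
    )

req = build_os_version_request(
    "https://ims.example-region.myhuaweicloud.com", "<token>"
)
print(req.full_url)
```

Sending the request (`urllib.request.urlopen(req)`) returns the JSON list of compatible OS versions for the region.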
Querying Volcano Jobs in a Namespace Function This API is used to query all Volcano jobs in a specified namespace. Calling Method For details, see Calling APIs. URI GET /apis/batch.volcano.sh/v1alpha1/namespaces/{namespace}/jobs Table 1 Path Parameters Parameter Mandatory Type Description
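The URI above takes `{namespace}` as a path parameter. A small sketch of filling it in (the namespace value is a placeholder):

```python
# Sketch: fill the {namespace} path parameter of the Volcano job-list
# API. "default" below is a placeholder namespace, not a required value.
def volcano_jobs_uri(namespace: str) -> str:
    """Return the GET path for listing Volcano jobs in a namespace."""
    return f"/apis/batch.volcano.sh/v1alpha1/namespaces/{namespace}/jobs"

print(volcano_jobs_uri("default"))
```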
Creating and Managing Scheduled Jobs Context Time-consuming jobs, such as summarizing statistics or synchronizing data from another database, affect service performance if they are performed during the daytime and incur overtime hours if performed at night. To solve this problem,
Managing Model Training Jobs Viewing Training Job Details Viewing the Resource Usage of a Training Job Viewing the Model Evaluation Result Viewing Training Job Events Viewing Training Job Logs Priority of a Training Job Using Cloud Shell to Debug a Production Training Job Copying,
Managing Batch Inference Jobs Viewing Details About a Batch Service Viewing Events of a Batch Service Managing the Lifecycle of a Batch Service Modifying a Batch Service Parent topic: Using ModelArts Standard to Deploy Models for Inference and Prediction
SDKs Related to SQL Jobs Database-Related SDKs Table-Related SDKs Job-related SDKs Parent topic: Java SDK (DLI SDK V1)
SDKs Related to SQL Jobs Database-Related SDKs Table-Related SDKs Job-related SDKs Parent topic: Python SDK (DLI SDK V1)
SDKs Related to Spark Jobs Prerequisites You have configured the Java SDK environment by following the instructions provided in Overview. You have initialized the DLI Client by following the instructions provided in Initializing the DLI Client and created queues by following the instructions
SDKs Related to Flink Jobs Prerequisites You have configured the Java SDK environment by referring to Overview. You have initialized the DLI client by referring to Initializing the DLI Client and created queues by referring to Queue-Related SDKs. Creating a SQL Job DLI provides an
SDKs Related to Spark Jobs For details about the dependencies and complete sample code, see Overview. Submitting Batch Jobs DLI provides an API to submit batch jobs. The example code begins as follows: def submit_spark_batch_job(dli_client
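The snippet above is cut off at the function signature. As a rough, unofficial sketch of what a batch-submission body might contain: the field names `file`, `className`, and `queue` are assumptions modeled on common Spark batch-job parameters, and should be verified against the DLI SDK reference before use.

```python
# Hypothetical sketch only: the keys below (file, className, queue) are
# assumptions about a DLI Spark batch-job request body; verify them
# against the DLI SDK reference. All argument values are placeholders.
def build_spark_batch_body(jar_path: str, main_class: str, queue: str) -> dict:
    """Assemble a minimal batch-job request body."""
    return {"file": jar_path, "className": main_class, "queue": queue}

body = build_spark_batch_body("obs://bucket/app.jar", "com.example.Main", "queue1")
print(sorted(body))
```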
Creating Operators for Alerts and Jobs Scenarios You can use a stored procedure to create an operator (notification recipient) for use with alerts and jobs. Prerequisites An RDS for SQL Server DB instance has been connected. You can connect to the DB instance through a SQL Server
OSs That Can Be Collected and Migrated Supported Windows OSs MgC supports the collection and migration of all Windows OS versions. Supported Linux OSs MgC supports the collection and migration of specific Linux distributions and versions that use OpenSSH 7.0 or later, but if a source
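The OpenSSH 7.0 minimum mentioned above can be checked against a server's SSH version banner. A small sketch, assuming the banner follows the usual `OpenSSH_<major>.<minor>...` format:

```python
import re

# Sketch: check whether an SSH version banner meets the OpenSSH 7.0
# minimum that the Linux collection/migration support above requires.
def openssh_at_least(banner: str, minimum: tuple = (7, 0)) -> bool:
    """Parse a version like 'OpenSSH_8.2p1 ...' and compare to minimum."""
    m = re.search(r"OpenSSH[_-](\d+)\.(\d+)", banner)
    if not m:
        return False  # not an OpenSSH banner at all
    return (int(m.group(1)), int(m.group(2))) >= minimum

print(openssh_at_least("OpenSSH_8.2p1 Ubuntu"))  # → True
```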
Professional Event Stream Jobs Creating a Professional Event Stream Job Deleting a Professional Event Stream Job Enabling a Professional Event Stream Job Disabling a Professional Event Stream Job Configuring a Professional Event Stream Job Querying Details About a Professional Event
APIs (in OBT) API Version Queries Intelligent O&M Development Tool