Using Hudi to Develop Jobs in DLI
- Submitting a Spark SQL Job in DLI Using Hudi
- Submitting a Spark Jar Job in DLI Using Hudi
- Submitting a Flink SQL Job in DLI Using Hudi
- Using HetuEngine on Hudi
Managing Program Packages of Jar Jobs
- Package Management Overview
- Creating a DLI Package
- Configuring DLI Package Permissions
- Changing the DLI Package Owner
- Managing DLI Package Tags
- DLI Built-in Dependencies
Parent topic: Common DLI Management Operations
APIs Related to SQL Jobs (Discarded)
- Submitting a SQL Job (Discarded)
- Canceling a Job (Discarded)
- Querying the Job Execution Result - Method 1 (Discarded)
- Querying the Job Execution Result - Method 2 (Discarded)
Parent topic: Out-of-Date APIs
APIs Related to Flink Jobs (Discarded)
- Querying Job Monitoring Information (Discarded)
Parent topic: Out-of-Date APIs
Connecting to DLI and Submitting SQL Jobs Using JDBC
Scenario: In Linux or Windows, you can connect to the DLI server using JDBC. Jobs submitted to DLI through JDBC are executed on the Spark engine. After the function reconstruction in JDBC 2.X, query results can only be accessed
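The JDBC flow described above can be sketched as follows. This is a minimal illustration using the standard `java.sql` API only; the `jdbc:dli://` URL scheme, the `queuename` parameter, and the endpoint/project placeholders are assumptions for illustration, not the official DLI connection-string format — consult the DLI JDBC driver documentation for the real values, and place the driver JAR on the classpath before connecting.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;

public class DliJdbcExample {

    // Build a JDBC URL from the region endpoint, project ID, and queue name.
    // The "jdbc:dli://" scheme and "queuename" parameter are placeholders.
    static String buildUrl(String endpoint, String projectId, String queue) {
        return "jdbc:dli://" + endpoint + "/" + projectId + "?queuename=" + queue;
    }

    // Open a connection, submit a SQL statement, and print the first column
    // of each result row. Requires a suitable JDBC driver on the classpath.
    static void runQuery(String url, String user, String pwd, String sql) throws SQLException {
        Properties props = new Properties();
        props.setProperty("user", user);
        props.setProperty("password", pwd);
        try (Connection conn = DriverManager.getConnection(url, props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }

    public static void main(String[] args) {
        // Placeholder endpoint, project, and queue values.
        String url = buildUrl("dli.example-region.myhuaweicloud.com", "my-project-id", "my_queue");
        System.out.println(url);
        // With real credentials and the DLI JDBC driver installed, a query
        // could be submitted like:
        // runQuery(url, accessKey, secretKey, "SELECT * FROM my_table LIMIT 10");
    }
}
```

Because the jobs run on the Spark engine, the SQL passed to `executeQuery` is Spark SQL; the try-with-resources blocks ensure the connection, statement, and result set are closed even if the query fails.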
How Do I Manage Jobs Running on DLI?
To manage a large number of DLI jobs, you can use the following methods:
- Manage jobs by group: group tens of thousands of jobs by type and run each group on a queue.
- Create IAM users. Alternatively, create IAM users to execute different types of
Using Spark Jobs to Access Data Sources of Datasource Connections
- Overview
- Connecting to CSS
- Connecting to GaussDB(DWS)
- Connecting to HBase
- Connecting to OpenTSDB
- Connecting to RDS
- Connecting to Redis
- Connecting to Mongo
Parent topic: Spark Jar Jobs
Can I Upload Configuration Files for Flink Jar Jobs? Uploading a Configuration File for a Flink Jar Job You can upload configuration files for custom jobs (Jar). Upload the configuration file to DLI through Package Management. In the Other Dependencies area of the Flink Jar job, select
How Do I Authorize a Subuser to View Flink Jobs? A subuser can view queues but cannot view Flink jobs. You can grant the subuser permission through either DLI or IAM.
Authorization on DLI: Log in to the DLI console using a tenant account, a job owner account, or an account with the DLI Service Administrator
Flink OpenSource SQL Jobs Using DEW to Manage Access Credentials Scenario When DLI writes the output data of Flink jobs to MySQL or GaussDB(DWS), you need to set attributes such as the username and password in the connector. However, information such as usernames and passwords is
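The idea above — keeping credentials out of the job text — might look like the following Flink SQL table definition. This is a hedged sketch only: the `${dew.secret}` placeholder syntax and the option names shown are illustrative assumptions, not the actual DLI/DEW integration options; check the DLI Flink OpenSource SQL documentation for the real parameter names.

```sql
-- Hypothetical sketch: resolving the password from DEW instead of
-- hard-coding it. Option names and the placeholder syntax are assumed.
CREATE TABLE mysql_sink (
  id INT,
  name STRING
) WITH (
  'connector'  = 'jdbc',
  'url'        = 'jdbc:mysql://mysql-host:3306/demo',
  'table-name' = 'demo_table',
  'username'   = 'demo_user',
  'password'   = '${dew.secret}'   -- resolved from DEW at run time (placeholder)
);
```

The benefit is that the job definition can be shared or versioned without exposing the credential, which stays in DEW under its own access control.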
How Do I View the Resource Usage of DLI Spark Jobs?
Viewing the Configuration of a Spark Job: Log in to the DLI console. In the navigation pane, choose Job Management > Spark Jobs. In the job list, locate the target job and click the icon next to Job ID to view the parameters of the job. The
How Do I Check for a Backlog of Jobs in the Current DLI Queue?
Symptom: You need to check whether a large number of jobs are in the Submitting and Running states on the queue.
Solution: Use Cloud Eye to view jobs in different states on the queue. The procedure is as follows: Log in to the management