This configuration is only available for Flink OpenSource SQL jobs and Flink Jar jobs. Locate the desired Flink job, click More in the Operation column, and select Runtime Configuration.
If this option is selected, you need to set the following parameters: OBS Bucket: Select an OBS bucket to store user job logs. If the selected OBS bucket is not authorized, click Authorize.
Related topics: Flink Job Agencies; Flink OpenSource SQL Jobs Using DEW to Manage Access Credentials; Flink Jar Jobs Using DEW to Acquire Access Credentials for Reading and Writing Data from and to OBS; Obtaining Temporary Credentials from a Flink Job's Agency for Accessing Other Cloud Services
That is, after account B authorizes account A, account A can read the metadata and permission information of account B's OBS bucket and has read and write permissions on the authorized path. Account A can then export data to account B's OBS path.
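A minimal sketch of such a cross-account export in DLI Spark SQL (the table names and the OBS path are hypothetical, and the OPTIONS path clause assumes DLI's OBS-table syntax; check the DLI SQL reference for exact clauses):

```sql
-- Hypothetical OBS table in account A's DLI that points at account B's bucket.
-- This works only after account B has granted account A permissions on the bucket and path.
CREATE TABLE IF NOT EXISTS export_to_b (id INT, name STRING)
USING csv
OPTIONS (path 'obs://account-b-bucket/shared/export/');

-- Account A exports its data to account B's OBS path.
INSERT INTO export_to_b
SELECT id, name FROM account_a_table;
```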
Select Save Job Log, and specify the OBS bucket for saving job logs. For details about how to use data types, see Format. In Flink 1.15, OBS tables can currently be created only by using Hive syntax, that is, through Hive dialect DDL statements.
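As a hedged sketch, a Hive-dialect DDL statement for an OBS-backed table in Flink 1.15 might look as follows (the dialect switch assumes a configured Hive catalog, and the table name and bucket path are placeholders, not taken from this page):

```sql
-- Switch the Flink SQL session to the Hive dialect (requires a Hive catalog to be set up).
SET 'table.sql-dialect' = 'hive';

-- Hypothetical OBS-backed table created with Hive DDL; the bucket and path are placeholders.
CREATE TABLE logs_obs (
  log_id STRING,
  event_time TIMESTAMP
)
STORED AS PARQUET
LOCATION 'obs://your-bucket/warehouse/logs_obs';
```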
Precautions: You can configure the spark.sql.shuffle.partitions parameter to set the number of files inserted into the OBS bucket for a non-DLI table.
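For example, to produce fewer output files when inserting into an OBS table (the value 10 and the table names are illustrative assumptions; tune the partition count to your data volume):

```sql
-- Fewer shuffle partitions mean fewer output files written to the OBS path.
SET spark.sql.shuffle.partitions = 10;

-- Hypothetical non-DLI (OBS) target table; roughly one file is written per partition.
INSERT INTO obs_target_table
SELECT * FROM source_table;
```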
More Problems Related to Flink Jobs How Do I Map an OBS Table to a DLI Partitioned Table?
Click the name of the corresponding Flink job, choose Run Log, click OBS Bucket, and locate the folder of the log you want to view according to the date.
Request
Table 2 Request parameters
Parameter | Mandatory | Type | Description
paths | Yes | Array of Strings | List of OBS object paths.
Username: Username for logging in to the security cluster.
krb5_conf Path: OBS path to which the krb5.conf file is uploaded.
NOTE: The renew_lifetime configuration item under [libdefaults] must be removed from krb5.conf.
Example
CREATE SCHEMA web;
DESCRIBE SCHEMA web;

Describe Schema
-------------------------------------------------------------------------
web    obs://bucket/user/hive/warehouse/web.db    dli    USER
(1 row)
Why Are Logs Not Written to the OBS Bucket After a DLI Flink Job Fails to Be Submitted for Running? Why Is the Flink Job Abnormal Due to Heartbeat Timeout Between JobManager and TaskManager?
Solution: Find the DLI job bucket on the OBS management console and view the policy of the selected bucket.
Example: obs://rest-authinfo/tools/oracle/driver/ojdbc6.jar If the driver JAR file defined in this parameter is updated, you need to restart the queue for the update to take effect.
Can I Import OBS Bucket Data Shared by Other Tenants into DLI?
Regions and AZs
Can a Member Account Use Global Variables Created by Other Member Accounts?
Is DLI Affected by the Apache Spark Command Injection Vulnerability (CVE-2022-33891)?
How Do I Manage Jobs Running on DLI?
For details, see Creating an OBS Table or Creating a DLI Table.
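As a rough sketch of the difference between the two table types in DLI Spark SQL (the table names are hypothetical, and the exact clauses should be checked against the linked topics):

```sql
-- OBS table: the data lives in the OBS path you specify (the path is a placeholder).
CREATE TABLE IF NOT EXISTS obs_orders (order_id INT, amount DOUBLE)
USING csv
OPTIONS (path 'obs://your-bucket/orders/');

-- DLI table: the data is managed by DLI itself, so no OBS path is given.
CREATE TABLE IF NOT EXISTS dli_orders (order_id INT, amount DOUBLE)
USING parquet;
```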
Select Save Job Log and select an OBS bucket. If you are not authorized to access the bucket, click Authorize. This allows job logs to be saved to your OBS bucket. If a job fails, the logs can be used for fault locating.
Operation | Resource Type | Trace Name
to a queue | queue | queueAuthorize
Modifying the CIDR block of a queue | queue | replaceQueue
Restarting a queue | queue | queueActions
Scaling out/in a queue | queue | queueActions
Submitting a job (SQL) | queue | submitJob
Canceling a job (SQL) | jobs | cancelJob
Granting DLI the permission to access OBS