When data is stored on OBS, storage charges are billed by OBS, not DLI. Billing for scanned data: you are billed based on the volume of data scanned by each job, in GB.
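The per-scan billing rule above can be sketched as simple arithmetic. This is an illustration only; the unit price is an assumed placeholder, not an actual DLI rate.

```python
# Hypothetical illustration of per-scan billing: DLI bills each job by the
# volume of data scanned, in GB. The price is a made-up placeholder value.
def scan_cost(bytes_scanned: int, price_per_gb: float) -> float:
    """Return the charge for one job, given scanned bytes and a unit price."""
    gb = bytes_scanned / (1024 ** 3)
    return gb * price_per_gb

# Example: a job that scans 5 GiB at an assumed price of 0.05 per GB.
print(round(scan_cost(5 * 1024 ** 3, 0.05), 2))  # prints 0.25
```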
EndPoint (OBSENDPOINT): mandatory. OBS endpoint address obs.
Make sure that you have uploaded the keys and certificates to the specified OBS path and added them as other dependencies in the job configuration.
Click the name of the corresponding Flink job, choose Run Log, click OBS Bucket, and locate the folder of the log you want to view according to the date.
`obs://bucket0/db0/table0`;
The path must point to an OBS parallel file system bucket and cannot end with a slash (/).
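The path rule above can be sketched as a small check. The helper below is an assumption for illustration, not a DLI API; it only encodes the two constraints stated in the text.

```python
# Minimal sketch (assumed helper, not a DLI API) of the path rule:
# the location must be an OBS path and must not end with a slash.
def validate_obs_path(path: str) -> str:
    if not path.startswith("obs://"):
        raise ValueError("expected an OBS path starting with obs://")
    if path.endswith("/"):
        raise ValueError("the path must not end with a slash (/)")
    return path

validate_obs_path("obs://bucket0/db0/table0")  # accepted
```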
When the metadata service provided by DLI is used, this command does not support OBS paths. System Response: Check whether the job status is Successful, and view the job log to confirm whether any exception occurred.
Prerequisites: None. Precautions: When creating a Flink OpenSource SQL job, set Flink Version to 1.12 on the Running Parameters tab of the job editing page, select Save Job Log, and specify the OBS bucket for saving job logs.
Importing Data to DLI: DLI allows you to analyze and query data stored in OBS without migrating it; simply upload your data to OBS and use DLI for data analysis. Alternatively, migrate data from various sources to DLI for central storage and processing.
sparkSession.sparkContext().addFile("obs://bucket_name/path/transport-keystore.jks");
sparkSession.sparkContext().addFile("obs://bucket_name/path/truststore.jks");
// Obtain the path of the current working directory.
String path = SparkFiles.getRootDirectory();
Common scenarios for creating an agency: allowing DLI to read and write data from and to OBS, for example to transfer logs; allowing DLI to access DEW to obtain data access credentials; and allowing DLI to access catalogs to obtain metadata.
iam_exist", "quote_char": "\"", "start_time": 1517385246111, "table_name": "DLI_table20", "timestamp_format": "yyyy-MM-dd HH:mm:ss", "with_column_header": false }

Query jobs of the Export type

{ "is_success": true, "message": "", "compress": "none", "data_path": "obs
Click the name of the corresponding Flink job, choose Run Log, click OBS Bucket, and locate the folder of the corresponding log based on the job running date.
The options are as follows:
MANAGED: DLI table
EXTERNAL: OBS table
VIEW: view
data_type: No. String. Data type, including CSV, Parquet, ORC, JSON, and Avro.
data_location: No. String. Path for storing data, which is an OBS path.
storage_properties: No. Array of objects. Storage attribute,
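The fields described above can be illustrated with a minimal sketch. The field names come from the description; all values are hypothetical examples, not real DLI API output.

```python
# Illustrative record only; values are made up, not real DLI API output.
table_info = {
    "table_type": "EXTERNAL",  # MANAGED (DLI table), EXTERNAL (OBS table), or VIEW
    "data_type": "parquet",    # CSV, Parquet, ORC, JSON, or Avro
    "data_location": "obs://my-bucket/db0/table0",  # assumed OBS path
    "storage_properties": [    # array of objects, each a storage attribute
        {"key": "compression", "value": "snappy"},  # hypothetical attribute
    ],
}

assert table_info["table_type"] in {"MANAGED", "EXTERNAL", "VIEW"}
```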
The default value is 10. obs_bucket: No. String. OBS bucket where users are authorized to save the snapshot. This parameter is valid only when checkpoint_enabled is set to true.
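The dependency between the two parameters above can be sketched as follows. This is not DLI SDK code, just an illustration of the rule that obs_bucket takes effect only when checkpoint_enabled is true.

```python
from typing import Optional

# Sketch (not DLI SDK code) of the rule above: obs_bucket is required and
# takes effect only when checkpoint_enabled is true.
def checkpoint_config(checkpoint_enabled: bool,
                      obs_bucket: Optional[str] = None) -> dict:
    cfg = {"checkpoint_enabled": checkpoint_enabled}
    if checkpoint_enabled:
        if not obs_bucket:
            raise ValueError("obs_bucket is required when checkpoint_enabled is true")
        cfg["obs_bucket"] = obs_bucket
    return cfg
```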
entrypoint="obs://your_obs_bucket_name/your/flink/job.jar", # Program package uploaded to OBS, containing the user-defined job main class.
Database: default Owner: admintest LastAccessTime: 0 Location: obs
Make sure you have authorized DLI to use OBS buckets for saving the SQL execution plans of user jobs. SQL execution plans are stored in OBS buckets, which incur storage charges; the system does not automatically delete them.
Select Save Job Log and specify the OBS bucket for saving job logs. Storing authentication credentials such as usernames and passwords in code or plaintext poses significant security risks. You are advised to use DEW to manage credentials instead.
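The credential guidance above can be sketched minimally: keep secrets out of code and plaintext configuration. Here they are read from environment variables; in production, retrieving them from DEW is the recommended approach. The variable names are assumptions for illustration only.

```python
import os

# Sketch of the principle above: never hard-code credentials. The
# environment variable names below are assumptions for illustration.
def load_credentials() -> tuple:
    ak = os.environ["DLI_ACCESS_KEY"]  # set outside the code base
    sk = os.environ["DLI_SECRET_KEY"]
    return ak, sk
```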