What Are the Differences Between DLI Tables and OBS Tables? DLI tables store data within the DLI service; the storage path is managed by DLI and is not visible to you. OBS tables store data in your own OBS buckets, so you can manage the source data files yourself.
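The distinction can be illustrated with two hedged CREATE TABLE sketches. The table names, column list, and bucket path below (sample_dli_table, sample_obs_table, obs://mybucket/...) are illustrative assumptions, and the USING/OPTIONS form follows common Spark SQL conventions; verify the exact syntax against the DLI SQL reference.

```sql
-- DLI table: no path is specified; DLI manages the storage location internally.
CREATE TABLE sample_dli_table (id INT, name STRING);

-- OBS table: the data files live in your own bucket, under a path you control.
CREATE TABLE sample_obs_table (id INT, name STRING)
USING parquet
OPTIONS (path "obs://mybucket/sample_obs_table/");
```

Because DLI manages the storage for the first table, its files are not directly visible to you; the second table's files can be inspected and managed in OBS.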
Flink Jar Jobs: Using DEW to Acquire Access Credentials for Reading and Writing Data from and to OBS. Instructions on using DEW to acquire an AK/SK for reading data from and writing data to OBS in Flink Jar jobs.
database_name: Name of the database, consisting of letters, numbers, and underscores (_)
table_name: Name of the table in the database, consisting of letters, numbers, and underscores (_)
bucket_name: OBS bucket name
tbl_path: Storage location of the Delta table in the OBS bucket
using delta: Defines and creates a Delta table
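A sketch of how these parameters fit together in a CREATE TABLE statement; the column list and the names delta_db, my_table, and mybucket are illustrative assumptions, not values from the original documentation.

```sql
-- Creates a Delta table whose data files are stored at the given OBS path.
CREATE TABLE delta_db.my_table (id INT, name STRING)
USING DELTA
LOCATION 'obs://mybucket/delta/my_table';
```

Here delta_db plays the role of database_name, my_table of table_name, mybucket of bucket_name, and delta/my_table of tbl_path.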
To set the OBS bucket for storing the job logs, specify a bucket for OBS Bucket. If the selected OBS bucket is not authorized, click Authorize. The logs are saved in the following path: Bucket name/jobs/logs/Directory starting with the job ID.
table_name: Name of the table in the database, consisting of letters, numbers, and underscores (_)
bucket_name: OBS bucket name
tbl_path: Storage location of the Delta table in the OBS bucket
column: Target column to be updated
EXPRESSION: Expression of the source table column to be updated in the target column
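The parameters above map onto an UPDATE statement roughly as follows; the OBS path and the status and order_id columns are illustrative assumptions.

```sql
-- Update rows of a Delta table addressed by its OBS path.
-- `status` plays the role of the column parameter, 'shipped' of EXPRESSION,
-- and the WHERE clause filters which records are updated.
UPDATE delta.`obs://mybucket/delta/orders`
SET status = 'shipped'
WHERE order_id = 1001;
```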
table_name: Name of the table in the database, consisting of letters, numbers, and underscores (_)
bucket_name: OBS bucket name
tbl_path: Storage location of the Delta table in the OBS bucket
boolExpression: Filter conditions for deleting records
Required Permissions
SQL permissions: Table 2 lists the permissions required for executing DELETE.
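As a sketch of the DELETE form, with an assumed OBS path and an assumed status column standing in for boolExpression:

```sql
-- Delete records matching the filter condition from a Delta table at an OBS path.
DELETE FROM delta.`obs://mybucket/delta/orders`
WHERE status = 'cancelled';
```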
DLI.0002: FileNotFoundException: getFileStatus on obs://xxx: status [404]. Solution: Check whether another job has deleted the table information. DLI does not allow multiple jobs to read and write the same table at the same time.
Show detailed information about a table: DESCRIBE DETAIL [database_name.]table_name|DELTA.`obs://bucket_name/tbl_path`;
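Both addressing forms from the syntax above can be sketched as follows; the names delta_db, my_table, and mybucket are illustrative assumptions.

```sql
-- By database and table name:
DESCRIBE DETAIL delta_db.my_table;

-- By OBS path, using the DELTA.`...` form:
DESCRIBE DETAIL DELTA.`obs://mybucket/delta/my_table`;
```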
Append mode: INSERT INTO [database_name.]table_name|DELTA.`obs://bucket_name/tbl_path` select query; Overwrite mode: INSERT OVERWRITE [database_name.]table_name|DELTA.`obs://bucket_name/tbl_path` select query;
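A sketch of the two modes; the table, bucket, and staging_table names are illustrative assumptions.

```sql
-- Append mode: add the query results to the existing table contents.
INSERT INTO delta_db.my_table SELECT id, name FROM staging_table;

-- Overwrite mode: replace the table contents, addressing the table by its OBS path.
INSERT OVERWRITE DELTA.`obs://mybucket/delta/my_table`
SELECT id, name FROM staging_table;
```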
"obs:bucket:GetBucketAcl", "obs:bucket:GetBucketStoragePolicy", "obs:object:AbortMultipartUpload", "obs:object:DeleteObjectVersion", "obs:object:GetObjectAcl", "obs:bucket:ListBucketVersions",
If you have configured an OBS bucket to store job logs, you can access it to download and check historical logs. For details about how to upload files to OBS, see Uploading an Object in Object Storage Service Getting Started.
For example, if the current OBS table directory is obs://bucketName/filePath and a Trash directory has been created in the OBS table directory, you can set the trash bin directory to obs://bucketName/filePath/Trash.
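As a sketch, such a trash bin directory is typically set as a table property. The property key dli.multi.version.trash.dir below is an assumption based on DLI's multiversion feature, and my_obs_table is an illustrative name; verify both against the DLI SQL reference before use.

```sql
-- Assumed property key for the multiversion trash directory; verify before use.
ALTER TABLE my_obs_table
SET TBLPROPERTIES ("dli.multi.version.trash.dir" = "obs://bucketName/filePath/Trash");
```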
"obs:bucket:GetBucketVersioning", "obs:object:GetObject", "obs:object:GetObjectVersionAcl", "obs:object:DeleteObject", "obs:object:ListMultipartUploadParts", "obs:bucket:HeadBucket",
Partition-related Syntax
Adding Partition Data (Only OBS Tables Supported)
Renaming a Partition (Only OBS Tables Supported)
Deleting a Partition
Deleting Partitions by Specifying Filter Criteria (Only OBS Tables Supported)
Altering the Partition Location of a Table (Only OBS Tables Supported)
Prerequisites Before the configuration, create an OBS bucket or parallel file system (PFS). In big data scenarios, you are advised to create a PFS. PFS is a high-performance file system provided by OBS, with access latency in milliseconds.
Spark Jar Jobs
Using Spark Jar Jobs to Read and Query OBS Data
Using the Spark Job to Access DLI Metadata
Using Spark Jobs to Access Data Sources of Datasource Connections
Using DEW to Acquire Access Credentials for Reading and Writing Data from and to OBS
Obtaining
Resource Planning and Costs
Table 1 Resource planning and costs (columns: Resource, Description, Cost)
OBS: You need to create an OBS bucket and upload data to OBS for data analysis using DLI.
Why Is No Data Queried in the DLI Table Created Using the OBS File Path When Data Is Written to OBS by a Flink Job Output Stream?
"datasource_type": "Kafka_SSL", "create_time": 1578896427789, "update_time": 1578898059677, "owner": "ei_dlics_d00352221", "truststore_location": "obs://lan-1/cer/truststore.jks", "keystore_location": "obs://lan-1/cer/keystore.jks" }, { "auth_info_name": "lan3", "datasource_type":