of letters, numbers, and underscores (_)
bucket_name: OBS bucket name
tbl_path: Storage location of the Delta table in the OBS bucket
boolExpression: Filter conditions for deleting records
Required Permissions
SQL permissions: see Table 2 Permissions required for executing DELETE.
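A minimal sketch of how these parameters fit together, deleting records from a Delta table addressed by its OBS storage location; the bucket name, table path, and filter column are hypothetical placeholders, and the WHERE clause plays the role of boolExpression:

DELETE FROM delta.`obs://bucket_name/tbl_path` WHERE order_status = 'CANCELLED';  -- order_status is an assumed column used only for illustration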
Show detailed information about a table: DESCRIBE DETAIL [database_name.]table_name|DELTA.`obs://bucket_name/tbl_path`;
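For example, either of the following statements (with hypothetical database, table, bucket, and path names) would show the detailed information of a Delta table:

DESCRIBE DETAIL db1.delta_table1;
DESCRIBE DETAIL DELTA.`obs://bucket_name/tbl_path`;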
DLI.0002: FileNotFoundException: getFileStatus on obs://xxx: status [404]
Solution
Check whether another job has deleted the table information. DLI does not allow multiple jobs to read and write the same table at the same time.
Overwrite mode: INSERT OVERWRITE [database_name.]table_name|DELTA.`obs://bucket_name/tbl_path` select query;
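A hedged sketch of the overwrite form, assuming a hypothetical source table src_orders and placeholder bucket and path names:

INSERT OVERWRITE DELTA.`obs://bucket_name/tbl_path` SELECT * FROM src_orders;  -- src_orders is an assumed source table for illustration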
If you have configured an OBS bucket to store job logs, you can access it to download and check historical logs. For how to upload files to an OBS bucket, see Uploading an Object in the Object Storage Service Getting Started.
For example, if the current OBS table directory is obs://bucketName/filePath and a Trash directory has been created in the OBS table directory, you can set the trash bin directory to obs://bucketName/filePath/Trash.
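If the trash directory is set as a table property, the statement might look like the sketch below; the property key dli.multi.version.trash.dir and the table name are assumptions for illustration and should be checked against the DLI multiversion documentation:

ALTER TABLE db1.obs_table1 SET TBLPROPERTIES ('dli.multi.version.trash.dir' = 'obs://bucketName/filePath/Trash');  -- property key and table name are assumed, not confirmed by this page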
"obs:bucket:GetBucketAcl", "obs:bucket:GetBucketStoragePolicy", "obs:object:AbortMultipartUpload", "obs:object:DeleteObjectVersion", "obs:object:GetObjectAcl", "obs:bucket:ListBucketVersions",
", "obs:bucket:GetBucketVersioning", "obs:object:GetObject", "obs:object:GetObjectVersionAcl", "obs:object:DeleteObject", "obs:object:ListMultipartUploadParts", "obs:bucket:HeadBucket",
Prerequisites
Before the configuration, create an OBS bucket or parallel file system (PFS). In big data scenarios, you are advised to create a PFS. PFS is a high-performance file system provided by OBS, with access latency in milliseconds.
Partition-related Syntax
Adding Partition Data (Only OBS Tables Supported)
Renaming a Partition (Only OBS Tables Supported)
Deleting a Partition
Deleting Partitions by Specifying Filter Criteria (Only Supported on OBS Tables)
Altering the Partition Location of a Table (Only OBS Tables Supported)
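As a brief illustration of the first item in the list above, adding partition data to an OBS table typically takes the following form; the table name, partition column, and OBS path are placeholders:

ALTER TABLE obs_sales ADD PARTITION (dt = '2024-01-01') LOCATION 'obs://bucket_name/sales/dt=2024-01-01';  -- table name, partition column, and path are assumed for illustration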
Spark Jar Jobs
Using Spark Jar Jobs to Read and Query OBS Data
Using the Spark Job to Access DLI Metadata
Using Spark Jobs to Access Data Sources of Datasource Connections
Spark Jar Jobs Using DEW to Acquire Access Credentials for Reading and Writing Data from and to OBS
Obtaining
Resource Planning and Costs
Table 1 Resource planning and costs (columns: Resource, Description, Cost)
OBS: You need to create an OBS bucket and upload data to OBS for data analysis using DLI.
Why Is No Data Queried in the DLI Table Created Using the OBS File Path When Data Is Written to OBS by a Flink Job Output Stream?
": "Kafka_SSL", "create_time": 1578896427789, "update_time": 1578898059677, "owner": "ei_dlics_d00352221", "truststore_location": "obs://lan-1/cer/truststore.jks", "keystore_location": "obs://lan-1/cer/keystore.jks" }, { "auth_info_name": "lan3", "datasource_type":
String OBS path of the truststore configuration file keystore_location String OBS path of the keystore configuration file owner String Username Example Request None Example Response { "count": 19, "auth_infos": [{ "auth_info_name": "lan2", "datasource_type": "Kafka_SSL",
In the current version, the table location must be specified and can only be set to an OBS path. If a table is created using delta.`OBS path`, it cannot be found by running SHOW TABLES.
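A minimal sketch of the difference, with hypothetical database, table, and column names and placeholder OBS paths; only the first form registers the table so that SHOW TABLES can find it:

CREATE TABLE db1.delta_table1 (id INT, name STRING) USING DELTA LOCATION 'obs://bucket_name/tbl_path';  -- registered in the catalog; visible via SHOW TABLES
CREATE TABLE delta.`obs://bucket_name/tbl_path2` (id INT, name STRING) USING DELTA;  -- created by OBS path only; not visible via SHOW TABLES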
For example, if you select obs://bucket/src1/ as the bucket path, the job result export path is obs://bucket/src1/test.csv.
Export Mode (mandatory: Yes): New OBS directory. If you select this mode, a new folder path is created and the job results are saved to this path.
How Can I Resolve Data Inconsistencies When Importing Data from DLI to OBS?
is the actual OBS path for storing data.