Keywords

Table 1 Keywords

tablename: Name of the target DLI or OBS table on which the TRUNCATE statement is run.
partcol1: Name of the partition to be deleted from the DLI or OBS table.

Precautions

Only the data in a DLI or OBS table can be deleted.
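A minimal sketch of the TRUNCATE statement the keywords above describe; the table name, partition column, and partition value below are illustrative placeholders, not from the original documentation:

```sql
-- Delete all data from a DLI or OBS table (table name is illustrative)
TRUNCATE TABLE student;

-- Delete only the data in one partition of a partitioned table
TRUNCATE TABLE student PARTITION (classNo = 20);
```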
Syntax

DROP TABLE [IF EXISTS] [db_name.]table_name;

Keywords

If the table is stored in OBS, only the metadata is deleted; the data stored on OBS is retained. If the table is stored in DLI, both the data and the corresponding metadata are deleted.
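Two examples of the syntax above; the database and table names are illustrative:

```sql
-- Drop a table in the current database only if it exists
DROP TABLE IF EXISTS testtable;

-- Drop a table qualified with its database name
DROP TABLE IF EXISTS testdb.testtable;
```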
Description: File path, which is the OBS path.
Parent topic: Identifiers
Currently, data can be exported only from a DLI table to OBS, and the OBS path must be specified down to the folder level. The OBS path cannot contain commas (,), and the OBS bucket name cannot end with a string matching the regular expression .[0-9]+(.*).
You can use it to import data stored in OBS into a created DLI or OBS table. The example code is as follows: //Instantiate the importJob object.
Upload the files you want to access to OBS, and you can then analyze the data using Spark jobs. To upload data stored in SFTP to an OBS bucket, use the OBS management console or command-line tools.
Solution: Check the permissions of the OBS bucket to ensure that the account can access the OBS bucket mentioned in the error message. If it cannot, contact the OBS bucket administrator to grant access permissions on the bucket. Parent topic: SQL Job O&M
Partitioned tables are classified into OBS tables and DLI tables. You can delete one or more partitions from a DLI or OBS partitioned table by specifying the partitions to remove; OBS partitioned tables additionally support deleting partitions by specifying filter criteria.
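A hedged sketch of the partition-deletion statements described above, using Spark-style ALTER TABLE syntax; the table name, partition column, and values are illustrative placeholders:

```sql
-- Drop a single partition by exact value
ALTER TABLE student DROP IF EXISTS PARTITION (dt = '2021-07-27');

-- For OBS tables, drop partitions matching a filter condition
ALTER TABLE student DROP IF EXISTS PARTITION (dt < '2021-01-01');
```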
Step 2: Configure the OBS Bucket File

To create an OBS table, upload data to the OBS bucket directory. Use the following sample data to create the testdata.csv file and upload it to an OBS bucket:

12,Michael
27,Andy
30,Justin

Log in to the OBS console.
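Once testdata.csv is in the bucket, an OBS table can be created over it. The sketch below uses standard Spark SQL datasource syntax; the bucket path and the column definitions (inferred from the two-column sample data) are assumptions, not taken from the original:

```sql
-- Create an OBS table over the uploaded CSV file
-- (bucket name, path, and column schema are illustrative assumptions)
CREATE TABLE testtable (id INT, name STRING)
USING csv
OPTIONS (path 'obs://obs-bucket-name/testdata.csv');
```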
The procedure is as follows: Export the DLI table data in region 1 to the user-defined OBS bucket. For details, see Exporting Data from DLI to OBS. Use the OBS cross-region replication function to replicate data to the OBS bucket in region 2.
You can use this API to import data from OBS to a DLI table.
Symptom: Checkpointing was enabled when the Flink job was created, and an OBS bucket for storing checkpoints was specified. It is unclear how to restore the Flink job from a specific checkpoint after manually stopping it.
consisting of letters, numbers, and underscores (_)
bucket_name: OBS bucket name
tbl_path: Storage location of the Delta table in the OBS bucket
constraint_name: Constraint name

Required Permissions

SQL permissions: Table 2 Permissions required for executing DROP CONSTRAINT
Permission
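A hedged sketch of the DROP CONSTRAINT statement these parameters belong to, using standard Delta Lake syntax; bucket_name, tbl_path, and constraint_name are the placeholders defined above:

```sql
-- Drop a named constraint from a Delta table stored in OBS
-- (bucket_name, tbl_path, and constraint_name are placeholders)
ALTER TABLE delta.`obs://bucket_name/tbl_path` DROP CONSTRAINT constraint_name;
```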
Overview Importing Data to OBS DLI enables direct access to data stored in OBS for query and analysis, eliminating the need for data migration. To begin using DLI for data analysis, just import your local data into OBS.
Related Services OBS OBS works as the data source and data storage system for DLI, and delivers the following capabilities: Data source: DLI provides an API for you to import data from corresponding OBS paths to DLI tables. For details about the API, see Importing Data.
What Are the Differences Between DLI Tables and OBS Tables? DLI tables store data within the DLI service, keeping you unaware of the storage path. OBS tables store data in your OBS buckets, allowing you to manage source data files.
Example 1: Use datagen to randomly generate data and write it into the fileName directory in the OBS bucket bucketName. The file generation time is irrelevant to the checkpoint.
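A minimal Flink SQL sketch of such a job, assuming the bucketName/fileName placeholders from the text; the table names, schema, and CSV format are illustrative assumptions, not the original example:

```sql
-- Source: datagen randomly generates rows (schema is an assumption)
CREATE TABLE orders (
  name STRING,
  num  INT
) WITH (
  'connector' = 'datagen',
  'rows-per-second' = '1'
);

-- Sink: filesystem connector writing to the OBS directory
-- (bucketName and fileName are placeholders from the text)
CREATE TABLE sink_obs (
  name STRING,
  num  INT
) WITH (
  'connector' = 'filesystem',
  'path' = 'obs://bucketName/fileName',
  'format' = 'csv'
);

INSERT INTO sink_obs SELECT name, num FROM orders;
```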
consisting of letters, numbers, and underscores (_)
table_name: Name of the table in the database, consisting of letters, numbers, and underscores (_)
bucket_name: OBS bucket name
tbl_path: Storage location of the Delta table in the OBS bucket
using: Parameter delta, defines and creates
To set the OBS bucket for storing the job logs, specify a bucket for OBS Bucket. If the selected OBS bucket is not authorized, click Authorize. The logs are saved in the following path: Bucket name/jobs/logs/Directory starting with the job ID.
Name of the table in the database, consisting of letters, numbers, and underscores (_)
bucket_name: OBS bucket name
tbl_path: Storage location of the Delta table in the OBS bucket
column: Target column to be updated
EXPRESSION: Expression of the source table column to be updated in the
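A hedged sketch of the UPDATE statement these parameters describe, using standard Delta Lake syntax; the column name, expression, and WHERE condition are illustrative, and bucket_name/tbl_path are the placeholders defined above:

```sql
-- Update a column of a Delta table stored in OBS
-- (bucket_name and tbl_path are placeholders; column and values are illustrative)
UPDATE delta.`obs://bucket_name/tbl_path`
SET price = price * 0.9
WHERE category = 'books';
```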