Parameter            Mandatory  Type    Description
…                    …          …       … path of the krb5 configuration file
keytab               No         String  OBS path of the keytab configuration file
truststore_location  No         String  OBS path of the truststore configuration file
truststore_password  No         String  Password of the truststore configuration file
keystore_location    No         String  OBS path of the keystore configuration file
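For illustration only, here is a minimal sketch of how these options might be supplied in the WITH clause of a Flink SQL source table. The connector, topic, and broker settings are placeholders and are not taken from this page; only the four option names come from the table above.

  CREATE TABLE kafka_source (
    id INT,
    name STRING
  ) WITH (
    'connector' = 'kafka',                                        -- placeholder connector setting
    'topic' = 'topic0',                                           -- placeholder topic
    'properties.bootstrap.servers' = 'broker0:9093',              -- placeholder broker address
    'keytab' = 'obs://bucket0/conf/user.keytab',                  -- OBS path of the keytab configuration file
    'truststore_location' = 'obs://bucket0/conf/truststore.jks',  -- OBS path of the truststore configuration file
    'truststore_password' = 'xxxx',                               -- password of the truststore configuration file
    'keystore_location' = 'obs://bucket0/conf/keystore.jks'       -- OBS path of the keystore configuration file
  );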
Parameter        Description
table_name       Name of the table in the database, consisting of letters, numbers, and underscores (_)
bucket_name      OBS bucket name
tbl_path         Storage location of the Delta table in the OBS bucket
constraint_name  Constraint name
boolExpression   Constraint expression

Required Permissions
SQL permissions
Table
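As a sketch of how the parameters above fit together, the following uses standard Delta Lake ADD CONSTRAINT syntax; the constraint name and CHECK expression are illustrative examples, not values from this page.

  -- Add a CHECK constraint to a Delta table addressed by its OBS path
  ALTER TABLE delta.`obs://bucket_name/tbl_path`
  ADD CONSTRAINT id_not_null CHECK (id IS NOT NULL);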
No.  Task                  Console      Description
…    …                     OBS console  Upload the UDAF Jar file to an OBS path.
5    Create a DLI package  DLI console  Select the UDAF Jar file that has been uploaded to OBS for management.
6    Create a UDAF on DLI  …            …
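A minimal sketch of step 6, assuming the Spark SQL CREATE FUNCTION ... USING JAR form; the function name, class name, and OBS path below are hypothetical.

  -- Register the UDAF from the JAR file uploaded to OBS (names are placeholders)
  CREATE FUNCTION my_avg AS 'com.example.MyAverageUDAF'
  USING JAR 'obs://bucket0/udf/udaf.jar';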
Why Do I Get "ResponseCode: 403" and "ResponseStatus: Forbidden" Errors When a Spark Job Accesses OBS Data?
Why Do I Encounter the Error "verifyBucketExists on XXXX: status [403]" When Using a Spark Job to Access an OBS Bucket That I Have Permission to Access?
Currently, the multiversion function supports only OBS tables created using the Hive syntax. For details about the syntax for creating a table, see Creating an OBS Table Using the Hive Syntax.
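As an illustration, an OBS table of the Hive-syntax kind this function supports can be created as follows. The table name, columns, and OBS path are placeholders, and the multiversion property name is an assumption, not confirmed by this page.

  -- Create an OBS table using the Hive syntax (STORED AS ... LOCATION ...)
  CREATE TABLE IF NOT EXISTS page_views (user_id STRING, url STRING)
  STORED AS PARQUET
  LOCATION 'obs://bucket0/db0/page_views';

  -- Assumed property name for enabling multiversion on the table
  ALTER TABLE page_views SET TBLPROPERTIES ('dli.multi.version.enable' = 'true');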
Managing Table Resources on the DLI Console
Configuring Table Permissions on the DLI Console
Deleting a Table on the DLI Console
Changing the Table Owner on the DLI Console
Importing OBS Data to DLI
Exporting DLI Table Data to OBS
Previewing Table Data on the DLI Console
Spark SQL job development: Using Spark SQL Jobs to Analyze OBS Data
Use a Spark SQL job to create OBS tables, and import, insert, and query OBS table data.
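A minimal end-to-end sketch of such a job; the table name, columns, and OBS path are placeholders.

  -- Create an OBS table, insert data, and query it
  CREATE TABLE page_views (user_id STRING, url STRING)
  USING parquet
  OPTIONS (path 'obs://bucket0/db0/page_views');

  INSERT INTO page_views VALUES ('u1', '/index.html');

  SELECT url, count(*) AS pv FROM page_views GROUP BY url;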
There are the following ways to manage JAR files:
Upload packages to OBS: Upload JAR packages to an OBS bucket in advance and select the corresponding OBS path.
More information:
Exporting Data to OBS Using SQL Statements
Importing Data from OBS to a DLI Table Using SQL Statements

Encrypting Data at Rest
To enhance user data security, DLI allows you to store data tables using encrypted OBS buckets.
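The DLI-specific export and import statements are covered in the linked topics. As a generic Spark SQL sketch of the same idea (not DLI's dedicated syntax), data can be written to and read back from an OBS path like this; the paths and format are placeholders.

  -- Export query results to an OBS directory as CSV
  INSERT OVERWRITE DIRECTORY 'obs://bucket0/export/page_views'
  USING csv
  SELECT * FROM page_views;

  -- Read the exported files back through an external table
  CREATE TABLE page_views_import
  USING csv
  OPTIONS (path 'obs://bucket0/export/page_views');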
If the function is disabled, running logs will not be dumped to an OBS bucket. In this case, perform step 2 to save job logs: on the job running page, select Save Job Log and specify an OBS bucket for storing the logs. Then click Start to run the job again.
Can I Delete a Row of Data from an OBS Table or DLI Table?
Deleting a row of data from an OBS table or DLI table is not allowed.
`obs://bucket0/db0/delta_table1` SHALLOW CLONE delta_table0 VERSION AS OF 10;

System Response
Displays whether the task is successfully executed in the execution history or job list.
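The clone statement above is truncated at its beginning. Assuming Delta's path-based table naming, a complete statement of this form might read:

  -- Shallow-clone version 10 of delta_table0 into a table stored at the given OBS path
  CREATE TABLE delta.`obs://bucket0/db0/delta_table1`
  SHALLOW CLONE delta_table0 VERSION AS OF 10;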
SQL Jobs
Using Spark SQL Jobs to Analyze OBS Data
Developing a DLI SQL Job in DataArts Studio
Calling UDFs in Spark SQL Jobs
Calling UDTFs in Spark SQL Jobs
Calling UDAFs in Spark SQL Jobs
OBS Path: Select the OBS path where the egg package is stored. Set Group and Group Name as you need. Click OK.
On the Spark job editing page where the error is reported, choose the uploaded egg package from the Python File Dependencies drop-down list and run the Spark job again.
There are two ways to manage packages:
(Recommended) Upload packages to OBS: Upload JAR packages to an OBS bucket in advance and select the OBS path when configuring a job.
Create a DLI package: Note that the DLI package function will soon be discontinued.
If you select Enable Checkpointing, you also need to set OBS Bucket. OBS Bucket: Select an OBS bucket to store your checkpoints. If the OBS bucket you selected is unauthorized, click Authorize.
Flink Jobs
Stream Ecosystem
Flink OpenSource SQL Jobs
Flink Jar Job Examples
Writing Data to OBS Using Flink Jar
Using Flink Jar to Connect to Kafka that Uses SASL_SSL Authentication
Using Flink Jar to Read and Write Data from and to DIS
Flink Job Agencies
-----------------------------------------------------------------------------
tableName:page_views
owner:admintest
location:obs
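Output of this shape is typically produced by a Hive-style DESCRIBE EXTENDED statement. The exact statement is not shown in the source, so the following is an assumption based on the table name in the listing.

  -- Assumed statement that yields the detailed table information above
  DESCRIBE EXTENDED page_views;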