Select Save Job Log, and specify the OBS bucket for saving job logs. Change the values of the parameters in bold as needed in the following script.
Driver File Path (mandatory): The OBS path where the driver file is located. Download the .jar driver file from the corresponding official website and upload it to OBS. MySQL driver: download it from https://downloads.mysql.com/archives/c-j/.
The OBS server compares this MD5 value with the MD5 value it calculates from the object data. If the two values do not match, the upload fails and an HTTP 400 error is returned. If no MD5 value is specified, the OBS server skips MD5 verification.
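The value carried in the Content-MD5 request header is the Base64-encoded 128-bit MD5 digest of the object body. A minimal sketch of computing it with only the JDK standard library (the class and method names here are illustrative, not part of any OBS SDK):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

public class ContentMd5Example {
    // Compute the Base64-encoded MD5 digest of the object body,
    // which is the value the server compares during upload verification.
    static String contentMd5(byte[] body) throws Exception {
        MessageDigest md5 = MessageDigest.getInstance("MD5");
        return Base64.getEncoder().encodeToString(md5.digest(body));
    }

    public static void main(String[] args) throws Exception {
        byte[] body = "hello".getBytes(StandardCharsets.UTF_8);
        // Prints XUFAKrxLKna5cZ2REBfFkg== (Base64 of the MD5 of "hello")
        System.out.println(contentMd5(body));
    }
}
```

If the digest sent in the header was computed over different bytes than those actually uploaded, the server-side comparison fails and the request is rejected with HTTP 400.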
OBS: indicates that backup files are stored in OBS.
Insert data:
sparkSession.sql("insert into testhbase values('95274','abc','Hongkong')");
Query data:
sparkSession.sql("select * from testhbase").show();
Submitting a Spark job
Generate a JAR file based on the code file and upload the JAR file to the OBS bucket.
Compliance audit team: Cloud Trace Service (CTS), Log Tank Service (LTS), Config, and Object Storage Service (OBS).
Data platform: centrally deploy big data platforms and collect service data from other accounts to store, process, and analyze on the data platforms.
obs:object:PutObject
POST /v1/{project_id}/app-warehouse/bucket-and-acl/create
workspace:appWarehouse:createBucketOrAcl
obs:bucket:GetBucketAcl
obs:bucket:HeadBucket
obs:bucket:PutBucketAcl
obs:bucket:PutReplicationConfiguration
obs:bucket:CreateBucket
obs:bucket:PutBucketCORS
GET
For details, see Introduction to OBS Access Control, IAM Custom Policies, and Creating a Custom Bucket Policy. The mapping between OBS regions and endpoints must comply with what is listed in Regions and Endpoints.
Table 3 BackupFilesBody
file_source (String): Data source, which can be an OBS bucket or a backup record.
bucket_name (String): OBS bucket name.
OBS Path: Find the info.txt file in the created OBS parallel file system and click Yes.
HDFS Path: Select an HDFS path, for example, /tmp/test, and click Yes.
Click OK and wait until the data file is imported.
Action (Array of strings): Specifies OBS access permissions.
Resource (Array of strings): Specifies the OBS object.
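In a custom bucket policy these two fields appear as JSON string arrays. A minimal sketch, assuming a hypothetical bucket named example-bucket (the bucket name, the object path, and the specific actions shown are placeholder assumptions, not taken from the original text):

```json
{
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["GetObject", "PutObject"],
      "Resource": ["example-bucket/*"]
    }
  ]
}
```

Each entry in Action names one permitted operation, and each entry in Resource names the bucket or objects the statement applies to.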
Create OBS Bucket: If this option is enabled, an OBS bucket is automatically created with the name you enter. If it is disabled, select an existing OBS bucket.
OBS Bucket: Create an OBS bucket or select an existing one.
ALL,on(),off(GUC,SLRU,MEM_CTL,AUTOVAC,CACHE,ADIO,SSL,GDS,TBLSPC,WLM,OBS,EXECUTOR,VEC_EXECUTOR,STREAM,LLVM,OPT,OPT_REWRITE,OPT_JOIN,OPT_AGG,OPT_SUBPLAN,OPT_SETOP,OPT_SKEW,UDF,COOP_ANALYZE,WLMCP,ACCELERATE,PARQUET,PLANHINT,SNAPSHOT,XACT,HANDLE,CLOG,EC,REMOTE,CN_RETRY,PLSQL,TEXTSEARCH
ALL,on(),off(GUC,SLRU,MEM_CTL,AUTOVAC,CACHE,ADIO,SSL,GDS,TBLSPC,WLM,OBS,EXECUTOR,VEC_EXECUTOR,STREAM,OPT,OPT_REWRITE,OPT_JOIN,OPT_AGG,OPT_SUBPLAN,OPT_SETOP,OPT_SKEW,UDF,COOP_ANALYZE,WLMCP,ACCELERATE,PARQUET,PLANHINT,SNAPSHOT,XACT,HANDLE,CLOG,EC,REMOTE,CN_RETRY,PLSQL,TEXTSEARCH,SEQ
Object Storage Service (OBS): If the component to be deployed comes from a software package stored in OBS, you must be assigned the OBS ReadOnlyAccess permission.
OBS Path: Find the product_info.txt file in the created OBS bucket and click Yes.
HDFS Path: Select /user/hive/warehouse/demo.db/product_info/ and click Yes.
Click OK to import the product_info table data.
Create an ORC table and import data to the table.