Prepare an OBS bucket to store the generated metadata. The OBS bucket must be in the same region as the CDM cluster in the DataArts Studio instance, and the enterprise project of the OBS bucket must be the same as that of the CDM cluster.
API: POST /v1/{project_id}/app-warehouse/bucket-and-acl/create
Action: workspace:appWarehouse:createBucketOrAcl
Dependent OBS permissions: obs:object:PutObject, obs:bucket:GetBucketAcl, obs:bucket:HeadBucket, obs:bucket:PutBucketAcl, obs:bucket:PutReplicationConfiguration, obs:bucket:CreateBucket, obs:bucket:PutBucketCORS
Table 6 ObsObjInfo
- bucket (String): OBS bucket name.
- location (String): Region where the OBS bucket is located. It must be the same as the region where MPC is deployed.
- object (String): OBS object path, which complies with the OBS Object definition.
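The ObsObjInfo structure in Table 6 can be sketched as a plain dictionary when building a request body by hand. This is a minimal illustration only; the bucket name, region, and object path below are hypothetical placeholders, not values from this document.

```python
def make_obs_obj_info(bucket: str, location: str, object_path: str) -> dict:
    """Build an ObsObjInfo-style structure as described in Table 6:
    bucket name, bucket region, and object path."""
    return {
        "bucket": bucket,        # OBS bucket name
        "location": location,    # region; must match the MPC deployment region
        "object": object_path,   # OBS object path
    }

# Hypothetical example values:
info = make_obs_obj_info("example-bucket", "ap-southeast-1", "input/video.mp4")
print(info)
```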
The OBS server compares this MD5 value with the MD5 value it calculates from the received object data. If the two values do not match, the upload fails and an HTTP 400 error is returned. If no MD5 value is specified, the OBS server skips MD5 verification.
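The MD5 value described above is conventionally supplied as the base64-encoded MD5 digest of the object data (the Content-MD5 header form). A minimal sketch of computing it, assuming that header format:

```python
import base64
import hashlib

def content_md5(data: bytes) -> str:
    """Return the base64-encoded MD5 digest of the object data.

    The server recomputes the MD5 of the data it receives and rejects
    the upload (HTTP 400) if it differs from the value supplied.
    """
    return base64.b64encode(hashlib.md5(data).digest()).decode("ascii")

print(content_md5(b"hello obs"))
```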
Compliance audit team: Cloud Trace Service (CTS), Log Tank Service (LTS), Config, and Object Storage Service (OBS).
Data platform: centrally deploys big data platforms and collects service data from other accounts to store, process, and analyze it on those platforms.
For details about how to obtain the OBS file URL and temporary authorization URL, see Configuring the Access Permission of OBS. The region of OBS must be the same as that of the requested service.
Insert data:
sparkSession.sql("insert into testhbase values('95274','abc','Hongkong')");
Query data:
sparkSession.sql("select * from testhbase").show();
Submitting a Spark job: generate a JAR file from the code file and upload the JAR file to the OBS bucket.
Select Save Job Log, and specify the OBS bucket for saving job logs. Change the values of the parameters in bold as needed in the following script.
Click the name of the corresponding Flink job, choose Run Log, click OBS Bucket, and locate the folder of the log you want to view according to the date.
Data dump: depends on OBS OperateAccess.
CES ReadOnlyAccessPolicy: read-only permissions for viewing data on Cloud Eye (system-defined policy). Cloud Eye monitoring involves querying resources of other cloud services.
OBS: indicates that backup files are stored in OBS.
Driver File Path (mandatory): specifies the OBS path where the driver file is located. Download a .jar driver file from the corresponding official website and upload it to OBS. MySQL driver: download it from https://downloads.mysql.com/archives/c-j/.
Authorization objects: catalogs, databases, tables, columns, functions, and OBS paths.
Operation Type: the access permission that the authorization entity has on the authorization object. Different authorization objects support different operations. For details, see Table 2.
Using this option requires that target OBS buckets be whitelisted for the CRC64 feature. Migrate Metadata - Determine whether to migrate metadata. If you select this option, object metadata will be migrated.
For details, see Introduction to OBS Access Control, IAM Custom Policies, and Creating a Custom Bucket Policy. The mapping between OBS regions and endpoints must comply with what is listed in Regions and Endpoints.
Action (Array of strings): Specifies OBS access permissions.
Resource (Array of strings): Specifies the OBS object.
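The Action and Resource fields above can be sketched as arrays in a bucket-policy statement. The statement below is an illustrative assumption: the action names and the resource path ("example-bucket/*") are hypothetical examples, not values taken from this document.

```python
import json

# Hypothetical bucket-policy statement: Action and Resource are both
# arrays of strings, as described in the field table above.
statement = {
    "Effect": "Allow",
    "Action": ["obs:object:GetObject", "obs:object:PutObject"],  # OBS access permissions
    "Resource": ["example-bucket/*"],                            # OBS objects covered
}
print(json.dumps(statement, indent=2))
```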