Figure 3 Viewing slow query log details
Click the export icon in the upper right corner of the Details area to export slow query log details to a specified OBS bucket. A maximum of 100,000 records can be exported.
This tutorial uses two different OBS buckets. The function you create must be in the same region (default region) as the OBS buckets.
Procedure
Create two buckets on the OBS console.
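The bucket creation step can also be scripted instead of using the console. A minimal sketch, assuming the esdk-obs-python SDK; the endpoint, credentials, and bucket names below are placeholders, not values from the tutorial:

```python
# Sketch: create the two buckets used by the tutorial with esdk-obs-python.
# Endpoint, credentials, and bucket names are placeholders.
from obs import ObsClient

client = ObsClient(
    access_key_id='YOUR_AK',
    secret_access_key='YOUR_SK',
    server='https://obs.ap-southeast-1.myhuaweicloud.com'  # same region as the function
)

for name in ('example-input-bucket', 'example-output-bucket'):
    resp = client.createBucket(name, location='ap-southeast-1')
    if resp.status < 300:
        print('created', name)
    else:
        print('failed:', resp.errorCode, resp.errorMessage)

client.close()
```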
Table 6 ObsObjInfo
Parameter | Type | Description
bucket | String | OBS bucket name
location | String | Region where the OBS bucket is located. It must be the same as the region where MPC is deployed.
object | String | OBS object path, which complies with the OBS Object definition.
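For illustration only, an ObsObjInfo value built from these three parameters might look like the following; the bucket name, region, and object path are made-up examples:

```python
# Illustrative ObsObjInfo payload; all values are placeholders.
obs_obj_info = {
    "bucket": "example-mpc-input",      # OBS bucket name
    "location": "ap-southeast-1",       # must match the region where MPC is deployed
    "object": "input/source_video.mp4"  # OBS object path (object key)
}
```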
On the OBS console, you can configure lifecycle rules for a bucket to periodically delete objects in it or change object storage classes. For details, see Configuring a Lifecycle Rule.
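Lifecycle rules can also be configured through the SDK rather than the console. A minimal sketch, assuming esdk-obs-python's setBucketLifecycle interface; the bucket name and prefix are hypothetical:

```python
# Sketch: expire objects under a prefix after 30 days.
# Bucket name and prefix are placeholders; verify the rule afterwards.
from obs import ObsClient, Lifecycle, Rule, Expiration

client = ObsClient(access_key_id='YOUR_AK',
                   secret_access_key='YOUR_SK',
                   server='https://obs.ap-southeast-1.myhuaweicloud.com')

rule = Rule(id='expire-old-logs', prefix='logs/', status='Enabled',
            expiration=Expiration(days=30))
resp = client.setBucketLifecycle('example-bucket', Lifecycle(rule=[rule]))
print(resp.status, resp.errorCode)

client.close()
```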
The Tenant Administrator permission is required to access data from OBS when executing Flink jobs on DLI, for example, for obtaining OBS data sources, dumping logs (including bucket authorization), enabling checkpointing, and importing and exporting jobs.
Table 2 System permissions of other services on which DLV depends
Service | Mandatory | Policy Name | Role Name
OBS | No | OBS OperateAccess | OBS Buckets Viewer
DWS | No | DWS ReadOnlyAccess | DWS Administrator
DLI | No | - | DLI Service Admin
MRS | No | MRS ReadOnlyAccess | MRS Administrator
RDS | No | RDS ReadOnlyAccess
You have created an OBS bucket for storing recording files and authorized SparkRTC to access the OBS bucket.
General Process
1. Obtain a token (see the sketch after this list).
2. Create an application.
3. Create a recording template.
4. Configure a recording callback.
5. Join a room.
6. Create a mixed stream recording job.
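The "Obtain a token" step typically calls the IAM token API. A hedged sketch using the requests library; the IAM endpoint region, user, domain, and project names are placeholders:

```python
# Sketch: obtain an IAM user token (returned in the X-Subject-Token header).
# All identity values below are placeholders.
import requests

IAM_ENDPOINT = 'https://iam.ap-southeast-1.myhuaweicloud.com'
body = {
    "auth": {
        "identity": {
            "methods": ["password"],
            "password": {
                "user": {
                    "name": "example-user",
                    "password": "example-password",
                    "domain": {"name": "example-domain"}
                }
            }
        },
        "scope": {"project": {"name": "ap-southeast-1"}}
    }
}

resp = requests.post(f'{IAM_ENDPOINT}/v3/auth/tokens', json=body)
resp.raise_for_status()
token = resp.headers['X-Subject-Token']  # pass as X-Auth-Token in later API calls
```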
Feature 1: Integrating with OBS's obsutil tool (KooCLI 3.4.6). KooCLI has integrated the obsutil tool of Object Storage Service (OBS). You can run hcloud obs to manage your data in OBS.
20 TB; RDS MySQL 2x4 cores | 16G | 500 GB; DWS optional; VPN 720 hours; Direct Connect 1GE; ELB 5M/720 hours; AS free; VPC free.
Medical Info Access (Secure Medical Information Access Cloud Service): ECS 6x2 cores | 4G | 128 GB; OBS 20 TB; RDS MySQL 2x4 cores | 16G | 500 GB; DEW.
You can select an HDFS or OBS path.
Output Data Path (mandatory: No): Set the output data path. You can select an HDFS or OBS path.
1. Mount the created OBS volume.
2. Click Create.
3. Wait until the job execution is complete.
4. On the OBS page, you can view the execution results, which are shown as images.
Provide the correct access key ID.
403 Forbidden | RequestTimeTooSkewed | There was a large time offset between the OBS server time and the time when the client initiated the request. For security purposes, OBS verifies the time offset between the client and server.
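When calling OBS through the SDK, these error codes surface on the response object. A minimal sketch of checking for them, assuming esdk-obs-python's response fields (status, errorCode, errorMessage) and placeholder bucket and object names:

```python
# Sketch: inspect an OBS SDK response for errors such as RequestTimeTooSkewed.
# Bucket and object names are placeholders.
from obs import ObsClient

client = ObsClient(access_key_id='YOUR_AK',
                   secret_access_key='YOUR_SK',
                   server='https://obs.ap-southeast-1.myhuaweicloud.com')

resp = client.getObjectMetadata('example-bucket', 'example-object')
if resp.status >= 300:
    if resp.errorCode == 'RequestTimeTooSkewed':
        # The local clock drifts too far from the OBS server time; sync the
        # system clock (for example via NTP) and retry.
        print('Clock skew too large:', resp.errorMessage)
    else:
        print('Request failed:', resp.status, resp.errorCode, resp.errorMessage)

client.close()
```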
OBS: indicates that backup files are stored in OBS.
NOTE: This constraint applies only to OBS, FTP, and MRS HDFS data sources. For example, if two OBS tasks and two FTP tasks are concurrently executed, the total size of files to be collected from the four tasks cannot exceed 800 MB.
- The network connection to the OBS server breaks often.
- Sizes of files to be uploaded are uncertain.
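These bullets read like the scenario list for resumable (checkpointed) uploads. A minimal sketch, assuming esdk-obs-python's uploadFile with checkpointing enabled; the bucket, object key, and file path are placeholders:

```python
# Sketch: resumable upload with checkpointing, which tolerates broken
# connections by resuming from the last finished part.
from obs import ObsClient

client = ObsClient(access_key_id='YOUR_AK',
                   secret_access_key='YOUR_SK',
                   server='https://obs.ap-southeast-1.myhuaweicloud.com')

resp = client.uploadFile(
    'example-bucket', 'backups/large-file.dat',
    uploadFile='/data/large-file.dat',
    partSize=10 * 1024 * 1024,  # 10 MB parts
    taskNum=4,                  # concurrent part uploads
    enableCheckpoint=True       # record progress so a retry can resume
)
print(resp.status, resp.errorCode)

client.close()
```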
Options:
- inline: inline code
- zip: ZIP file
- obs: function code stored in an OBS bucket
- jar: JAR file, mainly for Java functions
- Custom-Image-Swr: the function code comes from an SWR custom image
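For illustration, a function-definition fragment using the obs option might carry the bucket link alongside the type. The field names code_type and code_url below are assumptions for a FunctionGraph-style request body, not taken from this excerpt, and the URL is hypothetical:

```python
# Hypothetical fragment of a function creation request body showing the
# "obs" code source; field names and the URL are illustrative only.
function_code_config = {
    "code_type": "obs",
    "code_url": "https://example-bucket.obs.ap-southeast-1.myhuaweicloud.com/functions/handler.zip"
}
```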
Target location: Huawei Cloud OBS
- An object list file cannot exceed 30 MB.
- An object list file must be a .txt file, and its Content-Type metadata must be text/plain.
- An object list file must be encoded in UTF-8 without BOM.
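A small sketch of producing and uploading an object list file that satisfies these constraints, assuming esdk-obs-python; the bucket, object key, and list entries are placeholders (follow the object list format required by the migration service):

```python
# Sketch: write an object list file (UTF-8 without BOM) and upload it with
# Content-Type text/plain. Names and entries are placeholders.
from obs import ObsClient, PutObjectHeader

entries = ['src-bucket/path/a.jpg', 'src-bucket/path/b.jpg']
with open('object_list.txt', 'w', encoding='utf-8', newline='\n') as f:
    f.write('\n'.join(entries) + '\n')

client = ObsClient(access_key_id='YOUR_AK',
                   secret_access_key='YOUR_SK',
                   server='https://obs.ap-southeast-1.myhuaweicloud.com')

resp = client.putFile('example-list-bucket', 'oms/object_list.txt',
                      'object_list.txt',
                      headers=PutObjectHeader(contentType='text/plain'))
print(resp.status, resp.errorCode)

client.close()
```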
bucket where snapshots are stored.
basePath | String | Storage path of the snapshot in the OBS bucket.
agency | String | Agency used to access OBS buckets.
enable | String | Whether to enable automatic snapshot creation.
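For illustration, an automatic snapshot setting built from these parameters might look like the following; the bucket, path, and agency values are placeholders:

```python
# Illustrative payload for the automatic snapshot setting described above.
# Bucket, path, and agency names are placeholders.
snapshot_setting = {
    "bucket": "example-snapshot-bucket",  # OBS bucket where snapshots are stored
    "basePath": "css/snapshots",          # storage path of snapshots in the bucket
    "agency": "css_obs_agency",           # agency used to access the OBS bucket
    "enable": "true"                      # whether automatic snapshot creation is enabled
}
```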
is not supported for resource packages that include multiple usage types, even if the services involved are supported as listed above, except for newly purchased or renewed OBS packages).
Currently, only the URL of an OBS bucket on Huawei Cloud is supported, and FRS must have permission to read data in the OBS bucket.
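If the bucket is not otherwise authorized, one general way to hand a service a readable OBS URL is a temporary signed URL. A sketch assuming esdk-obs-python's createSignedUrl, with placeholder names; this is a general OBS technique, not an FRS-specific requirement:

```python
# Sketch: generate a time-limited signed URL for an object so it can be read
# without extra bucket authorization. Names are placeholders.
from obs import ObsClient

client = ObsClient(access_key_id='YOUR_AK',
                   secret_access_key='YOUR_SK',
                   server='https://obs.ap-southeast-1.myhuaweicloud.com')

signed = client.createSignedUrl('GET', 'example-bucket', 'faces/photo.jpg',
                                expires=3600)  # valid for one hour
print(signed.signedUrl)  # pass this URL to the service that needs read access

client.close()
```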