After the permission is granted, IAM users can access OBS and SWR from a notebook instance.
To obtain the project ID, see Obtaining a Project ID.
graph_name | Yes | String | Graph name
Request Parameters
Table 2 Request body parameters
Parameter | Mandatory | Type | Description
scriptPath | Yes | String | Path of the DSL algorithm file that the user has written.
obsParameters | Yes | Object | OBS
To obtain the project ID, see Obtaining a Project ID.
graph_name | Yes | String | Graph name
job_id | Yes | String | Job ID of the algorithm task in the response result
Request Parameters
Table 2 Request body parameters
Parameter | Mandatory | Type | Description
exportPath | Yes | String | Dump path
obsParameters | Yes | String | OBS
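For illustration, the sketch below assembles the two request bodies above as Python dictionaries. The OBS paths are invented placeholders, and the contents of obsParameters are not described in these snippets, so they are left as assumptions to be filled in from the full API reference.

```python
# Illustrative request bodies only; the paths are placeholders and the
# obsParameters schema is not given above, so it is left empty here.

# Body for running a user-written DSL algorithm file (first Table 2)
run_dsl_body = {
    "scriptPath": "obs://my-bucket/algorithms/demo.dsl",  # hypothetical OBS path
    "obsParameters": {},  # Object; fill in per the full API reference
}

# Body for dumping an algorithm task result to OBS (second Table 2)
export_body = {
    "exportPath": "obs://my-bucket/exports/",  # hypothetical dump path
    "obsParameters": "",  # String; fill in per the full API reference
}
```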
Why Do I Fail to Create a Table in the Specified Location on OBS After Logging In to Spark Beeline?
Spark Shuffle Exception Handling
Why Does the Cluster Port Fail to Connect When a Client Outside the Cluster Is Installed or Used?
ae84c61ea9ee
_service_type | Service for which access logs are collected. Fixed value: DNS | DNS
category | Log category. Fixed value: LTS | LTS
collectTime | LTS log collection time | Integer | 1704158708902
Configuring Log Transfer
If you want to analyze access logs later, transfer the logs to OBS
Only OBS resource packages can be purchased on a monthly basis but renewed on a yearly basis.
Table 1 Process of building a third-party model dataset
Procedure | Step | Description | Reference
Importing data to the Pangu platform | Creating an import task | Import data stored in OBS or local data into the platform for centralized management, facilitating subsequent processing or publishing
Possible values: OBS, DLI
Data Format | Yes | Format of data. This parameter is available only when Data Location is set to OBS.
Enumeration values: IAM, SAML, LDAP, LOCAL, AGENTTENANT, OTHER
description | String | Database description.
location | String | Database directory, for example, obs://location/uri/.
data_statistic_enable | Boolean | Whether to enable data overview statistics.
For enterprise project authorization, if OBS permissions are assigned, they take effect about 15 to 30 minutes after the authorization is complete.
Creating a datasource connection: VPC ReadOnlyAccess
Creating yearly/monthly resources: BSS Administrator
Creating a tag: TMS FullAccess and EPS FullAccess
Using OBS for storage: OBS OperateAccess
Creating an agency: Security Administrator
DLI ReadOnlyAccess: Read-only permissions
Figure 17 Embedding completed
Access the created OBS bucket. If the knowledge base file is saved in the upload_files folder, the configuration is successful.
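The same check can also be scripted. Below is a minimal sketch using the OBS Python SDK (esdk-obs-python) that lists the objects under the upload_files/ prefix; the bucket name, endpoint, and credentials are placeholders.

```python
# Minimal sketch: list objects under the upload_files/ prefix to confirm
# the knowledge base file landed in the OBS bucket.
# Assumes the OBS Python SDK (pip install esdk-obs-python); the bucket name,
# endpoint, and credentials below are placeholders.
from obs import ObsClient

client = ObsClient(
    access_key_id="YOUR_AK",
    secret_access_key="YOUR_SK",
    server="https://obs.example-region.myhuaweicloud.com",  # hypothetical endpoint
)
try:
    resp = client.listObjects("my-knowledge-base-bucket", prefix="upload_files/")
    if resp.status < 300:
        for obj in resp.body.contents:
            print(obj.key, obj.size)
    else:
        print("listObjects failed:", resp.errorCode, resp.errorMessage)
finally:
    client.close()
```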
bucket management | Granting or canceling authorization of accessing OBS buckets | 150 times/minute | 300 times/minute
Statistical analysis | Querying the duration of transcoded outputs; Querying the number of recording channels; Querying the number of snapshots; Querying playback profiles
You can upload a software/firmware upgrade package to IoTDA, or use a package file associated with an OBS object, for remote device upgrades. For details, see About OTA Upgrade. Parent topic: API Reference on the Application Side
Importing Metadata from OBS (1.0.0)
POST /v2/{project_id}/graphs/metadata/upload-from-obs
Import metadata from OBS.
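A minimal sketch of calling this URI with Python requests is shown below. The GES endpoint, project ID, token, and request body are placeholders, since only the method and URI are documented here.

```python
# Minimal sketch: call the documented URI with an IAM token.
# The endpoint, project ID, token, and body fields are placeholders;
# only the method and URI come from the reference above.
import requests

endpoint = "https://ges.example-region.myhuaweicloud.com"  # hypothetical endpoint
project_id = "your-project-id"
url = f"{endpoint}/v2/{project_id}/graphs/metadata/upload-from-obs"

headers = {
    "X-Auth-Token": "your-iam-token",
    "Content-Type": "application/json",
}
body = {
    # Request body fields are not shown in this snippet; consult the full
    # API reference for the actual schema before sending a real request.
}

resp = requests.post(url, json=body, headers=headers, timeout=30)
print(resp.status_code, resp.text)
```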
OBS Networking No.
Offline data lake: collects offline data from service systems, processes it into wide tables, uses HDFS or OBS for data storage, and uses Spark SQL as the data processing engine.
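As a sketch of this storage/engine split, the PySpark snippet below registers a wide table whose data sits on OBS and queries it with Spark SQL. It assumes the cluster already has the OBS/Hadoop connector configured; the bucket path and schema are invented for illustration.

```python
# Minimal sketch: a Spark SQL wide table backed by OBS storage.
# Assumes the cluster is configured to resolve obs:// paths (or swap in an
# HDFS path); the bucket path and schema are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("offline-lake-sketch").getOrCreate()

# External table: Spark SQL is the processing engine, OBS is the storage layer.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dwd_orders_wide (
        order_id STRING,
        user_id  STRING,
        amount   DOUBLE,
        dt       STRING
    )
    USING parquet
    PARTITIONED BY (dt)
    LOCATION 'obs://my-lake-bucket/dwd/orders_wide/'
""")

# Example offline processing step: aggregate the wide table by partition.
spark.sql("""
    SELECT dt, COUNT(*) AS orders, SUM(amount) AS total_amount
    FROM dwd_orders_wide
    GROUP BY dt
""").show()
```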
Enable the log dump function and select a dump mode: If you select OBS, all API access logs in the current workspace are dumped to the specified OBS bucket. If you select LTS, you need to create a log group and a log stream on the LTS console in advance.
Total number of OBS transfer times | iotda_obs_forwarding_totalCount
Number of OBS transfer successes | iotda_obs_forwarding_successCount
Number of OBS transfer failures | iotda_obs_forwarding_failedCount
Total number of DMS Kafka transfer times | iotda_dmsKafka_forwarding_totalCount
Number of DMS Kafka transfer