Planning and creating an OBS bucket and importing data: Create an OBS bucket and folders for data storage.
Planning and creating catalogs and databases: Create catalogs and databases on the LakeFormation page and specify the OBS bucket directory.
Table 1 Permissions

Role: System administrator
Policy:
{
  "Version": "1.1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "DataArtsFabric:*:*",
        "obs:bucket:*",
        "obs:object:*"
      ]
    }
  ]
}
Function: With all DataArtsFabric permissions, this role can perform all DataArtsFabric operations.
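Before pasting a custom policy like the one above into the console, it can help to check that it is valid JSON and that its wildcard actions cover the operations you expect. The following is a minimal sketch (not an official tool; the `policy_allows` helper is illustrative, not part of any Huawei Cloud SDK):

```python
import json
from fnmatch import fnmatch

# The administrator policy from Table 1, expressed as a Python dict.
policy = {
    "Version": "1.1",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "DataArtsFabric:*:*",
                "obs:bucket:*",
                "obs:object:*",
            ],
        }
    ],
}

def policy_allows(policy: dict, action: str) -> bool:
    """Return True if any Allow statement matches the action (wildcards allowed)."""
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        if any(fnmatch(action, pattern) for pattern in stmt.get("Action", [])):
            return True
    return False

# The policy must serialize to valid JSON before it can be used in the console.
json.dumps(policy)
print(policy_allows(policy, "obs:bucket:CreateBucket"))
```

Here `"obs:bucket:*"` covers fine-grained actions such as `obs:bucket:CreateBucket`, while unrelated actions (for example, IAM actions) are not matched.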
Planning and Creating an OBS Parallel File System and Importing Data
DataArtsFabric SQL uses OBS to store data. You need to create a parallel file system and folders on the OBS console and import sample data. Log in to the management console.
FABRIC_OBS_POLICY includes the following OBS actions:
- obs:bucket:PutLifecycleConfiguration
- obs:bucket:ListBucketMultipartUploads
- obs:object:GetObject
- obs:bucket:HeadBucket
- obs:bucket:DeleteBucket
- obs:bucket:CreateBucket
- obs:bucket:ListAllMyBuckets
- obs:bucket:ListBucket
- obs:object:PutObject
No Permissions required by
You have enabled LakeFormation and OBS permissions and confirmed the agency. You have a workspace available.
Step 1: Plan and Create an OBS Bucket and Import Data
DataArts Fabric SQL uses OBS to store data.
Robert (algorithm engineer) requires the DataArtsFabricFullPolicy policy and the required OBS permissions. To use model files stored in OBS in DataArts Fabric, the OBS permissions must be granted by the user permission administrator, Tom.
You can select OBS, Image Path, or Other.
Deployment File Path: Path of the inference instance in the code.
Routing Prefix: Routing prefix for inference. The routing prefix of each application must be unique.
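Because each application's routing prefix must be unique, a deployment script might check for collisions before creating endpoints. A minimal sketch (the function and the sample prefixes are hypothetical, not part of the DataArts Fabric API):

```python
def find_duplicate_prefixes(prefixes):
    """Return the set of routing prefixes that appear more than once."""
    seen, duplicates = set(), set()
    for prefix in prefixes:
        # Normalize trailing slashes so "/v1/chat" and "/v1/chat/" collide.
        normalized = prefix.rstrip("/")
        if normalized in seen:
            duplicates.add(normalized)
        seen.add(normalized)
    return duplicates

# Hypothetical prefixes for three applications:
print(find_duplicate_prefixes(["/v1/chat", "/v1/embed", "/v1/chat/"]))
```

An empty result means all prefixes are unique and the deployment can proceed.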
You can view the output results in the OBS bucket path.
Enter basic model information, including the name and description, select the OBS path of the model file, and click Create Now. In the navigation pane, choose Resources and Assets > Inference Endpoint. In the upper right corner of the page, click Create Inference Endpoint.
You have created an OBS bucket and folders for storing models, uploaded model files that meet the requirements, and ensured that the OBS bucket for storing models is in the same region as DataArtsFabric. For details, see Creating an OBS Bucket.
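Model files are referenced by their OBS path, so a quick client-side sanity check of the path format can catch typos before registration. A sketch assuming the standard `obs://bucket/key` URI form (the function name is illustrative, not part of any SDK, and the bucket-name pattern is a simplified subset of the published naming rules):

```python
import re

# Simplified bucket-name rule: 3-63 characters, lowercase letters,
# digits, hyphens, and periods; must start and end with a letter or digit.
_BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_obs_path(path: str) -> bool:
    """Loosely validate an obs://bucket/key model path."""
    if not path.startswith("obs://"):
        return False
    bucket, _, key = path[len("obs://"):].partition("/")
    return bool(_BUCKET_RE.match(bucket)) and bool(key)

print(is_valid_obs_path("obs://my-models/llama/weights.bin"))
```

Note that this only checks the path's shape; whether the bucket exists, is readable, and is in the same region as DataArtsFabric must still be verified on the OBS console.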
Code Directory: Select the OBS directory where the job code is stored.
Ray Main File: Select the main entry Python file of the job running code in the code directory. Ensure that you have selected the main entry file for running the job.
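The Ray main file is an ordinary Python entry script inside the code directory. The skeleton below is an assumed structure, not an official template; the Ray calls are shown as comments so the file's shape is the focus, and the `--input` argument is illustrative:

```python
import argparse

def parse_args(argv=None):
    """Parse job arguments passed to the main entry file."""
    parser = argparse.ArgumentParser(description="Example Ray job entry point")
    parser.add_argument("--input", default="obs://my-bucket/input/",
                        help="OBS path of the input data (illustrative)")
    return parser.parse_args(argv)

def main(argv=None):
    args = parse_args(argv)
    # In a real job you would initialize Ray here, e.g.:
    #   import ray
    #   ray.init()
    # and submit remote tasks that read from args.input.
    print(f"job would read from {args.input}")

if __name__ == "__main__":
    main()
```

Selecting this file as the Ray Main File tells the cluster where execution starts.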
Managing Ray Jobs
Prerequisites
- You have a valid Huawei Cloud account.
- You have at least one workspace available.
- You have at least one Ray cluster available.
- You have at least one job available.
Procedure
Log in to Workspace Management Console. Select the created workspace, click
Creating an Inference Endpoint
Registering a model
You can register a fine-tuned model file stored in OBS as your fine-tuned model on the model management page.
Creating a model
To create a model and specify its OBS file path on the model management page, an IAM user must have the DataArtsFabricFullPolicy and OBS OperateAccess policies.
The image package version must be the same as the version of the selected OBS file package.
Version Description: Description of the version to be created.
Version Type: Currently, only OBS is supported.
Path: OBS path of the version to be created.