Creating an Inference Endpoint
Registering a Model
You can register a fine-tuned model file stored in OBS as your fine-tuning model on the model management page.
All code files that the UDF depends on should be archived into a single compressed package and uploaded to OBS. The storage path of the compressed package is then specified when the function is created.
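As a minimal sketch of this workflow, the snippet below archives a local directory of UDF dependency files and uploads the archive to OBS with the OBS Python SDK (esdk-obs-python). The directory name, bucket name, object key, endpoint, and the credential environment variables are placeholders, not values from this documentation.

```python
# Minimal sketch, assuming the OBS Python SDK (esdk-obs-python) is installed.
# Bucket name, object key, endpoint, and credential variables are placeholders.
import os
import zipfile

from obs import ObsClient


def build_udf_package(src_dir: str, archive_path: str) -> None:
    """Archive every code file the UDF depends on into a single zip file."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _, files in os.walk(src_dir):
            for name in files:
                full_path = os.path.join(root, name)
                zf.write(full_path, arcname=os.path.relpath(full_path, src_dir))


build_udf_package("./udf_src", "udf_deps.zip")

# Upload the archive; the resulting OBS path is what you specify when
# creating the function.
client = ObsClient(
    access_key_id=os.environ["OBS_AK"],
    secret_access_key=os.environ["OBS_SK"],
    server="https://obs.example-region.myhuaweicloud.com",  # placeholder endpoint
)
try:
    resp = client.putFile("my-pfs-bucket", "udf/udf_deps.zip", "udf_deps.zip")
    if resp.status < 300:
        print("Uploaded to obs://my-pfs-bucket/udf/udf_deps.zip")
    else:
        print("Upload failed:", resp.errorMessage)
finally:
    client.close()
```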
Only buckets of the OBS parallel file system can be used to store UDF code packages, and you must grant read permissions to IAM users on LakeFormation.
Parent topic: UDF Development (Python)
Code Directory: Select the OBS directory where the job code is stored.
Ray Main File: Select the main entry Python file of the job code in the code directory. Ensure that you have selected the main entry file for running the job.
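For reference, a minimal Ray main entry file might look like the sketch below. The file name, the remote task, and the workload are illustrative only; ray.init() is assumed to pick up the cluster address from the job environment when the job is submitted to the cluster.

```python
# Illustrative main entry file (for example, main.py) placed in the code
# directory; the remote task below is a placeholder workload.
import ray


@ray.remote
def square(x: int) -> int:
    """A trivial remote task used to verify that the job runs on the cluster."""
    return x * x


def main() -> None:
    # When submitted as a job, ray.init() connects to the existing cluster
    # using the address provided by the job environment.
    ray.init()
    results = ray.get([square.remote(i) for i in range(8)])
    print("squares:", results)


if __name__ == "__main__":
    main()
```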
- Obtain Query Results (Through OBS): getStatementResult. Obtains the result of an asynchronous query; the result set is read from OBS.
- Obtain Query Results (Directly): getStatementResultDirect. Obtains the result of an asynchronous query; the result set is returned directly from the server.
- IMPORTS: Path to the compressed package on OBS that the function depends on at runtime. Only one path, that is, one compressed package, can be specified in the IMPORTS clause.
- STRICT: Specifies that the function always returns NULL if any of its parameters is NULL.
If local disk space becomes insufficient, OBS is used as overflow storage. When this option is set to false, the original behavior of writing directly to the local EVS disk is used.
pysum", imports=["other/util.py"], packages=["numpy"]) You can select optional solution 1 or 2 as needed. ibis-fabric will automatically pack all files that the UDF depends on into a compressed package based on the input parameters of the registration interface and upload it to the OBS
You can select OBS, Image Path, or Other.
Deployment File Path: Path of the inference instance in the code.
Routing Prefix: Routing prefix for inference. The routing prefix of each application must be unique.
Enter basic model information, including the name and description, select the OBS path of the model file, and click Create Now. In the navigation pane, choose Resources and Assets > Inference Endpoint. In the upper right corner of the page, click Create Inference Endpoint.
For managed tables, data files are still stored in a conventional file system (for example, an OBS parallel file system). Users can modify these files without DataArts Fabric SQL being aware of the changes.
This interface method accesses OBS through the Java SDK to obtain and return the result set.
- OVERWRITE: Allows overwriting an external OBS table; the data files in the original directory are deleted.
- table_name: Name of the target table into which data is to be inserted. Range: an existing table name.
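As an illustration only, the snippet below builds an INSERT ... OVERWRITE statement as a plain string. The table names are placeholders, the statement assumes a Hive/Spark-style "INSERT OVERWRITE TABLE ..." form that should be checked against the full DataArts Fabric SQL syntax reference, and how the statement is submitted depends on the client you use.

```python
# Illustrative only: placeholder table names; verify the exact keyword order
# against the DataArts Fabric SQL reference before use.
target_table = "sales_external"   # an existing external OBS table
source_table = "sales_staging"

insert_overwrite_sql = (
    f"INSERT OVERWRITE TABLE {target_table} "
    f"SELECT * FROM {source_table}"
)

# With OVERWRITE, the data files under the target table's directory are
# deleted and replaced by the query result instead of being appended to.
print(insert_overwrite_sql)
```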
Managing Ray Jobs
Prerequisites
- You have a valid Huawei Cloud account.
- You have at least one workspace available.
- You have at least one Ray cluster available.
- You have at least one job available.
Procedure
1. Log in to the Workspace Management Console.
2. Select the created workspace, click