All these functions are implemented by the AI Core. Static AIPP and dynamic AIPP modes are supported. However, the two modes are mutually exclusive. Static AIPP: During model conversion, set the AIPP mode to static and set the AIPP parameters.
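As an illustrative sketch only: with the Ascend ATC tool, static AIPP parameters are supplied in a configuration file at model conversion time. The field names and values below are assumptions based on the common AIPP configuration format and should be checked against the CANN/ATC documentation for your version:

```
aipp_op {
    aipp_mode : static          # static AIPP: preprocessing parameters are fixed at conversion
    input_format : YUV420SP_U8  # format of the raw input image (assumption)
    src_image_size_w : 224      # source image width
    src_image_size_h : 224      # source image height
    csc_switch : true           # enable color space conversion
}
```

Because the mode is static, these values cannot be changed at inference time; switching to dynamic AIPP requires reconverting the model.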
Intelligent analysis: Large AI models automatically interpret data and provide a complete report with data overviews, detailed analyses, and visualized conclusions, replacing manual analysis and document writing.
Using Big Data Capabilities (MRS) in a Workflow
Function
This phase calls MRS for big data cluster computing. It is used for batch data processing and model training.
Application Scenarios
You can use MRS Spark for big data computing in this phase.
Examples
On the Huawei Cloud MRS
Connecting to a Notebook Instance Through VS Code Toolkit
This section describes how to use the ModelArts VS Code Toolkit plug-in to remotely connect to a notebook instance.
Prerequisites
You have downloaded and installed VS Code. For details, see Connecting to a Notebook Instance
Manually Connecting to a Notebook Instance Through VS Code
Supported local IDEs include PyCharm and VS Code. You can use either one to remotely connect to the target notebook instance on ModelArts for running and debugging code. This section describes how to use VS Code
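A manual VS Code connection is typically made through the Remote-SSH extension, which reads the local SSH configuration file. The sketch below is illustrative only: the host alias, address, port, user, and key path are placeholders, and the actual values must be taken from the notebook's remote SSH settings in the ModelArts console:

```
Host modelarts-notebook                  # local alias; any name works
    HostName <notebook-access-address>   # address shown in the notebook's SSH settings
    Port 22                              # use the port shown in the console
    User ma-user                         # notebook login user (assumption)
    IdentityFile ~/.ssh/my-keypair.pem   # key pair bound to the notebook instance
```

After saving this to `~/.ssh/config`, the instance appears as a connection target in the VS Code Remote-SSH host list.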
You have deployed an inference service using the AI Inference Framework add-on by referring to AI Inference Framework Add-on.
Constraints
kagent must be installed immediately after it is started. Ensure that the pods in the cluster can access the public network.
SFS Turbo Security Best Practices
SFS Turbo provides scalable, high-performance file storage that can be used for AI training, AI-generated content (AIGC), autonomous driving, rendering, EDA simulation, and enterprise NAS applications.
In the distributed scenario, this system catalog is provided, but the AI capabilities are unavailable.
Deploying a service for inference
Creating a Custom Image and Using It to Create an AI Application
If you want to use an AI engine that is not supported by ModelArts, create a custom image, import it to ModelArts, and use it to create AI applications.
This parameter does not need to be specified.
description (type: text): model comment.
In addition, Huawei uses its AI technologies and architecture experience to help customers analyze AI service requirements and design the AI platform architecture.
It integrates the concepts of backend as a service (BaaS) and LLMOps, enabling you to quickly build production-grade generative AI applications.
Solution Architecture
This solution helps you quickly deploy the Dify platform.
After the AI platform consulting and planning service is purchased, it cannot be unsubscribed from or refunded.
Creating a Training Job for Automatic Model Tuning
Context
To use ModelArts hyperparameter search, the AI engine must be either pytorch_1.8.0-cuda_10.2-py_3.7-ubuntu_18.04-x86_64 or tensorflow_2.1.0-cuda_10.1-py_3.7-ubuntu_18.04-x86_64, and the hyperparameter to be optimized must
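Hyperparameter search works by launching the training script repeatedly with different candidate values passed as command-line flags. A minimal sketch of how a training script can receive a searched hyperparameter; the flag name `--learning_rate` is illustrative, not a fixed ModelArts API:

```python
import argparse

# The tuner injects candidate hyperparameter values as command-line flags.
# "--learning_rate" is an illustrative name chosen for this sketch.
parser = argparse.ArgumentParser()
parser.add_argument("--learning_rate", type=float, default=0.01)

# parse_known_args() tolerates extra platform-injected flags the script
# does not declare, which plain parse_args() would reject.
args, _ = parser.parse_known_args()

print(f"training with learning_rate={args.learning_rate}")
```

Using `parse_known_args()` rather than `parse_args()` keeps the script robust when the platform appends flags the script does not know about.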