AI Data Acceleration Engine Several key challenges must be addressed when using Kubernetes for AI and big data workloads, including excessive latency, inefficient bandwidth usage, insufficient data management, fragmented storage interfaces, and a lack of intelligent …
AI Data Acceleration: Fluid Overview, AI Data Acceleration Engine
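As a hedged sketch of how a Fluid-based data acceleration engine caches remote data close to compute, the open-source Fluid project pairs a Dataset with a cache runtime. The resource names, the S3 mount point, and the cache quota below are illustrative assumptions, not values from this document:

```yaml
# Illustrative open-source Fluid resources (data.fluid.io/v1alpha1).
# Names, mount point, and cache quota are placeholder assumptions.
apiVersion: data.fluid.io/v1alpha1
kind: Dataset
metadata:
  name: demo-data                 # hypothetical name
spec:
  mounts:
    - mountPoint: s3://example-bucket/train   # placeholder remote data source
      name: train
---
apiVersion: data.fluid.io/v1alpha1
kind: AlluxioRuntime
metadata:
  name: demo-data                 # must match the Dataset name
spec:
  replicas: 1
  tieredstore:
    levels:
      - mediumtype: MEM           # cache hot data in memory
        path: /dev/shm
        quota: 2Gi                # illustrative cache size
```

Once the Dataset is bound, pods that mount it read through the cache tier instead of fetching from remote storage on every access, which is how the latency and bandwidth challenges above are mitigated.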
CCE AI Suite (NVIDIA GPU) Add-on Overview CCE AI Suite (NVIDIA GPU) is a device management add-on that supports GPUs in containers. To use GPU nodes in a cluster, this add-on must be installed.
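After the add-on is installed, workloads consume GPUs through the standard Kubernetes extended-resource mechanism. A minimal sketch, in which the pod name, image, and GPU count are illustrative:

```yaml
# Minimal pod requesting one GPU via the standard extended resource.
apiVersion: v1
kind: Pod
metadata:
  name: cuda-demo                 # hypothetical name
spec:
  containers:
    - name: cuda
      image: nvidia/cuda:12.2.0-base-ubuntu22.04   # illustrative image
      command: ["nvidia-smi"]     # print GPU info, then exit
      resources:
        limits:
          nvidia.com/gpu: 1       # request one whole GPU
```

The `nvidia.com/gpu` resource is advertised by the device plugin that this add-on manages; without the add-on, the scheduler has no GPU resource to allocate.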
CCE AI Suite (Ascend NPU) Add-on Overview CCE AI Suite (Ascend NPU) is a device management add-on that supports NPUs in containers. After this add-on is installed, you can create AI-accelerated nodes to quickly and efficiently process inference and image recognition.
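NPU workloads follow the same extended-resource pattern as GPUs. A minimal sketch; the resource name `huawei.com/ascend-1980` below is an assumption for Ascend 910 nodes and may differ for other NPU models, and the pod name and image are placeholders:

```yaml
# Minimal pod requesting one Ascend NPU (resource name is an assumption).
apiVersion: v1
kind: Pod
metadata:
  name: npu-demo                  # hypothetical name
spec:
  containers:
    - name: infer
      image: example.com/infer:latest   # placeholder inference image
      resources:
        limits:
          huawei.com/ascend-1980: 1     # assumed extended-resource name; verify for your NPU model
```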
CCE AI Suite (NVIDIA GPU) Release History CCE regularly updates the CCE AI Suite (NVIDIA GPU) add-on to enhance features, optimize performance, and fix bugs, ultimately improving user experience and system stability.
CCE AI Suite (Ascend NPU) Release History CCE regularly updates the CCE AI Suite (Ascend NPU) add-on to enhance features, optimize performance, and fix bugs, ultimately improving user experience and system stability.
Cloud Native AI: Cloud Native AI Suite Overview, AI Workload Scheduling, AI Task Management, AI Data Acceleration, AI Service Deployment
CCE AI Suite (NVIDIA GPU) Check Items: CCE AI Suite (NVIDIA GPU) is involved in the upgrade, which may affect GPU driver installation when a GPU node is created. Solution: You must configure the driver for CCE AI Suite (NVIDIA GPU) yourself.
CCE AI Suite (NVIDIA GPU) Exceptions Check Items: Check whether the CCE AI Suite (NVIDIA GPU) add-on involved in the upgrade affects GPU driver installation when a GPU node is created. Solution: You must configure the driver for CCE AI Suite (NVIDIA GPU) yourself.
AI Inference Gateway Add-on With the rapid development of large language models (LLMs) and AI inference services, cloud native AI teams struggle with increasingly complex inference traffic management.
AI Inference Framework Add-on Introduction AI Inference Framework is a cloud native add-on for full lifecycle management of AI models.
Cloud Native AI Add-ons: CCE AI Suite (NVIDIA GPU), CCE AI Suite (Ascend NPU)
AI Performance-based Scheduling: DRF, Gang
AI and machine learning inherently involve a large number of computing-intensive tasks. Kubernetes is a preferred tool for developers building AI platforms because of its excellent capabilities in resource management, application orchestration, and O&M monitoring.
AI Service Deployment: AI Inference Framework Add-on, AI Inference Gateway Add-on, LeaderWorkerSet Add-on, kagent Add-on
AI Task Management: Kubeflow Add-on, Kuberay
AI Task Management CCE integrates the cloud native AI engines Kubeflow and Kuberay to deliver robust AI development support.
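As a hedged sketch of the kind of task these engines manage, the open-source Kubeflow Training Operator lets a distributed training job be declared as a PyTorchJob; the job name, image, and replica counts below are illustrative assumptions:

```yaml
# Illustrative distributed training job (open-source Kubeflow Training Operator).
# Name, image, and replica counts are placeholder assumptions.
apiVersion: kubeflow.org/v1
kind: PyTorchJob
metadata:
  name: train-demo                # hypothetical name
spec:
  pytorchReplicaSpecs:
    Master:
      replicas: 1
      template:
        spec:
          containers:
            - name: pytorch       # container must be named "pytorch"
              image: example.com/train:latest   # placeholder training image
    Worker:
      replicas: 2                 # two worker replicas, illustrative
      template:
        spec:
          containers:
            - name: pytorch
              image: example.com/train:latest
```

The operator wires the master and worker pods together (distributed environment variables, restart handling), so the manifest declares only the topology, not the coordination logic.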