Querying an APIGroup (/apis/networking.cci.io)
Function
This API is used to query an APIGroup (/apis/networking.cci.io).
Calling Method
For details, see Calling APIs.
Querying an APIGroup (/apis/rbac.authorization.k8s.io)
Function
This API is used to query an APIGroup (/apis/rbac.authorization.k8s.io) and obtain information about the group.
Calling Method
For details, see Calling APIs.
Querying an APIGroup (/apis/batch)
Function
This API is used to query an APIGroup (/apis/batch).
Calling Method
For details, see Calling APIs.
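The APIGroup queries above all follow the same GET pattern. The sketch below is a minimal, non-authoritative example, assuming a Kubernetes-style REST endpoint and token-based authentication with an X-Auth-Token header; the endpoint URL and token value are placeholders, not values from this page.

# Minimal sketch of querying an APIGroup such as /apis/batch or /apis/networking.cci.io.
import requests

API_ENDPOINT = "https://cci.example.com"   # placeholder endpoint, not a documented address
TOKEN = "<IAM token>"                      # placeholder; obtain a real token through IAM

def get_api_group(group_path: str) -> dict:
    """GET /apis/<group> and return the APIGroup object as a dict."""
    resp = requests.get(
        API_ENDPOINT + group_path,
        headers={"X-Auth-Token": TOKEN, "Content-Type": "application/json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

group = get_api_group("/apis/batch")
# An APIGroup response lists the group name and its available versions.
print(group.get("name"), [v.get("version") for v in group.get("versions", [])])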
A pod is the smallest and simplest unit in the Kubernetes object model that you create or deploy. A pod encapsulates one or more containers, storage resources, a unique network IP address, and options that govern how the container(s) should run.
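As an illustration of that object model, the sketch below builds a single-container pod with the official Kubernetes Python client (the kubernetes package); the pod name, labels, image, and resource values are illustrative assumptions, not values from this page.

# A minimal single-container pod object; names and image are illustrative only.
from kubernetes import client

pod = client.V1Pod(
    api_version="v1",
    kind="Pod",
    metadata=client.V1ObjectMeta(name="nginx-demo", labels={"app": "nginx-demo"}),
    spec=client.V1PodSpec(
        containers=[
            client.V1Container(
                name="nginx",
                image="nginx:alpine",
                ports=[client.V1ContainerPort(container_port=80)],
                resources=client.V1ResourceRequirements(
                    requests={"cpu": "250m", "memory": "256Mi"},
                    limits={"cpu": "500m", "memory": "512Mi"},
                ),
            )
        ]
    ),
)

# With a configured kubeconfig, the pod could then be created like this:
# from kubernetes import config
# config.load_kube_config()
# client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)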
[Table fragment: per-component resource formulas for the profile-controller, proxy, and resource-syncer/bursting-resource-syncer components, of the form (C + 1,000)/6,000 × 1,000, (C + 400)/1,200 × 1,024, (P + 2,000)/12,000 × 1,000, and so on; the table header and the definitions of C and P are not included in this excerpt.]
Model Name | Minimum Flavor | GPU
DeepSeek-R1, DeepSeek-V3 | p2s.16xlarge.8 | V100 (32 GiB) × 8 GPUs × 8 nodes
DeepSeek-R1, DeepSeek-V3 | p2v.16xlarge.8 | V100 (16 GiB) × 8 GPUs × 16 nodes
DeepSeek-R1, DeepSeek-V3 | pi2.4xlarge.4 | T4 (16 GiB) × 8 GPUs × 16 nodes
Manually Deploying a DeepSeek-R1 or DeepSeek-V3 Model Using SGLang and Docker on Multi-GPU
CREATE TABLE myschema.mytable (firstcol int);
Insert data into the table.
INSERT INTO myschema.mytable values (100);
View data in the table.
SELECT * FROM myschema.mytable;
   | firstcol
---+----------
 1 |      100
Update data in the table.
The server looks at the X-Forwarded-For header, then the X-Real-Ip header, and finally request.RemoteAddr (in that order) to obtain the client IP address.
versions | Array of strings | versions are the API versions that are available.
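A minimal sketch of the lookup order described above, assuming plain dict-style access to the request headers; the helper name and sample addresses are illustrative.

def resolve_client_ip(headers: dict, remote_addr: str) -> str:
    """Return the client IP using X-Forwarded-For, then X-Real-Ip, then the remote address."""
    forwarded_for = headers.get("X-Forwarded-For")
    if forwarded_for:
        # X-Forwarded-For may hold a comma-separated chain; the first entry is the original client.
        return forwarded_for.split(",")[0].strip()
    real_ip = headers.get("X-Real-Ip")
    if real_ip:
        return real_ip.strip()
    return remote_addr

print(resolve_client_ip({"X-Forwarded-For": "203.0.113.7, 10.0.0.2"}, "10.0.0.2"))  # prints 203.0.113.7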
Figure 1 Initializing a custom model
In the Initialize Custom Model dialog box, set the following parameters:
VPC: Select vpc-fg (192.168.x.x/16).
Subnet: Select subnet-fg (192.168.x.x/24).
File System Type: Select SFS Turbo.
File System: Select sfs-turbo-fg.
Public services, such as Elastic Cloud Server (ECS), Elastic Volume Service (EVS), Object Storage Service (OBS), Virtual Private Cloud (VPC), Elastic IP (EIP), and Image Management Service (IMS), are shared within the same region.
Changing Node Specifications (Discarded)
Function
This API is used to modify the node specifications of a cluster. It can only change the specifications of ess nodes (data nodes).
For details about how to obtain the value, see How to Obtain Parameters in the API URI.
nodepool_id | Yes | String | Node pool ID.
Request Parameters
Table 2 Request header parameters
Parameter | Mandatory | Type | Description
Content-Type | Yes | String | Message body type (format).
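As a rough illustration of how these parameters are used, the sketch below sends a request with the nodepool_id path parameter and the Content-Type header; the full request URI is not shown in this excerpt, so the endpoint and path here are assumptions, as is the X-Auth-Token authentication header.

import requests

endpoint = "https://cce.example.com"  # placeholder endpoint
# Hypothetical path layout; consult the API URI section of the documentation for the real path.
path = "/api/v3/projects/{project_id}/clusters/{cluster_id}/nodepools/{nodepool_id}"
url = endpoint + path.format(
    project_id="<project_id>", cluster_id="<cluster_id>", nodepool_id="<nodepool_id>"
)

resp = requests.get(
    url,
    headers={
        "Content-Type": "application/json",  # message body type (format)
        "X-Auth-Token": "<IAM token>",       # assumption: token-based authentication
    },
    timeout=10,
)
print(resp.status_code)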
Cluster Creation
Table 5 Different cluster creation modes
Cloud Container Engine (CCE) | Cloud Container Instance (CCI)
Configure basic information (name, region, networking, and compute) > Create a worker node > Configure the cluster > Create a workload.
podAffinity | io.k8s.api.core.v1.PodAffinity object | Describes pod affinity scheduling rules (e.g., co-locate this pod in the same node, zone, etc. as some other pod(s)).
podAntiAffinity | io.k8s.api.core.v1.PodAntiAffinity object | Describes pod anti-affinity scheduling rules (e.g., avoid putting this pod in the same node, zone, etc. as some other pod(s)).
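As a small illustration of a pod anti-affinity rule, the sketch below uses the Kubernetes Python client to keep pods labeled app=web off nodes that already run a pod with that label; the label and topology key are illustrative choices, not values from this page.

# Required anti-affinity: do not schedule onto a node that already runs a pod with app=web.
from kubernetes import client

anti_affinity = client.V1Affinity(
    pod_anti_affinity=client.V1PodAntiAffinity(
        required_during_scheduling_ignored_during_execution=[
            client.V1PodAffinityTerm(
                label_selector=client.V1LabelSelector(match_labels={"app": "web"}),
                topology_key="kubernetes.io/hostname",
            )
        ]
    )
)

# The object is attached to a pod template through spec.affinity, for example:
# client.V1PodSpec(containers=[...], affinity=anti_affinity)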
Model Name | Minimum Flavor | GPU | Nodes
DeepSeek-R1, DeepSeek-V3 | p2s.16xlarge.8 | V100 (32 GiB) × 8 | 8
DeepSeek-R1, DeepSeek-V3 | p2v.16xlarge.8 | V100 (16 GiB) × 8 | 16
DeepSeek-R1, DeepSeek-V3 | pi2.4xlarge.4 | T4 (16 GiB) × 8 | 16
Contact Huawei Cloud technical support to select GPU ECSs suitable for your deployment.
Status Codes
Status Code | Description
200 | The job for managing a node in the customized node pool in the cluster is delivered.
Error Codes
See Error Codes.
Parent Topic: Node Management
Support for Third-party JAR Packages on x86 and TaiShan Platforms
Question
How do I enable Spark2x to support third-party JAR packages (for example, custom UDF packages) when these packages have two versions (x86 and TaiShan)?
Answer
Use the hybrid solution.