Status Code  Description
200          The ring node information of the cluster topology is queried.
400          Request error.
401          Authentication failed.
403          You do not have required permissions.
404          No resources found.
500          Internal server error.
503          Service unavailable.
(20), product_type2 char(10), product_monthly_sales_cnt integer, product_comment_time date, product_comment_num integer, product_comment_content varchar(200) ) SERVER
This section describes how to optimize table performance in GaussDB(DWS) by properly designing the table structure (for example, by selecting the table model, table storage mode, compression level, distribution mode, distribution column, partitioned tables, and local clustering).
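As a sketch of these choices, the statement below shows where each one appears in a table definition. The table and column names are invented for illustration; ORIENTATION, COMPRESSION, DISTRIBUTE BY, and PARTITION BY correspond to the storage mode, compression level, distribution mode/column, and partitioning decisions discussed above.

```sql
-- Hypothetical column-store fact table; names are invented for illustration.
CREATE TABLE product_sales
(
    product_id   integer       NOT NULL,
    product_name varchar(200),
    sale_date    date,
    sale_amount  numeric(18,2)
)
WITH (ORIENTATION = column, COMPRESSION = middle)  -- storage mode and compression level
DISTRIBUTE BY HASH (product_id)                    -- distribution mode and distribution column
PARTITION BY RANGE (sale_date)                     -- partitioned table
(
    PARTITION p2023 VALUES LESS THAN ('2024-01-01'),
    PARTITION p2024 VALUES LESS THAN ('2025-01-01')
);
```

A hash distribution column should be high-cardinality and frequently used in joins; for small dimension tables, a replication table model may be preferable to hash distribution.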
Figure 1 Initializing a custom model

In the Initialize Custom Model dialog box, set the following parameters:
- VPC: Select vpc-fg (192.168.x.x/16).
- Subnet: Select subnet-fg (192.168.x.x/24).
- File System Type: Select SFS Turbo.
- File System: Select sfs-turbo-fg.
If you select the Synchronization for Existing Nodes (labels and taints) or Synchronization for Existing Nodes (labels) check box, the changes are also applied to the existing nodes in the node pool. The updated resource tag information in the node pool is then synchronized to its nodes.
(Optional) Public network bandwidth: If a yearly/monthly cluster is configured with an EIP billed by bandwidth, the bandwidth is billed by the Elastic Cloud Server (ECS) service in yearly/monthly mode.
In this case, all log messages generated by the Java Logger are redirected to the GaussDB(DWS) backend, where they are written to server logs or displayed on the user interface. MPPDB server logs record messages at the LOG, WARNING, and ERROR levels.
varchar(200), product_type1 varchar(20), product_type2 char(10), product_monthly_sales_cnt integer, product_comment_time date, product_comment_num integer, product_comment_content varchar(200) ) SERVER gsmpp_server OPTIONS ( LOCATION 'obs://OBS bucket name/input_data/
You can query the cluster flavor by referring to Querying Node Types.

Response Parameters
None

Example Request
Expand the cluster disk capacity to 200 GB on a single node.
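A sketch of what such a request might look like. The endpoint path and the new_size field are assumptions for illustration only; check the exact URI and request schema against the API reference before use.

```
POST /v1.0/{project_id}/clusters/{cluster_id}/expand-instance-storage

{
  "new_size": 200
}
```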
For details about how to obtain the value, see How to Obtain Parameters in the API URI.

nodepool_id  Yes  String  Node pool ID.

Request Parameters

Table 2 Request header parameters
Parameter     Mandatory  Type    Description
Content-Type  Yes        String  Message body type (format).
Changing Node Specifications (Discarded)

Function
This API is used to change the node specifications of a cluster. Only the specifications of ess nodes (data nodes) can be changed.
For more information about the node flavors supported by GaussDB(DWS) and their prices, see the GaussDB(DWS) pricing details.
varchar(200) ) SERVER obs_server OPTIONS ( format 'orc', foldername '/mybucket/demo.db/product_info_orc/', encoding 'utf8', totalrows '10' ) DISTRIBUTE BY ROUNDROBIN; Create an OBS foreign table that contains partition columns.
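A sketch of an OBS foreign table with a partition column follows. The bucket path, column names, and the choice of product_comment_time as the partition column are assumptions for illustration; in GaussDB(DWS), a partition column on an OBS/HDFS foreign table is declared with PARTITION BY ... AUTOMAPPED.

```sql
-- Sketch only: the folder path and the partition column are assumed.
CREATE FOREIGN TABLE product_info_orc_par
(
    product_id           integer,
    product_name         varchar(200),
    product_comment_time date
)
SERVER obs_server
OPTIONS (
    format 'orc',
    foldername '/mybucket/demo.db/product_info_orc_par/',
    encoding 'utf8'
)
DISTRIBUTE BY ROUNDROBIN
PARTITION BY (product_comment_time) AUTOMAPPED;
```

The partition column must also appear in the column list, and AUTOMAPPED maps the directory-style partitions under the folder to the declared column.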
Support for Third-party JAR Packages on x86 and TaiShan Platforms

Question: How do I enable Spark2x to support third-party JAR packages (for example, custom UDF packages) that have both x86 and TaiShan versions?

Answer: Use the hybrid solution.
integer, product_comment_content varchar(200) ) SERVER hdfs_server_8f79ada0_d998_4026_9020_80d6de2692ca OPTIONS ( format 'orc', foldername '/user/hive/warehouse/product_info_orc/', compression 'snappy', version '0.12' ) WRITE ONLY;