Improving HBase Real-Time Write Efficiency Scenario This applies where data needs to be written to HBase in real time, or to large-scale, consecutive put scenarios. Prerequisites The HBase put or delete interface can be used to save data to HBase.
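Submitting puts in bulk rather than one record at a time is the usual way to raise real-time write throughput. Below is a minimal sketch of that client-side batching pattern; the `Sink` interface is a hypothetical stand-in for HBase's `Table.put(List<Put>)` so the sketch stays self-contained, and the batch-size threshold is an illustrative assumption, not a documented value.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the batching pattern behind efficient real-time HBase writes:
// buffer records and hand them to the sink in bulk instead of issuing
// one put per record. Sink is NOT part of the HBase API; in real code
// the flush would call Table.put(List<Put>).
class BatchedWriter {
    interface Sink {
        void flush(List<String> batch);
    }

    private final List<String> buffer = new ArrayList<>();
    private final int batchSize;
    private final Sink sink;

    BatchedWriter(int batchSize, Sink sink) {
        this.batchSize = batchSize;
        this.sink = sink;
    }

    // Buffer one record; submit the whole batch once the threshold is hit.
    void put(String record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Submit any remaining buffered records (call this before shutdown).
    void flush() {
        if (!buffer.isEmpty()) {
            sink.flush(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```

The same idea underlies HBase's own `BufferedMutator`, which buffers and flushes mutations for the caller.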
You can use this node to create and delete HDFS files and folders and to grant permissions on HDFS files and folders. Parameter Description Table 1 describes the parameters used on the FS Action node.
If you click DELETE PATH+ to add an HDFS path to be deleted, this parameter cannot be left empty. Otherwise, the /user/{Submit user name} directory in HDFS is deleted by default, which may cause other tasks to run abnormally. Click in the upper right corner of the Oozie editor.
Delete the schema web: DROP SCHEMA web; If the schema sales exists, delete it: DROP SCHEMA IF EXISTS sales; Delete the schema test_drop in cascade mode, so that the tb_web table that exists in test_drop is dropped along with the schema: DROP SCHEMA test_drop CASCADE;
Accessing Hive Data Sources Using HSBroker Description This section describes how to use HSBroker to connect to HetuEngine, assemble SQL statements, and send them to HetuEngine for execution, so as to add, delete, modify, and query Hive data sources. The Java sample code is in the class JDBCExampleBroker.
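The snippet below sketches the generic JDBC flow such a sample follows: build a connection URL, open a Connection, and execute SQL through a Statement. The `jdbc:presto://` URL scheme, host, port, and table name are illustrative assumptions, not the documented HSBroker values; take the real driver class and URL format from the cluster's client configuration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

class JdbcSketch {
    // Pure helper: assemble a JDBC URL. The "jdbc:presto://" scheme and
    // catalog/schema path layout are assumptions for illustration only.
    static String buildUrl(String host, int port, String catalog, String schema) {
        return "jdbc:presto://" + host + ":" + port + "/" + catalog + "/" + schema;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint; replace with the broker address and port
        // from the cluster client configuration.
        String url = buildUrl("broker.example.com", 29860, "hive", "default");
        // try-with-resources closes the connection, statement, and result
        // set even if the query throws.
        try (Connection conn = DriverManager.getConnection(url, "username", null);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM hive_table LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```

The same open-execute-close shape applies whether the statement is a query, an insert, or a DDL statement; only `executeQuery` vs. `executeUpdate` changes.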
Accessing Hive Data Sources Using HSFabric Description This section describes how to use HSFabric to connect to HetuEngine, assemble SQL statements, and send them to HetuEngine for execution, so as to add, delete, modify, and query Hive data sources. The Java sample code is in the class JDBCExampleFabric.
Development Plan Scenarios You can customize JDBCServer clients and use JDBC connections to create data tables, load data into them, query them, and delete them. Data Preparation Upload the data file to HDFS.
For clusters with Kerberos authentication enabled, only users in the stormadmin or storm group can query all topologies. Run the following command to query topologies: storm list Run the following command to delete a Storm topology: storm kill Topology name Parent topic: Using Storm
loader-tool Usage Example Scenario loader-tool can create, update, query, and delete a connector or job, either from a job template or by setting parameters directly. This section describes how to use loader-tool in job template mode.
DELETE FROM dataorigin2 WHERE date_p="2021-03-31"; Parent topic: Using Impala
Accessing Hive Data Sources Using HSFabric This section describes how to use HSFabric to connect to HetuEngine, assemble SQL statements, and send them to HetuEngine for execution, so as to add, delete, modify, and query Hive data sources. The Python sample begins as follows:
import jaydebeapi
driver = "io.XXX.jdbc.XXXDriver"
Quickly Using Kafka to Produce and Consume Data Scenario You can create, query, and delete topics on a cluster client. Set user permissions by referring to Kafka User Permissions.
Using Kafka to Produce and Consume Data Scenario You can use the MRS cluster client to create, query, and delete Kafka topics. You can also log in to the Kafka UI to view the consumption information of the current cluster.
The following describes how to use SQL statements to create tables, insert data, query data, and delete tables. Prerequisites The HBase client has been installed. For example, the client has been installed in the /opt/client directory.