On the configuration page that is displayed, click Delete+ to delete a directory, for example, hdfs://hacluster/user/admin/examples/output-data/spark_workflow. Click PROPERTIES+ and add the sharelib used by Oozie.
Delete a table:
kudu table delete KuduMaster_instance_IP1:7051,KuduMaster_instance_IP2:7051,KuduMaster_instance_IP3:7051 Table_name
To obtain the IP addresses of the KuduMaster instances, choose Components > Kudu > Instances on the cluster details page.
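A minimal sketch of assembling the deletion command above; the KuduMaster IP addresses and the table name are hypothetical placeholders, so the command is printed with echo rather than executed:

```shell
# Hypothetical KuduMaster addresses -- replace with the real IPs obtained
# from Components > Kudu > Instances on the cluster details page.
MASTERS="192.168.0.1:7051,192.168.0.2:7051,192.168.0.3:7051"
TABLE="example_table"   # hypothetical table name

# Print the command so this sketch runs without a live Kudu cluster.
echo "kudu table delete ${MASTERS} ${TABLE}"
```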
Delete a topic:
kafka-topics.sh --delete --zookeeper ZooKeeper_cluster_service_IP:2181/kafka --topic topicname
Query all topics:
kafka-topics.sh --zookeeper ZooKeeper_cluster_service_IP:2181/kafka --list
After the deletion command is executed, empty topics will be deleted.
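A sketch of the two commands with the placeholders filled in; the ZooKeeper address and topic name below are hypothetical, so the commands are printed rather than executed:

```shell
# Hypothetical ZooKeeper service address and topic name.
ZK="192.168.0.11:2181/kafka"
TOPIC="test_topic"

# Printed with echo so the sketch runs without a Kafka cluster.
echo "kafka-topics.sh --delete --zookeeper ${ZK} --topic ${TOPIC}"
echo "kafka-topics.sh --zookeeper ${ZK} --list"
```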
…Replicated Table After It Is Deleted from ClickHouse
Failed to Create a ClickHouse Replicated Table
"Table is in readonly mode" Is Reported When Data Is Imported to ClickHouse
ClickHouse Frequently Fails to Be Connected with Error Message Indicating Incorrect Password
Failed to Delete…
Answer You do not have the permission to delete directories on OBS. As a result, Hive tables cannot be deleted. In this case, modify the custom IAM policy of the agency to grant Hive the permission to delete tables in the OBS directory.
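For illustration, a minimal sketch of a custom IAM policy statement allowing object deletion on OBS; the action names here are assumptions and should be verified against the actions listed in the IAM console before use:

```json
{
  "Version": "1.1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "obs:object:DeleteObject"
      ]
    }
  ]
}
```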
If an ECS is deleted, the ECS OS is modified or reinstalled, or the ECS specifications are modified on the ECS console, MRS will automatically identify and delete the node. You can log in to the MRS console and restore the deleted node through scale-out.
Table 1 APIs
public String run(Properties conf): Runs a job.
public void start(String jobId): Starts the specified job.
public String submit(Properties conf): Submits a job.
public void kill(String jobId): Kills the specified job.
public void suspend(String jobId): Suspends the specified job.
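The operations above also map onto the Oozie command-line client; a sketch with a hypothetical server URL and job ID, printed with echo so it runs without an Oozie server:

```shell
# Assumed Oozie server URL and a hypothetical job ID.
OOZIE_URL="https://oozie-host:21003/oozie"
JOB_ID="0000001-200000000000000-oozie-W"

echo "oozie job -oozie ${OOZIE_URL} -config job.properties -run"   # run(conf)
echo "oozie job -oozie ${OOZIE_URL} -start ${JOB_ID}"              # start(jobId)
echo "oozie job -oozie ${OOZIE_URL} -kill ${JOB_ID}"               # kill(jobId)
echo "oozie job -oozie ${OOZIE_URL} -suspend ${JOB_ID}"            # suspend(jobId)
```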
If you do not import this JAR package, you need to delete hbase.rpc.controllerfactory.class from the hbase-site.xml configuration file. Parent topic: FAQs
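For reference, the entry to look for in hbase-site.xml has the following shape; the value element is omitted here because it depends on your existing client configuration:

```xml
<!-- Remove this property from hbase-site.xml if the JAR is not imported. -->
<property>
  <name>hbase.rpc.controllerfactory.class</name>
  <!-- value unchanged from your existing configuration -->
</property>
```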
Procedure Manually delete data residuals. Parent topic: Using HDFS
Procedure It takes time for the DataNode to delete the corresponding blocks after files are deleted. If the NameNode is restarted immediately, it checks the block information reported by all DataNodes.
To delete a component or cluster connected to OBS (including storage-compute decoupling and cold-hot data separation scenarios), you must also delete the service data on OBS.
DELETE: requests a server to delete specified resources, for example, to delete an object.
HEAD: requests a server resource header.
PATCH: requests a server to update the partial content of a specified resource. If the resource does not exist, a new resource will be created.
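The methods above can be exercised with curl; a sketch against a hypothetical endpoint, printed with echo so it runs without a server:

```shell
# Hypothetical resource URL for illustration only.
ENDPOINT="https://example.com/v1/resources/42"

echo "curl -X DELETE ${ENDPOINT}"                          # delete the resource
echo "curl -I ${ENDPOINT}"                                 # HEAD: header only
echo "curl -X PATCH -d '{\"name\":\"new\"}' ${ENDPOINT}"   # partial update
```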
Delete directories that do not comply with the disk plan from the DataNode data directory. Choose Components > HDFS > Instances. In the instance list, click the DataNode instance on the node for which the alarm is generated.
Handling Procedure Check the disk capacity and delete unnecessary files. On the homepage of FusionInsight Manager, choose Cluster > Services > HDFS.
If a policy is no longer used, click the delete icon to delete it. Parent topic: Configuration Examples for Ranger Permission Policy
Answer After a project is imported, to manually delete the automatically loaded JAR packages, perform the following steps: In IntelliJ IDEA, choose File > Project Structure.... Select Libraries and select the JAR packages that are automatically imported.