MapReduce Service (MRS) - "Failed to CREATE_FILE" Exception in Retried Tasks When Inserting Data into a Dynamic Partition Table: Question

Date: 2023-11-01 16:25:25

Question

When data is inserted into a dynamic partition table and shuffle files are corrupted on a large scale during the shuffle phase (for example, due to disks going offline or node failures), why does the retried task throw a "Failed to CREATE_FILE" exception?

2016-06-25 15:11:31,323 | ERROR | [Executor task launch worker-0] | Exception in task 15.0 in stage 10.1 (TID 1258) | org.apache.spark.Logging$class.logError(Logging.scala:96)org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException): Failed to CREATE_FILE /user/hive/warehouse/testdb.db/web_sales/.hive-staging_hive_2016-06-25_15-09-16_999_8137121701603617850-1/-ext-10000/_temporary/0/_temporary/attempt_201606251509_0010_m_000015_0/ws_sold_date=1999-12-17/part-00015 for DFSClient_attempt_201606251509_0010_m_000015_0_353134803_151 on 10.1.1.5 because this file lease is currently owned by DFSClient_attempt_201606251509_0010_m_000015_0_-848353830_156 on 10.1.1.6
support.huaweicloud.com/cmpntguide-lts-mrs/mrs_01_2013.html