Apache Hive jobs not working - container failed, exitCode=-1000, could not obtain block

yizd12fk, posted 2021-06-29 in Hive

I just installed Hadoop with the Hortonworks Data Platform. I have three machines running CentOS 7. One of the three runs the Ambari server, the NameNode, HiveServer2, and so on; the other two run only the clients for those services.
Every time I try to execute a Hive query that requires a MapReduce job, it fails. All TaskAttempts in every job fail with a BlockMissingException, with the diagnostics set to "Container failed, exitCode=-1000. Could not obtain block".
For example:

hive> select count(*) from pgc;
Query ID = root_20160510184153_51d881b2-fbb5-47d3-8a06-9d62f51950e1
Total jobs = 1
Launching Job 1 out of 1

Status: Running (Executing on YARN cluster with App id application_1462904248344_0007)

--------------------------------------------------------------------------------
        VERTICES      STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
--------------------------------------------------------------------------------
Map 1                 FAILED      9          0        0        9      14       0
Reducer 2             KILLED      1          0        0        1       0       0
--------------------------------------------------------------------------------
VERTICES: 00/02  [>>--------------------------] 0%    ELAPSED TIME: 80.05 s
--------------------------------------------------------------------------------
Status: Failed
Vertex failed, vertexName=Map 1, vertexId=vertex_1462904248344_0007_1_00, diagnostics=[Task failed, taskId=task_1462904248344_0007_1_00_000001, diagnostics=[TaskAttempt 0 failed, info=[Container container_e49_1462904248344_0007_02_000003 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 1 failed, info=[Container container_e49_1462904248344_0007_02_000009 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 2 failed, info=[Container container_e49_1462904248344_0007_02_000013 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 3 failed, info=[Container container_e49_1462904248344_0007_02_000018 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]]], Task failed, taskId=task_1462904248344_0007_1_00_000003, diagnostics=[TaskAttempt 0 failed, info=[Container container_e49_1462904248344_0007_02_000005 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 1 failed, info=[Container container_e49_1462904248344_0007_02_000008 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 2 failed, info=[Container container_e49_1462904248344_0007_02_000014 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]], TaskAttempt 3 failed, info=[Container container_e49_1462904248344_0007_02_000017 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar
        at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896)
        at java.io.DataInputStream.read(DataInputStream.java:100)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
        at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267)
        at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361)
        at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
        at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

]]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:2 killedTasks:7, Vertex vertex_1462904248344_0007_1_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]
...

Has anyone seen this problem before? Thanks in advance.

Answer 1 (mtb9vblg):

In a Linux terminal, su to the hdfs user and run hadoop dfsadmin -report to confirm that no blocks are corrupt.
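As a rough sketch, that check could look like the following. Note that the hdfs fsck step is my addition, not part of the answer above; it reports missing or corrupt blocks under a given path, such as the hive-exec jar named in the stack trace:

su - hdfs
# Cluster-wide health report: live DataNodes, missing/under-replicated blocks
hadoop dfsadmin -report
# Assumption (not mentioned in the answer): fsck pinpoints bad blocks under a path
hdfs fsck /user/root/.hiveJars -files -blocks -locations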
Judging from the logs, I think you are running the query as the root user; try setting up proxy-user (impersonation) permissions for root. Log in to the Ambari UI and go to HDFS --> Configs --> Advanced --> Custom core-site.
Update or add the properties:
hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*
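For reference, the resulting entries in core-site.xml would look roughly like this (a sketch; with Ambari you would normally set them through the UI as described above, then restart HDFS and the dependent services so the change takes effect):

<!-- Allow the root user to impersonate users connecting from any host... -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<!-- ...and belonging to any group -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>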
