Yarn logs + blk_ does not exist or is not under Construction

btqmn9zl  posted 2021-05-27  in Hadoop

We have a Spark cluster with the following layout (all machines run Linux Red Hat):

2 name-node machines 
2 resource-manager machines 
8 data-node machines ( HDFS file-system)

We are running a Spark Streaming application.
In the YARN logs we can see errors such as the following:

yarn logs -applicationId application_xxxxxxxx -log_files ALL

---2019-11-08T10:12:20.040 ERROR [][][] [org.apache.spark.scheduler.LiveListenerBus] Listener EventLoggingListener threw an exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): BP-484874736-172.2.45.23-8478399929292:blk_1081495827_7755233 does not exist or is not under Construction
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkUCBlock(FSNamesystem.java:6721)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.updateBlockForPipeline(FSNamesystem.java:6789)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.updateBlockForPipeline(NameNodeRpcServer.java:931)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.updateBlockForPipeline(ClientNamenodeProtocolServerSideTranslatorPB.java:979)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)

From the log we can see that blk_1081495827_7755233 does not exist or is not under Construction.
But what is the reason YARN complains about this?

n9vozmp41#

According to the YARN documentation,

yarn logs -applicationId <Application ID> -am ALL

would be more appropriate.
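As a sketch of how these `yarn logs` invocations differ, the commands below show fetching only the ApplicationMaster logs versus all container log files. The application ID here is a made-up placeholder; replace it with a real ID taken from `yarn application -list`:

```shell
# Find the application ID first (the ID used below is hypothetical)
yarn application -list -appStates ALL

# Fetch only the ApplicationMaster's logs, for all AM attempts
yarn logs -applicationId application_1573201111111_0001 -am ALL

# Fetch every log file from every container (output can be very large)
yarn logs -applicationId application_1573201111111_0001 -log_files ALL
```

The `-am ALL` form is usually enough for driver-side errors like the Spark `LiveListenerBus` exception above, since the Spark driver runs inside the ApplicationMaster in yarn-cluster mode.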
