Processing Java exceptions with awk

bz4sfanl  posted 2021-05-29 in Hadoop

I'd like to know the best way to collect all the distinct exceptions that occur in the log files.
The entries look like this:
/var/log/hadoop-hdfs/hadoop-cmf-hdfs-DATANODE-aaa.log.out.5

2017-08-30 13:54:44,561 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DataNode{data=FSDataset{dirpath='[/var/hadoop/sdc/dn/current, /var/hadoop/sdd/dn/current]'}, localName='host.tld:50010', datanodeUuid='aaaaaa-6828-44dd-xxx-bbbbb', xmitsInProgress=0}:Exception transfering block BP-111111-172.16.9.110-1471873778315:blk_1086251547_12532682 to mirror 172.16.9.8:50010: org.apache.hadoop.hdfs.protocol.datatransfer.InvalidEncryptionKeyException: Can't re-compute encryption key for nonce, since the required block key (keyID=-111) doesn't exist. Current key: 123

Or this one:

2016-08-22 15:50:09,706 ERROR org.apache.hadoop.hdfs.server.datanode.DiskBalancer: Disk Balancer is not enabled.

I want to print the exception when there is one, and otherwise the remaining fields after $4.
Current code:

awk '/ERROR/{print $3" "$4}' /var/log/hadoop-hdfs/*.log.out | sort | uniq -c

Is there a simple way to look at all the fields after $4, print the field containing the exception when there is one, and otherwise print everything?
The current output looks like this:

     93 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
   8403 ERROR org.apache.hadoop.hdfs.server.datanode.DiskBalancer:

Expected output:

xx ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Broken pipe
     yy ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.net.SocketTimeoutException
   8403 ERROR org.apache.hadoop.hdfs.server.datanode.DiskBalancer: Disk Balancer is not enabled.

Sample input:

2016-08-22 16:35:42,502 ERROR org.apache.hadoop.hdfs.server.datanode.DiskBalancer: Disk Balancer is not enabled.
2016-08-22 16:36:42,506 ERROR org.apache.hadoop.hdfs.server.datanode.DiskBalancer: Disk Balancer is not enabled.
2016-08-22 16:37:29,515 ERROR org.apache.hadoop.hdfs.server.datanode.DiskBalancer: Disk Balancer is not enabled.
2016-08-22 16:37:29,530 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: RECEIVED SIGNAL 15: SIGTERM
2018-01-06 13:45:18,899 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: hostname:50010:DataXceiver error processing WRITE_BLOCK operation  src: /172.16.9.68:53477 dst: /172.16.9.6:50010
2018-01-06 14:04:05,176 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DataNode{data=FSDataset{dirpath='[/var/hadoop/sdc/dn/current, /var/hadoop/sdd/dn/current]'}, localName='hostname:50010', datanodeUuid='uuid', xmitsInProgress=11}:Exception transfering block BP-1301709078-172.16.9.110-1471873778315:blk_1095601056_21903280 to mirror 172.16.9.34:50010: java.net.SocketTimeoutException: 65000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/172.16.9.6:37439 remote=/172.16.9.34:50010]
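
One possible sketch based on the sample input above (it assumes every Java exception shows up in the message as a dotted class name ending in "Exception" or "Error", which holds for these lines but is not guaranteed for the full logs): strip the timestamp, level and logger prefix, keep only the exception class name when one is found, and keep the whole remaining message otherwise.

awk '/ERROR/ {
    msg = $0
    # drop the timestamp, log level and logger name ($1-$4)
    sub(/^[^ ]+ [^ ]+ ERROR [^ ]+ */, "", msg)
    # if the message contains something that looks like a Java exception class,
    # keep only that class name so identical exceptions collapse under uniq -c
    if (match(msg, /[A-Za-z0-9_.$]+(Exception|Error)/)) {
        msg = substr(msg, RSTART, RLENGTH)
    }
    print $3, $4, msg
}' /var/log/hadoop-hdfs/*.log.out | sort | uniq -c

This prints only the exception class name, dropping trailing detail such as "Broken pipe", which is what keeps the uniq -c counts grouped; if that short detail is wanted as well, the match would have to capture a little more of the message.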
