Hadoop DFSClient getFileInfo(): an existing connection was forcibly closed by the remote host

qvsjd97n posted on 2021-07-13 in Hadoop

I am trying to copy a file from my local machine to Hadoop on an HDP (Hortonworks Data Platform) cluster, but the call fails at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116) with the connection exception below. Do I need to change any Hadoop configuration on the cluster?
Sample program:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

String hdfsUrl = "hdfs://ambari-agent-1:8020";
Configuration configuration = new Configuration();
// Resolve DataNodes by hostname rather than their cluster-internal IPs.
configuration.set("dfs.client.use.datanode.hostname", "true");
FileSystem fs = FileSystem.get(new URI(hdfsUrl), configuration);
Path srcPath = new Path("C:\\HadoopPlugin\\SampleData\\salesrecord.csv");
Path destPath = new Path("hdfs://ambari-agent-1:8020/user/hadoop/testResult/");
fs.copyFromLocalFile(srcPath, destPath);
System.out.println("Copied file successfully");

Exception stack trace:

Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: An existing connection was forcibly closed by the remote host; Host Details : local host is: "DT01070442/192.168.44.7"; destination host is: "ambari-agent-1":8020; 
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:776)
at org.apache.hadoop.ipc.Client.call(Client.java:1480)
at org.apache.hadoop.ipc.Client.call(Client.java:1407)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
at org.apache.hadoop.fs.FileUtil.checkDest(FileUtil.java:496)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:348)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1965)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1933)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1898)
at practiceHadoop.CopyFile.main(CopyFile.java:19)
Caused by: java.io.IOException: An existing connection was forcibly closed by the remote host
at sun.nio.ch.SocketDispatcher.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:197)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:515)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1079)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:974)

44u64gxh1#

This usually means that the remote side closed the connection (typically by sending a TCP/IP RST packet).
Possible causes:
- network problems: router, firewall, cabling
- server (host) problems: insufficient resources, operating system issues
- service problems: the NameNode, JournalNode, or DataNode services
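To rule out the plain network causes first, a minimal sketch like the following (host and port taken from the question; the 5-second timeout is an arbitrary choice for this sketch) checks whether the NameNode RPC port is reachable at all. If the TCP connect succeeds but the RPC call is still reset, the problem is more likely on the service side:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class NameNodePortCheck {
    public static void main(String[] args) {
        // Host and port taken from the question; adjust for your cluster.
        String host = "ambari-agent-1";
        int port = 8020;
        try (Socket socket = new Socket()) {
            // 5-second connect timeout, an arbitrary choice for this sketch.
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println("TCP connect to " + host + ":" + port + " succeeded");
        } catch (IOException e) {
            System.out.println("Cannot reach " + host + ":" + port + " - " + e.getMessage());
        }
    }
}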
To find the cause of the problem, you first have to check the logs at each step.
Start with the Hadoop service logs.
The Hadoop NameNode log can usually be found at /var/hadoop/hadoop-hdfs-namenode-.log (the location and name may differ).
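If you have shell access to the NameNode host, grep works fine for this; purely as an illustration in the same language as the question, here is a small sketch that pulls ERROR and WARN lines from a log file. The path below is a hypothetical stand-in, since the exact file name depends on your installation:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class NameNodeLogScan {
    public static void main(String[] args) throws IOException {
        // Hypothetical path: substitute the actual NameNode log file on your host.
        try (Stream<String> lines = Files.lines(Paths.get("/var/hadoop/hadoop-hdfs-namenode.log"))) {
            lines.filter(l -> l.contains("ERROR") || l.contains("WARN"))
                 .forEach(System.out::println);
        }
    }
}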
Good luck!
