Cannot connect to HDFS from my local machine

x3naxklr  posted 2021-05-29 in Hadoop

I am writing a simple program to read/write data on HDFS. I cannot connect from my local machine to an HDFS instance installed on a remote machine. I get the following exception:

18/08/19 16:47:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time)
18/08/19 16:47:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time)
18/08/19 16:47:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[GetGroups], valueName=Time)
18/08/19 16:47:45 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
18/08/19 16:47:45 DEBUG security.Groups:  Creating new Groups object
18/08/19 16:47:45 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
18/08/19 16:47:45 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
18/08/19 16:47:45 DEBUG util.NativeCodeLoader: java.library.path=/Users/rabbit/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
18/08/19 16:47:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/08/19 16:47:45 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
18/08/19 16:47:45 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
18/08/19 16:47:45 DEBUG util.Shell: Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
    at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:302)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:327)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:257)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:234)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:749)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:607)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2748)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2740)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2606)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    at com.rabbit.hdfs.io.ReadFileDataToConsole.main(ReadFileDataToConsole.java:22)
18/08/19 16:47:45 DEBUG util.Shell: setsid is not available on this machine. So not using it.
18/08/19 16:47:45 DEBUG util.Shell: setsid exited with exit code 0
18/08/19 16:47:45 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
18/08/19 16:47:45 DEBUG security.UserGroupInformation: hadoop login
18/08/19 16:47:45 DEBUG security.UserGroupInformation: hadoop login commit
18/08/19 16:47:45 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: rabbit
18/08/19 16:47:45 DEBUG security.UserGroupInformation: UGI loginUser:rabbit (auth:SIMPLE)
18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = 
18/08/19 16:47:46 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
18/08/19 16:47:46 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@12405818
18/08/19 16:47:46 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@7ff2a664
18/08/19 16:47:46 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
18/08/19 16:47:46 DEBUG ipc.Client: The ping interval is 60000 ms.
18/08/19 16:47:46 DEBUG ipc.Client: Connecting to /192.168.143.150:54310
18/08/19 16:47:46 DEBUG ipc.Client: closing ipc connection to 192.168.143.150/192.168.143.150:54310: Connection refused
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:606)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:700)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1463)
    at org.apache.hadoop.ipc.Client.call(Client.java:1382)
    at org.apache.hadoop.ipc.Client.call(Client.java:1364)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:225)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1165)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1155)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1145)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:268)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:235)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:228)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1318)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:293)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
    at com.rabbit.hdfs.io.ReadFileDataToConsole.main(ReadFileDataToConsole.java:29)
18/08/19 16:47:46 DEBUG ipc.Client: IPC Client (775931202) connection to /192.168.143.150:54310 from rabbit: closed
18/08/19 16:47:46 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@7ff2a664
18/08/19 16:47:46 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@7ff2a664
18/08/19 16:47:46 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@7ff2a664
18/08/19 16:47:46 DEBUG ipc.Client: Stopping client
Exception in thread "main" java.net.ConnectException: Call From rabbit/127.0.0.1 to 192.168.143.150:54310 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1415)
    at org.apache.hadoop.ipc.Client.call(Client.java:1364)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:225)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1165)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1155)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1145)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:268)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:235)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:228)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1318)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:293)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
    at com.rabbit.hdfs.io.ReadFileDataToConsole.main(ReadFileDataToConsole.java:29)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:606)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:700)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1463)
    at org.apache.hadoop.ipc.Client.call(Client.java:1382)
    ... 24 more

I used this link as my reference. I have been racking my brain trying to debug further, but I can't figure out where I went wrong. Can anyone help me sort this out?
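Before digging into Hadoop configuration, it helps to confirm what the IPC client is actually seeing at the TCP level. The log shows a plain `Connection refused` on `192.168.143.150:54310`, which means the host answered but nothing was listening on that port. A small probe (a standalone sketch, not part of the original program; the host and port are taken from the log above) can distinguish the three common cases:

```java
import java.io.IOException;
import java.net.ConnectException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    // Classify a TCP connect attempt the same way the Hadoop IPC client sees it.
    static String probe(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return "open";        // something is listening (possibly the NameNode)
        } catch (ConnectException e) {
            return "refused";     // host reachable, but no process bound to the port
        } catch (IOException e) {
            return "unreachable"; // timeout, firewall drop, routing problem, or bad address
        }
    }

    public static void main(String[] args) {
        // Host and port taken from the log above; adjust for your cluster.
        System.out.println(probe("192.168.143.150", 54310, 3000));
    }
}
```

A `refused` result here usually means the NameNode RPC port is bound only to the loopback interface on the server, not that the service is down.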


tez616oj1#

Add the IP address 192.168.143.150 to /etc/hosts, like this:

192.168.143.150 192.168.143.150

127.0.0.1 localhost

This helped me resolve the problem. Thanks :)
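As an aside on why a hosts-file entry fixes a connection error: if the server's hostname resolves to 127.0.0.1, the NameNode can end up binding its RPC port only on the loopback interface, so remote clients get `Connection refused` even though the service is up. It is also worth checking `core-site.xml` on the NameNode host. A sketch of the relevant property (the value below assumes the address and port from the question; `fs.defaultFS` is the Hadoop 2.x name, `fs.default.name` the deprecated 1.x one):

```xml
<!-- core-site.xml on the NameNode host: the filesystem URI should use an
     address remote clients can reach, not localhost/127.0.0.1. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.143.150:54310</value>
  </property>
</configuration>
```

The client side must use the same URI when calling `FileSystem.get(...)`, either via this property or an explicit `hdfs://192.168.143.150:54310` URI in code.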
