java.net.ConnectException when transferring a file from local to HDFS

jxct1oxe · posted 2021-05-27 in Hadoop

I have looked through several related posts but have not found a solution.
Goal: run a MapReduce job.
So far: mapper.py and reducer.py have been written and tested; they work fine.
Guest OS: Ubuntu 20
Host OS: Windows
Hypervisor: VMware 15
Tutorial followed: https://www.michael-noll.com/tutorials/writing-an-hadoop-mapreduce-program-in-python/
Process: first created mapper.py and reducer.py and tested them against a simple file, then tried to upload the CSV input file to the Hadoop file system [using put or copyFromLocal] — a sketch of these steps follows below.
Error: java.net.ConnectException while transferring the file from local to HDFS.
Screenshot of the error: [image not available]
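For reference, the test and upload steps follow the tutorial's pattern. A minimal sketch of what this looks like (file and directory names here are placeholders, not my exact paths):

# Sanity-check the streaming scripts locally with a plain Unix pipe
cat sample_input.txt | python3 mapper.py | sort | python3 reducer.py

# Upload the CSV input to HDFS (either command; this is the step that fails)
hdfs dfs -put input.csv /user/hduser/input.csv
hdfs dfs -copyFromLocal input.csv /user/hduser/input.csv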
I believe the HDFS data directory is /usr/local/hadoop/data [set via core-site.xml].
Basically, I cannot find the Hadoop directory: under "/" or "." there are only two folders, /tmp and /user. If I run hdfs fs -ls /, the error thrown is "No such file or directory".
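(I may be mixing up the command forms here; as far as I can tell, the standard invocations are the following, assuming the Hadoop binaries are on PATH:)

# Standard forms of the HDFS listing command
hdfs dfs -ls /
hadoop fs -ls /

# With no path argument, the listing defaults to /user/<current-user>,
# which fails with "No such file or directory" until that directory is created
hdfs dfs -ls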

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>


<!-- Put site-specific property overrides in this file. -->

<configuration>
    <!-- Default filesystem URI; fs.default.name is the deprecated
         alias of fs.defaultFS, but both are honored -->
    <property>
        <name>fs.default.name</name>
        <value>hdfs://127.0.0.1:9000</value>
    </property>

    <!-- Base directory under which HDFS keeps its NameNode/DataNode storage -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/local/hadoop/data</value>
    </property>
</configuration>

The above is my core-site.xml.
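Given that the config points the client at hdfs://127.0.0.1:9000, I assume the ConnectException means nothing is listening there. A sketch of the checks I understand are standard for this (assumes the JDK tools and Hadoop's sbin scripts are on PATH):

# List running Hadoop daemons; NameNode and DataNode should both appear
jps

# Confirm which filesystem URI the client actually resolves
hdfs getconf -confKey fs.defaultFS

# Restart HDFS if the NameNode is not running
stop-dfs.sh
start-dfs.sh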
Today is day six on this. Please help me get past this last piece so I can run the MapReduce job without errors. Thanks in advance!!
