Maven artifactId for hadoop-core in Hadoop 2.2.0

qv7cva1a  posted 2021-06-04  in Hadoop

I am migrating my application from Hadoop 1.0.3 to Hadoop 2.2.0. My Maven build declares hadoop-core as a dependency, but Hadoop 2.2.0 no longer ships a hadoop-core artifact. I tried replacing it with hadoop-client and hadoop-common, but I still get this error about ant.filters. Can anyone suggest which artifact to use?

Previous config:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.0.3</version>
</dependency>

New config:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
</dependency>

Error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project event: Compilation failure: Compilation failure:

[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[27,36] package org.apache.tools.ant.filters does not exist


[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[180,59] cannot find symbol

[ERROR] symbol: class StringInputStream

[ERROR] location: class com.intel.event.EventContext
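The missing package, org.apache.tools.ant.filters, belongs to Ant, which hadoop-core 1.x pulled in transitively; hadoop-client 2.2.0 does not. If (as the error suggests) StringInputStream is only used to wrap a String as an InputStream, one workaround that avoids the Ant dependency entirely is to use the JDK directly. The class and method names below are hypothetical, for illustration only:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class StringToStream {
    // Drop-in replacement for the typical use of
    // org.apache.tools.ant.filters.StringInputStream:
    // wrap a String as an InputStream using only the JDK,
    // so no Ant jar is needed on the compile classpath.
    static InputStream toStream(String s) {
        return new ByteArrayInputStream(s.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws Exception {
        InputStream in = toStream("event");
        System.out.println(new String(in.readAllBytes(), StandardCharsets.UTF_8));
    }
}
```

With a change like this, neither the old hadoop-core nor an explicit Ant dependency is required for the code to compile.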

d5vmydt91#

I tried these artifacts in my sample wordcount project and they work fine:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
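Since the missing package in the compile error is org.apache.tools.ant.filters (part of Ant, which hadoop-core 1.x brought in transitively), another option is to keep the Hadoop 2.2.0 artifacts and declare Ant explicitly instead of mixing in the old hadoop-core; the version below is an assumption, pick one matching your code:

```xml
<dependency>
    <groupId>org.apache.ant</groupId>
    <artifactId>ant</artifactId>
    <version>1.9.4</version>
</dependency>
```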

7uzetpgm2#

Our application depends mainly on the HDFS API. When we migrated to Hadoop 2.x, we were surprised by how the dependencies changed, so we added them back one at a time. Today we depend on the following core libraries:

hadoop-annotations-2.2.0
hadoop-auth-2.2.0
hadoop-common-2.2.0
hadoop-hdfs-2.2.0
hadoop-mapreduce-client-core-2.2.0

Beyond these, we also depend on the test libraries. Depending on your needs, you may want to include hadoop-hdfs and hadoop-mapreduce-client-core along with hadoop-common in your dependencies.
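As a sketch, the library list above corresponds to POM entries like the following (all under groupId org.apache.hadoop):

```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-annotations</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-auth</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.2.0</version>
</dependency>
```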


aiqt4smr3#

The Maven dependencies are available from this link. As for hadoop-core: that was the artifact name used in Hadoop 1.x, and simply bumping its version to 2.x will not help. Moreover, using Hadoop 1.x dependencies against a Hadoop 2.x cluster produces errors such as:

Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4

So it is best avoided. I used the following dependencies with Hadoop:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.7.1</version>
</dependency>

You can try these.
