spark-submit error when running a Kafka Spark streaming jar

yyhrrdl8 · posted 2021-07-09 in Spark

I am trying to run my Kafka Spark streaming application with spark-submit. It is a Java Maven project, and I built a fat jar with the Maven assembly plugin. When I execute that jar with spark-submit, it fails with the error shown below. My pom.xml is as follows.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>consumer</groupId>
  <artifactId>consumer</artifactId>
  <version>0.0.1-SNAPSHOT</version>

  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.22.1</version>
      </plugin>

      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
            <configuration>
              <archive>
                <manifest>
                  <mainClass>consumer.consumer</mainClass>
                </manifest>
              </archive>
              <descriptorRefs>
                <descriptorRef>jar-with-dependencies</descriptorRef>
              </descriptorRefs>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

  <dependencies>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10 -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
      <version>2.4.6</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/com.influxdb/influxdb-client-java -->
    <dependency>
      <groupId>com.influxdb</groupId>
      <artifactId>influxdb-client-java</artifactId>
      <version>1.11.0</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/com.influxdb/influxdb-client-core -->
    <dependency>
      <groupId>com.influxdb</groupId>
      <artifactId>influxdb-client-core</artifactId>
      <version>1.11.0</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.11</artifactId>
      <version>2.4.6</version>
      <scope>provided</scope>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.4.6</version>
      <scope>provided</scope>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.12</version>
    </dependency>
  </dependencies>
</project>
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/kafka/clients/consumer/Consumer
    at org.apache.spark.streaming.kafka010.ConsumerStrategies$.Subscribe(ConsumerStrategy.scala:299)
    at org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe(ConsumerStrategy.scala)
    at consumer.consumer.consume(consumer.java:156)
    at consumer.consumer.main(consumer.java:80)
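
The missing class (`org.apache.kafka.clients.consumer.Consumer`) lives in the kafka-clients artifact, so a quick first check is whether that artifact actually made it into the assembled fat jar. A minimal sketch, assuming the default `jar-with-dependencies` output name (adjust the path to the actual build output):

```shell
# List the fat jar's contents and look for the kafka-clients consumer classes.
# The jar name below is the assembly plugin's default for this pom and is an
# assumption, not taken from the original post.
jar tf target/consumer-0.0.1-SNAPSHOT-jar-with-dependencies.jar \
  | grep 'org/apache/kafka/clients/consumer/Consumer'
```

If grep prints nothing, the kafka-clients classes were not packaged into the jar, which would explain the NoClassDefFoundError at runtime.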

I tried supplying all the dependencies on the classpath with --jars in spark-submit, but no luck.
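
For reference, that attempt would have looked roughly like the sketch below. All paths and the master URL are assumptions, not taken from the original post; the kafka-clients version is the one Spark 2.4.x's Kafka integration is built against:

```shell
# Hypothetical spark-submit invocation supplying kafka-clients via --jars.
spark-submit \
  --class consumer.consumer \
  --master local[*] \
  --jars /path/to/kafka-clients-2.0.0.jar \
  target/consumer-0.0.1-SNAPSHOT-jar-with-dependencies.jar
```

Note that --jars takes a comma-separated list, and each entry must be a path reachable from both the driver and the executors.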
I tried downgrading the Spark, Kafka, and Scala versions, but got the same error.
I tried using the same Scala version for the Kafka and Scala dependencies, but no luck.
I also tried matching the versions of the installed kafka and scala jars, but that did not fix it.
I am running spark-2.4.6-bin-hadoop2.7 and kafka_2.13-2.6.0 on a single machine. Am I missing something? I have also tried the answers to similar questions, but I still get the error. Any help is appreciated. Thanks!
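
Since spark-streaming-kafka-0-10_2.11:2.4.6 pulls kafka-clients in transitively, it can also help to see exactly which version Maven resolves. A sketch using the standard dependency plugin (the filter syntax is `groupId:artifactId`):

```shell
# Show only the kafka-clients entries in the project's dependency tree.
mvn dependency:tree -Dincludes=org.apache.kafka:kafka-clients
```

A mismatch between this resolved version and whatever jars are placed on the classpath manually (e.g. from the kafka_2.13-2.6.0 install) is one common source of NoClassDefFoundError in assembled Spark jobs.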
