Maven build error (Scala + Spark): object apache is not a member of package org

vktxenjb · posted 2021-05-27 in Spark

I've read several threads about this problem on StackOverflow and elsewhere (1st, 2nd, 3rd), but unfortunately they didn't help. Also, almost all of them describe the same problem with sbt, not Maven.
I also checked the Scala/Spark compatibility notes in the Spark documentation (here), and it seems I have the right versions (Scala 2.11.8 + Spark 2.2.0).
I'll describe my whole workflow, since I'm not sure which details will help pinpoint the root cause.
This is the code I'm trying to build.
pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>SparkWordCount</artifactId>
    <version>1.0-SNAPSHOT</version>
    <name>SparkWordCount</name>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>

        <plugins>
            <!-- mixed scala/java compile -->
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.3.1</version>
                <executions>
                    <execution>
                        <id>compile</id>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                        <phase>compile</phase>
                    </execution>
                    <execution>
                        <id>test-compile</id>
                        <goals>
                            <goal>testCompile</goal>
                        </goals>
                        <phase>test-compile</phase>
                    </execution>
                    <execution>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5.1</version>
                <configuration>
                    <source>1.7</source>
                    <target>1.7</target>
                </configuration>
            </plugin>
            <!-- for fatjar -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>3.1.0</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <finalName>uber-SparkWordCount-1.0-SNAPSHOT</finalName>
                    <appendAssemblyId>false</appendAssemblyId>
                </configuration>
                <executions>
                    <execution>
                        <id>assemble-all</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>3.2.0</version>
                <configuration>
                    <archive>
                        <manifest>
                            <addClasspath>true</addClasspath>
                            <mainClass>fully.qualified.MainClass</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.0</version>
            <scope>provided</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.2.0</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>
</project>
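
One detail that may or may not matter: this pom declares no explicit scala-library dependency and no <scalaVersion> for the scala-maven-plugin. A typical Scala 2.11.8 setup would usually add something like the following — a sketch of common practice, not code from this project:

<!-- explicit Scala standard library, matching the _2.11 Spark artifacts -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.8</version>
</dependency>

and, inside the scala-maven-plugin:

<!-- pin the compiler version instead of relying on plugin defaults -->
<configuration>
    <scalaVersion>2.11.8</scalaVersion>
</configuration>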

SparkWordCount.scala:

import org.apache.spark.sql.SparkSession
import org.apache.log4j.Logger
import org.apache.log4j.Level

object SparkWordCount {
  def main(args: Array[String]) {

    Logger.getLogger("org").setLevel(Level.ERROR)

    val spark = SparkSession
      .builder()
      .appName("SparkSessionZipsExample")
      .master("local")
      .getOrCreate()

    val myRdd = spark.sparkContext.parallelize(List(1,2,3,4,5,6,7))
    myRdd.foreach(number => println("Lol, this is number = " + number))
  }
}

When I run the main() method, it works fine:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Lol, this is number = 1
Lol, this is number = 2
Lol, this is number = 3
Lol, this is number = 4
Lol, this is number = 5
Lol, this is number = 6
Lol, this is number = 7

Then I tried to use a Spark SQL DataFrame:

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.log4j.Logger
import org.apache.log4j.Level

object SparkWordCount {
  def main(args: Array[String]) {

    Logger.getLogger("org").setLevel(Level.ERROR)

    val spark = SparkSession
      .builder()
      .appName("SparkSessionZipsExample")
      .master("local")
      .getOrCreate()

    val airports: DataFrame = spark.read.csv("C:\\Users\\Евгений\\Desktop\\DATALEARN\\airports.csv")
    airports.show()
  }
}

This code throws an error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Logger
    at SparkWordCount$.main(SparkWordCount.scala:10)
    at SparkWordCount.main(SparkWordCount.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Logger
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 2 more

Process finished with exit code 1

I'm not sure exactly why this works, but I fixed it by changing the run configuration and checking the "Include dependencies with 'Provided' scope" checkbox. (Presumably this is because provided-scope dependencies are available at compile time but excluded from the runtime classpath, so an IDE run hits NoClassDefFoundError unless that option is enabled.)

After this, my Spark SQL code also works fine:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
+---------+--------------------+-------------+-----+-------+--------+----------+
|      _c0|                 _c1|          _c2|  _c3|    _c4|     _c5|       _c6|
+---------+--------------------+-------------+-----+-------+--------+----------+
|IATA_CODE|             AIRPORT|         CITY|STATE|COUNTRY|LATITUDE| LONGITUDE|
|      ABE|Lehigh Valley Int...|    Allentown|   PA|    USA|40.65236| -75.44040|
|      ABI|Abilene Regional ...|      Abilene|   TX|    USA|32.41132| -99.68190|
|      ABQ|Albuquerque Inter...|  Albuquerque|   NM|    USA|35.04022|-106.60919|
|      ABR|Aberdeen Regional...|     Aberdeen|   SD|    USA|45.44906| -98.42183|
|      ABY|Southwest Georgia...|       Albany|   GA|    USA|31.53552| -84.19447|
|      ACK|Nantucket Memoria...|    Nantucket|   MA|    USA|41.25305| -70.06018|
|      ACT|Waco Regional Air...|         Waco|   TX|    USA|31.61129| -97.23052|
|      ACV|      Arcata Airport|Arcata/Eureka|   CA|    USA|40.97812|-124.10862|
|      ACY|Atlantic City Int...|Atlantic City|   NJ|    USA|39.45758| -74.57717|
|      ADK|        Adak Airport|         Adak|   AK|    USA|51.87796|-176.64603|
|      ADQ|      Kodiak Airport|       Kodiak|   AK|    USA|57.74997|-152.49386|
|      AEX|Alexandria Intern...|   Alexandria|   LA|    USA|31.32737| -92.54856|
|      AGS|Augusta Regional ...|      Augusta|   GA|    USA|33.36996| -81.96450|
|      AKN| King Salmon Airport|  King Salmon|   AK|    USA|58.67680|-156.64922|
|      ALB|Albany Internatio...|       Albany|   NY|    USA|42.74812| -73.80298|
|      ALO|Waterloo Regional...|     Waterloo|   IA|    USA|42.55708| -92.40034|
|      AMA|Rick Husband Amar...|     Amarillo|   TX|    USA|35.21937|-101.70593|
|      ANC|Ted Stevens Ancho...|    Anchorage|   AK|    USA|61.17432|-149.99619|
|      APN|Alpena County Reg...|       Alpena|   MI|    USA|45.07807| -83.56029|
+---------+--------------------+-------------+-----+-------+--------+----------+
only showing top 20 rows
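
As an aside: the header row (IATA_CODE, AIRPORT, ...) shows up as a data row because csv() reads files without a header by default. If named columns are wanted, the standard header option of the DataFrameReader handles it — a small sketch using the same file:

// treat the first line as column names instead of data
val airports: DataFrame = spark.read
  .option("header", "true")
  .csv("C:\\Users\\Евгений\\Desktop\\DATALEARN\\airports.csv")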

However, when I run the Maven "clean -> package" commands, I get several errors, all related to the "org.apache" package:

D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:3: error: object apache is not a member of package org
import org.apache.spark.sql.{DataFrame, SparkSession}
           ^
D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:4: error: object apache is not a member of package org
import org.apache.log4j.Logger
           ^
D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:5: error: object apache is not a member of package org
import org.apache.log4j.Level
           ^
D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:10: error: not found: value Logger
    Logger.getLogger("org").setLevel(Level.ERROR)
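
For what it's worth, a generic way to check whether Maven resolves the Spark artifacts at all (a standard debugging step, not something from the linked threads) is to print the dependency tree:

mvn dependency:tree

With the pom above, spark-core_2.11 and spark-sql_2.11 should be listed there with scope "provided", which still keeps them on the compile classpath — so their absence from the tree would point to a resolution problem rather than a scope problem.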

Here are details about my environment; some of them may help:
I use IntelliJ IDEA
Windows 10
Java 8 installed

Scala 2.11.8 installed and working:

Spark 2.2.0 installed and working via spark-shell:

I downloaded winutils, created a "bin" folder in it, copied winutils.exe for Hadoop 2.7.1, and pasted it into "bin":

These are my HADOOP_HOME and JAVA_HOME environment variables:

I also set a SPARK2_HOME environment variable (not sure whether that's actually needed):

And this is my PATH:

Previously I also had HADOOP_HOME in PATH, but I removed it on the advice of a related StackOverflow thread.
Thanks in advance for your help!
UPDATE 1: my project structure:

UPDATE 2
In case it matters: I don't have a MAVEN_HOME environment variable, so there is no MAVEN_HOME in my PATH. I ran "clean -> package" through the IntelliJ IDEA Maven interface.
UPDATE 3
The list of libraries in the project structure:

UPDATE 4: information about the Scala SDK:

3hvapo4f 1#

Remove <scope>provided</scope> from the pom.xml. Then go to the File tab, click the Invalidate Caches / Restart option, and try again.
For the Maven issue — try the mvn clean scala:compile -DdisplayCmd=true -DrecompileMode=all package command.
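
Side note: if you keep provided scope instead (the usual choice when the jar is submitted to a cluster, since Spark supplies those classes at runtime), the assembled jar would normally be launched through spark-submit rather than java -jar. A sketch, assuming the SparkWordCount object and the finalName from the pom above:

spark-submit --class SparkWordCount --master local target/uber-SparkWordCount-1.0-SNAPSHOT.jar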
