java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.analysis.FunctionRegistry.clone()Lorg/apache/spark/sql/catalyst/analysis/FunctionRegistry;

bz4sfanl · posted 2021-07-14 in Spark

The build works fine locally with mvn clean package: it runs some Scala unit tests before producing the jar, without any issues. But when I try to do the same thing on Jenkins, I run into the error mentioned above. I have tried everything I can think of, even removing the extra Hive metastore dependency from the pom, but I cannot get past this error.
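
A NoSuchMethodError at runtime usually means the Spark classes on the test classpath are a different version from the ones the code was compiled against, so one quick check on the Jenkins agent (a suggestion, not something verified here) is to list which Spark artifacts Maven actually resolves:

mvn dependency:tree -Dincludes=org.apache.spark

If more than one version of spark-catalyst shows up in that tree, it would explain the mismatch.
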
Below is my pom file:

<properties>
    <maven.compiler.source>1.6</maven.compiler.source>
    <maven.compiler.target>1.6</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.version>2.11.11</scala.version>
    <spark.version>2.3.0</spark.version>
    <java.version>1.8</java.version>
    <sonar.maven.version>3.4.1.1168</sonar.maven.version>
    <scoverage.plugin.version>1.4.0</scoverage.plugin.version>
    <sonar.scala.coverage.reportPaths>${project.build.directory}/scoverage.xml</sonar.scala.coverage.reportPaths>
</properties>

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.1.18.Final</version>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
        <exclusions>
            <exclusion>
                <groupId>io.netty</groupId>
                <artifactId>netty</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe</groupId>
        <artifactId>config</artifactId>
        <version>1.3.2</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.scala-logging</groupId>
        <artifactId>scala-logging_2.11</artifactId>
        <version>3.8.0</version>
    </dependency>
    <dependency>
        <groupId>org.scalatest</groupId>
        <artifactId>scalatest_2.11</artifactId>
        <version>3.0.4</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-email</artifactId>
        <version>1.5</version>
    </dependency>
</dependencies>
<reporting>
    <plugins>
        <plugin>
            <groupId>org.scoverage</groupId>
            <artifactId>scoverage-maven-plugin</artifactId>
            <version>${scoverage.plugin.version}</version>

            <reportSets>
                <reportSet>
                    <reports>
                        <report>report-only</report>
                    </reports>
                </reportSet>
            </reportSets>
        </plugin>
    </plugins>
</reporting>

<build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
        <plugin>
            <!-- see http://davidb.github.com/scala-maven-plugin -->
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.1.3</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                    <configuration>
                        <args>
                            <arg>-dependencyfile</arg>
                            <arg>${project.build.directory}/.scala_dependencies</arg>
                        </args>
                    </configuration>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <version>2.4</version>
            <configuration>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
                <archive>
                    <manifest>
                        <mainClass>com.your-package.MainClass</mainClass>
                    </manifest>
                </archive>
            </configuration>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.6.1</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.sonarsource.scanner.maven</groupId>
            <artifactId>sonar-maven-plugin</artifactId>
            <version>${sonar.maven.version}</version>
        </plugin>
        <plugin>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest-maven-plugin</artifactId>
            <version>1.0</version>
            <configuration>
                <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
                <junitxml>.</junitxml>
                <filereports>WDFTestSuite.txt</filereports>
                <skipTests>false</skipTests>
            </configuration>
            <executions>
                <execution>
                    <id>test</id>
                    <goals>
                        <goal>test</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>build-helper-maven-plugin</artifactId>
            <version>3.0.0</version>
            <executions>
                <execution>
                    <id>add-source</id>
                    <phase>generate-sources</phase>
                    <goals>
                        <goal>add-source</goal>
                    </goals>
                    <configuration>
                        <sources>
                            <source>src/main/scala</source>
                        </sources>
                    </configuration>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.scoverage</groupId>
            <artifactId>scoverage-maven-plugin</artifactId>
            <version>${scoverage.plugin.version}</version>
            <configuration>
                <minimumCoverage>5</minimumCoverage>
                <failOnMinimumCoverage>false</failOnMinimumCoverage>
                <additionalForkedProjectProperties>skipTests=false</additionalForkedProjectProperties>
            </configuration>
            <executions>
                <execution>
                    <goals>
                        <goal>check</goal>
                    </goals>
                    <phase>prepare-package</phase>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
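
One thing worth noting about the pom above: FunctionRegistry lives in spark-catalyst, which is pulled in transitively by spark-sql and spark-hive. If anything on the Jenkins classpath drags in a different spark-catalyst, this exact error appears. A sketch of how the version could be pinned in the existing dependencyManagement block, next to the netty-all entry (assuming 2.3.0 is the intended line everywhere; untested here):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-catalyst_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>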

The error on Jenkins:

21/03/16 15:30:01 INFO SharedState: Warehouse path is 'file:/home/ec2-user/workspace/myproject/src/scala/driver/spark-warehouse'.

***RUN ABORTED***

java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.analysis.FunctionRegistry.clone()Lorg/apache/spark/sql/catalyst/analysis/FunctionRegistry;
at org.apache.spark.sql.internal.BaseSessionStateBuilder.functionRegistry$lzycompute(BaseSessionStateBuilder.scala:98)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.functionRegistry(BaseSessionStateBuilder.scala:97)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1061)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:141)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:140)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:140)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:137)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$3.apply(SparkSession.scala:913)
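
To confirm which jar actually supplies FunctionRegistry at test time, a small diagnostic like the following could be dropped into any test running in the same JVM (a sketch; the println wording is illustrative):

// Prints the location of the jar that the class loader resolved the class from
val loc = classOf[org.apache.spark.sql.catalyst.analysis.FunctionRegistry]
  .getProtectionDomain.getCodeSource.getLocation
println(s"FunctionRegistry loaded from: $loc")

If the printed path is not the spark-catalyst_2.11 jar at version 2.3.0, something else on the classpath is shadowing it.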

Below is the test file (it fails even when there are no actual test cases in it):

package com.driver.writedriver_test

import org.apache.spark.sql._
import org.scalatest.{FlatSpec, Matchers}

class WriterTestSpec extends FlatSpec with Matchers {
  val spark = SparkSession.builder().master("local[*]").appName("readFileAndCreateDF").getOrCreate()

  behavior of "Write Dataframe to Hive"

}
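
As an aside, a slightly more conventional ScalaTest setup builds the session lazily and stops it when the suite finishes; a minimal sketch along those lines (assuming ScalaTest 3.0.x and the same dependencies as above):

package com.driver.writedriver_test

import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

class WriterTestSpec extends FlatSpec with Matchers with BeforeAndAfterAll {

  // Build the session lazily so a classpath problem surfaces inside a test,
  // not during class construction
  lazy val spark: SparkSession = SparkSession.builder()
    .master("local[*]")
    .appName("readFileAndCreateDF")
    .getOrCreate()

  override def afterAll(): Unit = {
    spark.stop() // release the local SparkContext once the suite is done
    super.afterAll()
  }

  behavior of "Write Dataframe to Hive"
}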
