spark-submit giving "main" java.lang.NoSuchMethodError: scala.Some.value()Ljava/lang/Object

tzxcd3kk · posted 2021-07-12 in Spark

I am trying to do a spark-submit to check compatibility with some simple Scala code:

object MangoPop {
  def main(args: Array[String]): Unit = {
    println("Hi there")

    val p = Some("pop")
    p match {
      case Some(a) => println("Matched " + a)
      case _ => println("00000009")
    }
  }
}

Scala version: 2.12.5, Spark version: 2.4.6
Currently, after building the jar and running it via spark-submit 2.4.7, it gives:

Hi there
Exception in thread "main" java.lang.NoSuchMethodError: scala.Some.value()Ljava/lang/Object;
    at MangoPop$.main(MangoPop.scala:9)
    at MangoPop.main(MangoPop.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

From Maven, Spark 2.4.6 appears to support Scala 2.12: https://mvnrepository.com/artifact/org.apache.spark/spark-core
But when run with spark-submit 3.0.2, it works fine.
What is Spark 2.4.6 missing?
(Also tried Spark 2.4.7, although there are no actual Spark dependencies or Spark code, only Scala.)
Running spark-submit as:

~/Downloads/spark-2.4.7-bin-hadoop2.7/bin$  ./spark-submit --class=Test myprojectLocation..../target/scala-2.12/compatibility-check_2.12-0.1.jar
/spark-2.4.7-bin-hadoop2.7/bin$ ./spark-submit --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.7
      /_/

Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_282
Branch HEAD
Compiled by user prashant on 2020-09-08T05:22:44Z
Revision 14211a19f53bd0f413396582c8970e3e0a74281d
Url https://prashant:Sharma1988%235031@gitbox.apache.org/repos/asf/spark.git
Type --help for more information.

Also tried the builds from https://archive.apache.org/dist/spark/spark-2.4.6/
but could not find one for Scala 2.12.
Can we explicitly specify which Scala version to use when running spark-submit or spark-shell? The configuration seems to support both, but it picks the lower one, 2.11.
This is the load-spark-env.cmd file:

rem Setting SPARK_SCALA_VERSION if not already set.

set ASSEMBLY_DIR2="%SPARK_HOME%\assembly\target\scala-2.11"
set ASSEMBLY_DIR1="%SPARK_HOME%\assembly\target\scala-2.12"
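Note that this script only chooses between assembly directories that already exist on disk (which matters for source builds); a prebuilt binary distribution ships a single Scala runtime, so setting SPARK_SCALA_VERSION there cannot switch 2.11 to 2.12. A quick way to see which Scala version a distribution was actually built with is to inspect its jars/ directory (a sketch; the path below is taken from the download in the question, adjust to your install):

```shell
# List the bundled scala-library jar to see the runtime's Scala version:
#   ls ~/Downloads/spark-2.4.7-bin-hadoop2.7/jars | grep scala-library
# For the 2.4.7 "bin-hadoop2.7" build this should print
# scala-library-2.11.12.jar, matching the --version output above.
# The binary (major.minor) version can be cut out of that filename:
jar="scala-library-2.11.12.jar"
scala_bin_ver=$(echo "$jar" | sed -E 's/scala-library-([0-9]+\.[0-9]+)\..*\.jar/\1/')
echo "$scala_bin_ver"   # prints 2.11
```

A jar compiled for 2.12 (the `_2.12` suffix in compatibility-check_2.12-0.1.jar) cannot run on that 2.11 runtime.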

Answer 1 (yrdbyhpb):

The problem is that your Spark runtime was built with Scala 2.11 ("Using Scala version 2.11.12"), while your code ( MangoPop$.main(MangoPop.scala:9) ) was compiled with Scala 2.12.5. Pattern-matching on Some compiles to a call to Some.value(), an accessor that exists in the Scala 2.12 standard library but not in 2.11 (where the field is named x), hence the NoSuchMethodError at runtime.
Make sure the Scala version your code is built against matches the Scala version of the Spark runtime.
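One concrete fix, sketched below as an sbt build (assuming sbt; the project name and version are inferred from the jar name in the question): pin scalaVersion to 2.11.x when targeting the prebuilt Spark 2.4.7 / Scala 2.11 distribution. Spark 2.4.x publishes artifacts for both Scala 2.11 and 2.12, and the `_2.11`/`_2.12` artifact suffix must match the runtime:

```scala
// build.sbt -- a sketch; names are taken from the jar in the question.
// The runtime that executes the jar must match the scalaVersion used
// to compile it.
name := "compatibility-check"
version := "0.1"

// For the prebuilt spark-2.4.7-bin-hadoop2.7 distribution (Scala 2.11.12):
scalaVersion := "2.11.12"

// Even though the code uses no Spark APIs, declaring spark-core as
// "provided" keeps the compile target aligned with the cluster; the %%
// operator appends the matching _2.11 suffix automatically.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.7" % "provided"
```

Alternatively, keep scalaVersion at 2.12.x and run against a Spark distribution built for Scala 2.12 (as the working spark-submit 3.0.2 run demonstrates).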
