Scala module 2.10.0 requires Jackson Databind version >= 2.10.0 and < 2.11.0

Asked by 6ie5vjzr on 2023-02-16 in Scala

I have an sbt project in which I want to write a test using ScalaTest and a shared Spark session. A few weeks ago the project started failing with this error:

java.lang.ExceptionInInitializerError
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
.....
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Scala module 2.10.0 requires Jackson Databind version >= 2.10.0 and < 2.11.0
    at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:61)
    at com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:46)

The test itself is very simple:

import org.apache.spark.sql.QueryTest.checkAnswer
import org.apache.spark.sql.Row
import org.apache.spark.sql.test.SharedSparkSession

class SparkTestSpec extends SharedSparkSession {
  import testImplicits._
  test("join - join using") {
    val df = Seq(1, 2, 3).toDF("int")

    checkAnswer(df, Row(1) :: Row(2) :: Row(3) :: Nil)
  }
}

And here is the sbt configuration:

ThisBuild / scalaVersion := "2.12.10"
val sparkVersion = "3.1.0"
val scalaTestVersion = "3.2.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion % Test,
  "org.apache.spark" %% "spark-sql" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test,
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-hive" % sparkVersion % Test,
  "org.apache.spark" %% "spark-hive" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-core" % sparkVersion % Test classifier "tests",
  "log4j" % "log4j" % "1.2.17",
  "org.slf4j" % "slf4j-log4j12" % "1.7.30",

  "org.scalatest" %% "scalatest" % scalaTestVersion % Test,
  "org.scalatestplus" %% "scalacheck-1-14" % "3.2.2.0",

)
Answer from vuv7lop3:

This is a very typical Jackson problem: the error is telling you that all of your dependencies must agree on a single Jackson version, and in your build they don't.
Usually it happens because Spark and some other library each pull in Jackson transitively at different versions.
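
If you want to confirm which jackson-databind actually wins on your classpath, you can print its version at runtime. This is only a small diagnostic sketch; com.fasterxml.jackson.databind.cfg.PackageVersion is Jackson's own version-reporting class, while the object name here is made up:

import com.fasterxml.jackson.databind.cfg.PackageVersion

// Prints the jackson-databind version actually loaded at runtime,
// so you can compare it against the 2.10.x range the Scala module expects.
object JacksonVersionCheck extends App {
  println(PackageVersion.VERSION)
}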
What you need to do is:

  • Run sbt dependencyTree to find out which libraries are pulling in Jackson and at which versions
  • Define a dependencyOverrides that forces the same Jackson version for all Jackson artifacts (which version exactly depends on compatibility with the other libraries that need it); a sketch follows this list
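
A minimal build.sbt sketch of such an override, pinning everything to 2.10.0 since that is the range the Scala module from the error message accepts; adjust the version to whatever your dependencyTree output tells you:

// Force a single Jackson version across all transitive dependencies.
val jacksonVersion = "2.10.0" // assumption: pick the version your build actually needs

dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % jacksonVersion,
  "com.fasterxml.jackson.core" % "jackson-databind" % jacksonVersion,
  "com.fasterxml.jackson.core" % "jackson-annotations" % jacksonVersion,
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % jacksonVersion
)

Note that the dependencyTree task ships with sbt 1.4+ out of the box; on older sbt versions it requires the sbt-dependency-graph plugin.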
