How do I create a single jar with sbt assembly on AWS EMR in Scala? Getting "deduplicate: different file contents found in the following" errors

cczfrluj  posted 2021-05-27  in Spark

I'm on an AWS EMR cluster I just stood up, with a Scala file that compiles and that I want to build into a single assembly jar. However, when I run sbt assembly, I hit deduplicate errors.
Per https://medium.com/@tedherman/compile-scala-on-emr-cb77610559f0, I initially had a symlink from lib to /usr/lib/spark/jars:

ln -s /usr/lib/spark/jars lib
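
For context: sbt's default unmanaged jar directory is the project's lib folder (the unmanagedBase setting), which is why this symlink puts all of the cluster's Spark jars on the build's classpath and, by default, into the assembly. A minimal sketch of that default, shown only to make the mechanism explicit:

unmanagedBase := baseDirectory.value / "lib"   // sbt's default: every jar in lib/ joins the classpath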

I'll note that my code passes sbt compile with or without it. However, I don't understand why the sbt assembly duplicate errors happen or how to resolve them. I'll also mention that I installed sbt in a bootstrap action, following the same article.
With the symlink in place
Some of the dedupes appear to be clear exact duplicates; for example:

[error] deduplicate: different file contents found in the following:
[error] /home/hadoop/.ivy2/cache/org.apache.parquet/parquet-jackson/jars/parquet-jackson-1.10.1.jar:shaded/parquet/org/codehaus/jackson/util/CharTypes.class
[error] /usr/lib/spark/jars/parquet-jackson-1.10.1-spark-amzn-1.jar:shaded/parquet/org/codehaus/jackson/util/CharTypes.class

Others appear to be competing versions:

[error] deduplicate: different file contents found in the following:
[error] /home/hadoop/.ivy2/cache/org.apache.spark/spark-core_2.11/jars/spark-core_2.11-2.4.3.jar:org/spark_project/jetty/util/MultiPartOutputStream.class
[error] /usr/lib/spark/jars/spark-core_2.11-2.4.5-amzn-0.jar:org/spark_project/jetty/util/MultiPartOutputStream.class

I don't understand why there are competing versions at all, whether they are there by default, or whether something I did introduced them.
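
For what it's worth, in each of these pairs the 2.4.3 jar comes from the local Ivy cache (a managed, transitive dependency), while the 2.4.5-amzn-0 jar is picked up from /usr/lib/spark/jars through the symlink. A common way to avoid bundling Spark at all is to declare it explicitly with the provided scope so sbt-assembly leaves it out of the fat jar; a minimal build.sbt sketch, assuming the cluster's Spark should be used at runtime (these module lines are illustrative, not from the original build):

// Sketch: keep Spark out of the assembly; the EMR cluster provides it at runtime.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.3" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.4.3" % "provided"
)
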
Without the symlink
I figured removing it might give me fewer problems; I still get dedupes, just fewer of them:

[error] deduplicate: different file contents found in the following:
[error] /home/hadoop/.ivy2/cache/org.apache.hadoop/hadoop-yarn-api/jars/hadoop-yarn-api-2.6.5.jar:org/apache/hadoop/yarn/factory/providers/package-info.class
[error] /home/hadoop/.ivy2/cache/org.apache.hadoop/hadoop-yarn-common/jars/hadoop-yarn-common-2.6.5.jar:org/apache/hadoop/yarn/factory/providers/package-info.class

Given that one is hadoop-yarn-api-2.6.5.jar and the other is hadoop-yarn-common-2.6.5.jar, I don't understand why the above counts as a duplicate. The names are different, so why the dedupe?
Others appear to be version conflicts:

[error] deduplicate: different file contents found in the following:
[error] /home/hadoop/.ivy2/cache/javax.inject/javax.inject/jars/javax.inject-1.jar:javax/inject/Named.class
[error] /home/hadoop/.ivy2/cache/org.glassfish.hk2.external/javax.inject/jars/javax.inject-2.4.0-b34.jar:javax/inject/Named.class

Some have the same file name but live in different paths/jars...

[error] deduplicate: different file contents found in the following:
[error] /home/hadoop/.ivy2/cache/org.apache.arrow/arrow-format/jars/arrow-format-0.10.0.jar:git.properties
[error] /home/hadoop/.ivy2/cache/org.apache.arrow/arrow-memory/jars/arrow-memory-0.10.0.jar:git.properties
[error] /home/hadoop/.ivy2/cache/org.apache.arrow/arrow-vector/jars/arrow-vector-0.10.0.jar:git.properties

And the same with these...

[error] deduplicate: different file contents found in the following:
[error] /home/hadoop/.ivy2/cache/org.apache.spark/spark-catalyst_2.11/jars/spark-catalyst_2.11-2.4.3.jar:org/apache/spark/unused/UnusedStubClass.class
[error] /home/hadoop/.ivy2/cache/org.apache.spark/spark-core_2.11/jars/spark-core_2.11-2.4.3.jar:org/apache/spark/unused/UnusedStubClass.class
[error] /home/hadoop/.ivy2/cache/org.apache.spark/spark-graphx_2.11/jars/spark-graphx_2.11-2.4.3.jar:org/apache/spark/unused/UnusedStubClass.class

Other info, just for reference
Imports in my Scala object:

import org.apache.spark.sql.SparkSession
import java.time.LocalDateTime
import com.amazonaws.regions.Regions
import com.amazonaws.services.secretsmanager.AWSSecretsManagerClientBuilder
import com.amazonaws.services.secretsmanager.model.GetSecretValueRequest
import org.json4s.{DefaultFormats, MappingException}
import org.json4s.jackson.JsonMethods._
import com.datarobot.prediction.spark.Predictors.{getPredictorFromServer, getPredictor}

My build.sbt:

libraryDependencies ++= Seq(
  "net.snowflake" % "snowflake-jdbc" % "3.12.5",
  "net.snowflake" % "spark-snowflake_2.11" % "2.7.1-spark_2.4",
  "com.datarobot" % "scoring-code-spark-api_2.4.3" % "0.0.19",
  "com.datarobot" % "datarobot-prediction" % "2.1.4",
  "com.amazonaws" % "aws-java-sdk-secretsmanager" % "1.11.789",
  "software.amazon.awssdk" % "regions" % "2.13.23"
)
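
Note that Spark itself is not among these dependencies, so the spark-core / spark-catalyst 2.4.3 jars in the Ivy cache must be coming in transitively through one of the declared libraries. A dependency tree makes it easy to see which one; a sketch assuming an sbt version that still needs the sbt-dependency-graph plugin (newer sbt releases ship dependencyTree built in, and the plugin version below is an assumption to adjust):

// project/plugins.sbt -- assumption: older sbt without a built-in dependencyTree task
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

With that in place, running sbt dependencyTree (or whatDependsOn org.apache.spark spark-core_2.11 2.4.3) shows which declared dependency pulls Spark in.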

Thoughts? Please advise.

mpgws1up 1#

You need a merge strategy setting (see the sbt-assembly documentation).
A random example:

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard  // drop manifests, signatures, service files
  case "git.properties"              => MergeStrategy.discard  // drop duplicate build metadata
  case "application.conf"            => MergeStrategy.concat
  case "reference.conf"              => MergeStrategy.concat   // Typesafe config files must be concatenated
  case _                             => MergeStrategy.first    // otherwise keep the first copy encountered
}
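
As a complementary measure (not part of the answer above), if the lib symlink to /usr/lib/spark/jars is kept around for compilation, the cluster jars can also be kept out of the assembly explicitly. A sketch using sbt-assembly's assemblyExcludedJars, assuming the symlinked Spark jars are the only ones that should stay out:

assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  // getCanonicalPath resolves the lib -> /usr/lib/spark/jars symlink, so every
  // cluster-provided jar is excluded and only the app's own dependencies are bundled.
  cp.filter(_.data.getCanonicalPath.startsWith("/usr/lib/spark/jars"))
}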
