spark-submit command is returning Missing application resource

siotufzp · posted 2021-05-27 in Spark

To start off, I created a jar file following the steps in "How to build jars from IntelliJ properly?".
My jar file's path is

out/artifacts/sparkProgram_jar/sparkProgram.jar

In short, my Spark program reads a table from MongoDB, transforms it using Spark's MLlib, and writes the result to MySQL. This is my build.sbt file.

name := "sparkProgram"

version := "0.1"

scalaVersion := "2.12.4"
val sparkVersion = "3.0.0"
val postgresVersion = "42.2.2"

resolvers ++= Seq(
  "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven",
  "Typesafe Simple Repository" at "https://repo.typesafe.com/typesafe/simple/maven-releases",
  "MavenRepository" at "https://mvnrepository.com"
)

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  // logging
  "org.apache.logging.log4j" % "log4j-api" % "2.4.1",
  "org.apache.logging.log4j" % "log4j-core" % "2.4.1",
  "org.mongodb.spark" %% "mongo-spark-connector" % "2.4.1",

  //"mysql" % "mysql-connector-java" % "5.1.12",
  "mysql" % "mysql-connector-java" % "8.0.18"
)

My main class is in a package named com.testing, inside a Scala object called

mainObject
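For reference, a minimal sketch of what that object might look like; the MongoDB URI, MySQL endpoint, table name, and credentials below are placeholders, and the MLlib step is elided:

package com.testing

import com.mongodb.spark.MongoSpark
import org.apache.spark.sql.SparkSession

object mainObject {
  def main(args: Array[String]): Unit = {
    // Placeholder connection string; substitute your own database/collection.
    val spark = SparkSession.builder()
      .appName("sparkProgram")
      .config("spark.mongodb.input.uri", "mongodb://localhost:27017/mydb.mycollection")
      .getOrCreate()

    // Load the MongoDB collection as a DataFrame via the connector.
    val df = MongoSpark.load(spark)

    // ... MLlib transformations would go here ...

    // Write the (transformed) DataFrame to MySQL over JDBC.
    df.write
      .format("jdbc")
      .option("driver", "com.mysql.cj.jdbc.Driver")
      .option("url", "jdbc:mysql://localhost:3306/mydb")
      .option("dbtable", "my_table")
      .option("user", "root")
      .option("password", "secret")
      .mode("append")
      .save()

    spark.stop()
  }
}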

When I run the following spark-submit command:

spark-submit --master local --class com.testing.mainObject
--packages mysql:mysql-connector-java:8.0.18,org.mongodb.spark:mongo-spark-connector_2.12:2.4.1 out/artifacts/sparkProgram_jar/sparkProgram.jar

I receive this error:

Error: Missing application resource.

Usage: spark-submit [options] <app jar | python file | R file> [app arguments]
Usage: spark-submit --kill [submission ID] --master [spark://...]
Usage: spark-submit --status [submission ID] --master [spark://...]
Usage: spark-submit run-example [options] example-class [example args]

Options:

... zsh: command not found: --packages

Then, when I tried running spark-submit without --packages (just to see what would happen), I got this error.
The command:

spark-submit --master local --class com.testing.mainObject out/artifacts/sparkProgram_jar/sparkProgram.jar

Error: Failed to load class com.testing.mainObject
I have used spark-submit before and it worked (a few months ago). I don't know why this is still giving me an error. My MANIFEST.MF is as follows:

Manifest-Version: 1.0
Main-Class: com.testing.mainObject

628mspwn 1#

My answer so far was to first build the jar file a different way: File -> Project Structure -> Project Settings -> Artifacts -> Jar, but instead of extracting the dependencies into the jar, I clicked

Copy to Output and link to manifest
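With that option, IntelliJ copies the dependency jars next to sparkProgram.jar and references them from the manifest rather than unpacking them. The resulting MANIFEST.MF then looks roughly like this (the jar names below are illustrative, not taken from the actual artifact):

Manifest-Version: 1.0
Class-Path: spark-core_2.12-3.0.0.jar mongo-spark-connector_2.12-2.4.1.jar mysql-connector-java-8.0.18.jar
Main-Class: com.testing.mainObject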

From there, I ran a spark-submit command that did not have the --packages portion. It was

spark-submit --class com.testing.mainObject --master local out/artifacts/sparkProgram_jar/sparkProgram.jar

Also watch the spacing when copying and pasting into the terminal; stray whitespace will give you strange errors. In particular, a line break without a trailing backslash ends the command, so everything after it runs as a separate command. That is exactly what the earlier output meant by "zsh: command not found: --packages".
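If the command is split across lines, every line except the last needs a trailing backslash so the shell reads it as one command, for example:

spark-submit --master local --class com.testing.mainObject \
  --packages mysql:mysql-connector-java:8.0.18,org.mongodb.spark:mongo-spark-connector_2.12:2.4.1 \
  out/artifacts/sparkProgram_jar/sparkProgram.jar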
From there I had another error, which is shown here: https://github.com/intel-bigdata/hibench/issues/466. The fix is in the comments:

"This seems to happen with hadoop 3. I solved it removing a hadoop-hdfs-2.4.0.jar that was in the classpath."
