Exception in thread "delete Spark local dirs" java.lang.NullPointerException

gopyfrb3 · posted 2021-06-03 in Hadoop

Hi, I am running a SparkR program through a shell script. When I point the input file at the local filesystem it works fine, but when I point it at HDFS it throws the error below.

Exception in thread "delete Spark local dirs" java.lang.NullPointerException
at org.apache.spark.storage.DiskBlockManager.org$apache$spark$storage$DiskBlockManager$$doStop(DiskBlockManager.scala:161)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply$mcV$sp(DiskBlockManager.scala:141)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1617)
at org.apache.spark.storage.DiskBlockManager$$anon$1.run(DiskBlockManager.scala:139)

Any help would be greatly appreciated.

b1payxdu1#

I ran into the same problem in a Scala script. The issue was the master URL, so I removed the call that sets it.
Before:

val conf = new org.apache.spark.SparkConf().setMaster(masterURL).set("spark.ui.port",port).setAppName("TestScalaApp")

Fixed code:

val conf = new org.apache.spark.SparkConf().setAppName("TestScalaApp")
