Exception: cannot be a directory

vkc1a9a2 posted on 2021-06-03 in Hadoop

Hi, I am working on a word-count exercise with Spark in Java. When I run it against input in HDFS, I get the following exception:

Exception in thread "main" java.lang.IllegalArgumentException: /home/karun cannot be a directory.
at org.apache.spark.HttpFileServer.addFileToDir(HttpFileServer.scala:70)
at org.apache.spark.HttpFileServer.addJar(HttpFileServer.scala:60)
at org.apache.spark.SparkContext.addJar(SparkContext.scala:1162)
at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:276)
at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:276)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:276)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at WordCount.WordCount.WordCountSpark.main(WordCountSpark.java:46)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
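
Reading the trace as a hedged guess: in Spark 1.x, SparkContext startup copies every --jars entry into a local temp directory via HttpFileServer.addJar, and addFileToDir (the frame at the top of the trace) raises exactly this IllegalArgumentException when the entry is a directory instead of a file. So one of the jar paths handed to spark-submit appears to resolve to /home/karun. A quick check on the machine running the driver, assuming the command below is used as-is:

# /home/karun is the path the trace shows being rejected.
[ -d /home/karun ] && echo "/home/karun is a directory, not a jar"
# $JARFILE comes from the command below; if it is unset, empty, or points
# at the home directory, SparkContext.addJar fails exactly like this.
echo "JARFILE expands to: '$JARFILE'"
[ -f "$JARFILE" ] || echo "JARFILE is not a regular jar file"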

Here is the full spark-submit command:

spark-submit --jars $(echo "$JARFILE,./lib_bd/lucene-core-3.6.0.jar,./lib_bd/hive-jdbc-0.13.1-cdh5.2.0.jar,./lib_bd/hive-metastore-0.13.1-cdh5.2.0.jar,./lib_bd/hive-common-0.13.1-cdh5.2.0.jar,./lib_bd/hive-contrib-0.13.1-cdh5.2.0.jar,./lib_bd/hive-cli-0.13.1-cdh5.2.0.jar,./lib_bd/hive-exec-0.13.1-cdh5.2.0.jar,./lib_bd/hive-service-0.13.1-cdh5.2.0.jar,./lib_bd/opennlp-tools-1.5.3.jar,./lib_bd/opennlp-maxent-3.0.3.jar,./lib_bd/mahout-core-0.9-cdh5.2.0-job.jar,./lib_bd/mahout-core-0.9-cdh5.2.0.jar,./lib_bd/mahout-examples-0.9-cdh5.2.0-job.jar,./lib_bd/mahout-examples-0.9-cdh5.2.0.jar,./lib_bd/mahout-integration-0.9-cdh5.2.0.jar,./lib_bd/mahout-math-0.9-cdh5.2.0.jar") --class WordCount.WordCount.WordCountSpark --master local[4] WordCount-0.0.1-SNAPSHOT.jar 4
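
If that diagnosis holds, the fix is to make $JARFILE name an actual .jar file rather than a directory. A minimal sketch of a corrected invocation (the dependency list is truncated here, and the JARFILE path is hypothetical):

# JARFILE must point at a real jar file, not a directory like /home/karun.
JARFILE=./lib_bd/some-dependency.jar   # hypothetical; substitute the real path

spark-submit \
  --jars "$JARFILE,./lib_bd/lucene-core-3.6.0.jar,./lib_bd/hive-jdbc-0.13.1-cdh5.2.0.jar" \
  --class WordCount.WordCount.WordCountSpark \
  --master "local[4]" \
  WordCount-0.0.1-SNAPSHOT.jar 4

As a side note, the $(echo "...") wrapper in the original command is a no-op; the comma-separated jar list can be passed to --jars directly.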

No answers yet.

