spark-submit: Table or view not found when running the jar

insrf1ej  posted 2021-05-29 in Hadoop

When I run HiveRead.java from the IntelliJ IDE, it runs successfully and I get the result. Then I built the jar file (it is a Maven project) and tried to run the jar directly, and it gave me

ClassLoaderResolver for class "" gave error on creation : {1}

Then I went through a lot of answers and found that I have to add the DataNucleus jars, so I did something like this:

java -jar /home/saurab/sparkProjects/spark_hive/target/myJar-jar-with-dependencies.jar --jars jars/datanucleus-api-jdo-3.2.6.jar,jars/datanucleus-core-3.2.10.jar,jars/datanucleus-rdbms-3.2.9.jar,/home/saurab/hadoopec/hive/lib/mysql-connector-java-5.1.38.jar

Then I got this error:

org.datanucleus.exceptions.NucleusUserException: Persistence process has been specified to use a ClassLoaderResolver of name "datanucleus" yet this has not been found by the DataNucleus plugin mechanism. Please check your CLASSPATH and plugin specification.

Then I found out that I should be using spark-submit, so I did this:

./bin/spark-submit --class HiveRead --master yarn  --jars jars/datanucleus-api-jdo-3.2.6.jar,jars/datanucleus-core-3.2.10.jar,jars/datanucleus-rdbms-3.2.9.jar,/home/saurab/hadoopec/hive/lib/mysql-connector-java-5.1.38.jar --files /home/saurab/hadoopec/spark/conf/hive-site.xml /home/saurab/sparkProjects/spark_hive/target/myJar-jar-with-dependencies.jar

Now I get a new kind of error:

Table or view not found: `bigmart`.`o_sales`;

Help me!! :)
I have copied my hive-site.xml to /spark/conf and started the Hive metastore service ( hiveserver2 --service metastore ).
Here is the HiveRead.java code, in case anyone is interested.
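(The original code was not included in the post. A minimal sketch of such a class, assuming a plain Spark SQL read with Hive support against the bigmart.o_sales table mentioned in the error, might look like this; the names are illustrative, not the asker's actual code.)

// HiveRead.java - minimal sketch, not the original code
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HiveRead {
    public static void main(String[] args) {
        // enableHiveSupport() is what makes Spark use the Hive metastore catalog;
        // without it (or without a reachable hive-site.xml) the table lookup fails.
        SparkSession spark = SparkSession.builder()
                .appName("HiveRead")
                .enableHiveSupport()
                .getOrCreate();

        // Query the table from the error message (assumed to exist in the metastore)
        Dataset<Row> df = spark.sql("SELECT * FROM bigmart.o_sales");
        df.show();

        spark.stop();
    }
}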


mutmk8jj1#

The Spark session is not able to read the Hive catalog.
Provide the hive-site.xml file path with the spark-submit command, as shown below.
For Hortonworks, the file path is /usr/hdp/current/spark2-client/conf/hive-site.xml.
Pass it in the spark-submit command as --files /usr/hdp/current/spark2-client/conf/hive-site.xml.
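Putting that together with the class and jar from the question, the full command might look like this (the Hortonworks path is the answer's example; point --files at wherever your hive-site.xml actually lives):

./bin/spark-submit --class HiveRead --master yarn --files /usr/hdp/current/spark2-client/conf/hive-site.xml /home/saurab/sparkProjects/spark_hive/target/myJar-jar-with-dependencies.jar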
