library package cannot be used with Oozie

c9qzyr3d posted on 2021-06-03 in Hadoop

Hi, I am running Oozie with a shell script, and inside that shell script I launch a SparkR job. Whenever the Oozie job runs, I get a library error.
Here is the error:

Stdoutput Running /opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/spark/bin/spark-submit --class edu.berkeley.cs.amplab.sparkr.SparkRRunner --files pi.R --master yarn-client /SparkR-pkg/lib/SparkR/sparkr-assembly-0.1.jar pi.R yarn-client 4
Stdoutput Error in library(SparkR) : there is no package called ‘SparkR’
Stdoutput Execution halted
Exit code of the Shell command 1
<<< Invocation of Shell command completed <<<
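
For context on what the error means: library(SparkR) only searches the directories on R's library path, so if the SparkR build under /SparkR-pkg/lib is not on that path on the node that executes the action, R halts exactly as above. A minimal diagnostic sketch, assuming the paths from this question:

# Assumes the SparkR build lives at /SparkR-pkg/lib on the executing node
ls /SparkR-pkg/lib/SparkR              # the package directory R is looking for
Rscript -e '.libPaths()'               # the directories R actually searches
export R_LIBS=/SparkR-pkg/lib:$R_LIBS  # prepend the package location for R
Rscript -e 'library(SparkR)'           # should now load without the error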

My job.properties file:

nameNode=hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020
jobTracker=ip-172-31-41-199.us-west-2.compute.internal:8032
queueName=default
oozie.libpath=hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020/SparkR-pkg/lib/
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true

oozieprojectroot=shell_example
oozie.wf.application.path=${oozieprojectroot}/apps/shell
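
One thing worth verifying is that oozie.libpath points at a directory that actually exists on HDFS, and that the application path resolves where you expect. A quick check from a shell, using the paths above (how ${oozieprojectroot} expands against the HDFS home directory is an assumption):

# Both paths come from this job.properties; adjust if your layout differs
hdfs dfs -ls hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020/SparkR-pkg/lib/
hdfs dfs -ls shell_example/apps/shell    # relative paths resolve against the HDFS home dir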
My workflow.xml file:

<workflow-app xmlns="uri:oozie:workflow:0.1" name="Test">
    <start to="shell-node"/>
    <action name="shell-node">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <exec>script.sh</exec>
            <file>oozie-oozi/script.sh#script.sh</file>
            <file>/user/karun/examples/pi.R</file>
            <capture-output/>
        </shell>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Incorrect output</message>
    </kill>
    <end name="end"/>
</workflow-app>
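
Note that Oozie copies each <file> entry into the container's working directory before script.sh runs, which is why the script can refer to pi.R by its bare name. For completeness, a typical deploy-and-run sequence for a workflow like this (the Oozie server URL and its default port 11000 are assumptions, not taken from the question):

# Upload the workflow definition, then submit the job against job.properties
hdfs dfs -put -f workflow.xml shell_example/apps/shell/
oozie job -oozie http://ip-172-31-41-199.us-west-2.compute.internal:11000/oozie \
    -config job.properties -run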

My shell script file:

export SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/spark
export YARN_CONF_DIR=/etc/hadoop/conf
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
export HADOOP_CMD=/usr/bin/hadoop

/SparkR-pkg/lib/SparkR/sparkR-submit --master yarn-client pi.R yarn-client 4
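
Because a shell action can be scheduled on any NodeManager, the hard-coded /SparkR-pkg/lib path has to exist on whichever node runs the script. A hedged variant of the same script that fails fast and makes the package visible to R (the existence check and the R_LIBS export are additions, not part of the original):

export SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/spark
export YARN_CONF_DIR=/etc/hadoop/conf
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
export HADOOP_CMD=/usr/bin/hadoop

# Fail fast if the SparkR build is missing on this node
if [ ! -d /SparkR-pkg/lib/SparkR ]; then
    echo "SparkR package not found on $(hostname)" >&2
    exit 1
fi

# Make the package visible to the R process that spark-submit launches
export R_LIBS=/SparkR-pkg/lib:${R_LIBS}

/SparkR-pkg/lib/SparkR/sparkR-submit --master yarn-client pi.R yarn-client 4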

I do not know how to resolve this issue. Any help would be appreciated...

No answers yet!
