hive -e throws NoSuchMethodError when invoked via <shell> from an Oozie sub-workflow, but runs fine when called from the main workflow

q35jwt9p · posted 2021-06-24 in Hive

I am refactoring an Oozie workflow that had everything written in a single file, trying to break it up into sub-workflows. After the refactor, it started throwing:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.io.retry.RetryPolicies.retryForeverWithFixedSleep(JLjava/util/concurrent/TimeUnit;)Lorg/apache/hadoop/io/retry/RetryPolicy;
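
For context, a NoSuchMethodError on org.apache.hadoop.io.retry.RetryPolicies usually points to the shell action resolving a different (older) hadoop-common jar at runtime than the one the calling client was built against, rather than to the workflow XML itself. A minimal diagnostic sketch, assuming it is temporarily added at the top of the action script, to compare what the main-workflow and sub-workflow launches actually see:

# Temporary diagnostics (hypothetical; remove after comparing the two launch paths)
echo "hadoop client: $(hadoop version | head -n 1)"
# --glob expands classpath wildcards (Hadoop 2.6+); drop it on older clients
hadoop classpath --glob | tr ':' '\n' | grep -i 'hadoop-common'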

Original workflow:

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="Main">
  <start to="loadToHive"/>
  <action name="loadToHive">
    <shell xmlns="uri:oozie:shell-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>yarn.nodemanager.container-executor.class</name>
          <value>LinuxContainerExecutor</value>
        </property>
        <property>
          <name>yarn.nodemanager.linux-container-executor.nonsecure-mode.limit-user</name>
          <value>true</value>
        </property>
      </configuration>
      <exec>${loadToHiveActionScript}</exec>
      <argument>${outPutPath}</argument>
      <argument>${dataSetPath}</argument>
      <argument>${hiveDB}</argument>
      <env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
      <file>${loadToHiveActionScriptPath}#${loadToHiveActionScript}</file>
    </shell>
    <ok to="uplaodToMysql"/>
    <error to="handleFailure"/>
  </action>

Refactored file:

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="Main">
  <start to="loadToHive"/>
  <action name="loadToHive">
    <sub-workflow>
      <app-path>${oozieProjectRoot}/commonWorkflows/mongoTransform.xml</app-path>
      <propagate-configuration/>
    </sub-workflow>
    <ok to="uplaodToMysql"/>
    <error to="handleFailure"/>
  </action>
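
Note that <propagate-configuration/> passes the parent workflow's configuration down to the sub-workflow, which is why parameters such as ${jobTracker}, ${nameNode} and the script paths still resolve there. If the resolved settings of the two runs need to be compared, the Oozie CLI can dump a job's effective configuration; a hedged example (hypothetical Oozie URL and job id):

# Dump the effective configuration of the launched sub-workflow job for comparison
oozie job -oozie http://oozie-host:11000/oozie -configcontent <sub-workflow-job-id>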

Sub-workflow file:

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app name="mongoTransform-${module}" xmlns="uri:oozie:workflow:0.5">
  <start to="loadToHive"/>
  <action name="loadToHive">
    <shell xmlns="uri:oozie:shell-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>yarn.nodemanager.container-executor.class</name>
          <value>LinuxContainerExecutor</value>
        </property>
        <property>
          <name>yarn.nodemanager.linux-container-executor.nonsecure-mode.limit-user</name>
          <value>true</value>
        </property>
      </configuration>
      <exec>${loadToHiveActionScript}</exec>
      <argument>${outPutPath}</argument>
      <argument>${dataSetPath}</argument>
      <argument>${hiveDB}</argument>
      <env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
      <file>${loadToHiveActionScriptPath}#${loadToHiveActionScript}</file>
    </shell>
    <ok to="end"/>
    <error to="handleFailure"/>
  </action>

loadToHiveActionScript.sh

hive -e "Drop table if exists ${3}.${i}_intermediate";
...
hive -e " Alter table ${3}.${i}_intermediate RENAME TO ${3}.$i";

When the action runs from the main workflow file it executes perfectly fine. Could this be an issue with the env-var HADOOP_USER_NAME=${wf:user()}?
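
One way to test that hypothesis is to log the user context the shell container actually runs under in both launch paths; a minimal sketch, assuming it is temporarily prepended to loadToHiveActionScript.sh:

# Temporary debug lines: print the effective user the shell action runs as
echo "whoami: $(whoami)"
echo "HADOOP_USER_NAME: ${HADOOP_USER_NAME:-<not set>}"
id    # uid/gid/groups of the container user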

No answers yet.
