Spark job fails in Oozie when Hive support is enabled

af7jpaap · posted 2021-05-27 · in Spark

I am trying to schedule an Oozie workflow that uses a Spark action with Hive support enabled. As a plain Spark job without Hive support, the action runs fine. After adding Hive support, I can still run the Spark job via spark-submit, but when I run it through Oozie I get:

Unable to instantiate SparkSession with Hive support because Hive classes are not found.

Here is the code that creates the Spark session:

static SparkSession initializeSparkSession() {
    // Build a session with Hive catalog support; this is the call that fails under Oozie
    SparkSession sparkSession = SparkSession.builder().appName("DataLoad").enableHiveSupport().getOrCreate();

    // Overwrite only the partitions touched by a write, using dynamic partitioning
    sparkSession.sparkContext().conf().set("spark.sql.sources.partitionOverwriteMode", "dynamic");
    sparkSession.sparkContext().conf().set("hive.exec.dynamic.partition.mode", "nonstrict");

    return sparkSession;
}
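For background on the error message itself: Spark's builder rejects `enableHiveSupport()` when the Hive integration classes cannot be loaded on the driver's classpath. A minimal sketch of that kind of probe is below; the two class names are what Spark 2.x checks for, but treat them as assumptions for your exact build.

```java
public class HiveClasspathProbe {

    /** Returns true if the named class can be loaded from the current classpath. */
    static boolean isClassPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // SparkSession.builder() performs a check along these lines before honoring
        // enableHiveSupport(); if a probe fails it raises
        // "Unable to instantiate SparkSession with Hive support because Hive classes are not found."
        System.out.println("HiveConf present:   "
                + isClassPresent("org.apache.hadoop.hive.conf.HiveConf"));
        System.out.println("spark-hive present: "
                + isClassPresent("org.apache.spark.sql.hive.HiveSessionStateBuilder"));
    }
}
```

Running the same probe from inside the Oozie launcher (for example at the top of the job's `main`) shows whether the sharelib actually placed the spark-hive jars on the container classpath, since the `provided`-scope dependencies above are not packaged into the application jar.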

Here are the dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.1</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.1</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.2.1</version>
    <scope>provided</scope>
</dependency>

Here is the Oozie workflow action:

<action name="data_load">
    <spark xmlns="uri:oozie:spark-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <master>yarn</master>
        <mode>cluster</mode>
        <name>DataMovement</name>
        <class>{package}.Job</class>
        <jar>${sparkJarPath}/s3_etl-0.0.1.jar</jar>
        <spark-opts>--files=/etc/spark/conf/hive-site.xml --conf spark.yarn.dist.files=file:/etc/spark/conf/hive-site.xml</spark-opts>
        <arg>${market}</arg>
        <arg>${market_lag}</arg>
        <arg>${data_bucket}</arg>
        <arg>${trigger_bucket}</arg>
        <arg>ALL</arg>
    </spark>
    <ok to="notifyJobSuccess" />
    <error to="notifyJobFailure" />
</action>

Do I need to add anything to, or remove anything from, the sharelib directory?
If I do not add hive to the Spark action's sharelib in the global properties, I get the error above. If I do add hive in the global properties:

<global>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <configuration>
        <property>
            <name>mapred.job.queue.name</name>
            <value>${queueName}</value>
        </property>
        <property>
            <name>oozie.action.sharelib.for.spark</name>
            <value>spark,oozie,hive</value>
        </property>
    </configuration>
</global>
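For what it's worth, the same sharelib selection can also be set in the workflow's job.properties instead of the `<global>` block. A sketch, assuming the standard Oozie property names (`oozie.use.system.libpath` is the stock property for enabling the system sharelib):

```properties
# job.properties (sketch) — equivalent to the sharelib property in the <global> block
oozie.use.system.libpath=true
oozie.action.sharelib.for.spark=spark,hive
```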

it throws a different exception instead:

ERROR ApplicationMaster: User class threw exception: java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT
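For context, `NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT` is a typical symptom of a Hive version clash: Spark 2.2's spark-hive is built against Hive 1.2.x, where `HiveConf` still defines that field, while the jars shipped by the hive sharelib may carry a newer Hive that removed it. One way to inspect which jars each sharelib contributes is the Oozie admin CLI (a sketch; the Oozie URL is a placeholder):

```shell
# List the jars Oozie will ship for each sharelib (URL is a placeholder)
oozie admin -oozie http://oozie-host:11000/oozie -shareliblist spark
oozie admin -oozie http://oozie-host:11000/oozie -shareliblist hive
```

If the hive sharelib turns out to contain a conflicting Hive version, dropping `hive` from `oozie.action.sharelib.for.spark` and relying on the spark sharelib's own Hive jars is one avenue to try.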
