Hive shell fails after changing hive-site.xml to connect Spark HiveContext to Hive

ddhy6vgd | asked on 2021-06-01 | Hadoop

Below is my hive/conf/hive-site.xml:

<configuration>
   <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://127.0.0.1/metastore?createDatabaseIfNotExist=true</value>
      <description>metadata is stored in a MySQL server</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
      <description>MySQL JDBC driver class</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hiveuser</value>
      <description>user name for connecting to mysql server</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hivepassword</value>
      <description>password for connecting to mysql server</description>
   </property>
</configuration>
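
For reference, these four JDBC properties are what the metastore service itself uses to reach MySQL. A minimal sketch for verifying that connection outside of Hive, using the driver class, URL, and credentials from the XML above (it assumes the MySQL JDBC driver jar is on the classpath):

import java.sql.DriverManager

// Connect to the metastore database with the exact values from hive-site.xml.
Class.forName("com.mysql.jdbc.Driver")
val conn = DriverManager.getConnection(
  "jdbc:mysql://127.0.0.1/metastore?createDatabaseIfNotExist=true",
  "hiveuser",
  "hivepassword")
println(s"Connected to metastore DB: ${!conn.isClosed}")
conn.close()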

I want to access my existing Hive databases and tables through Spark's HiveContext, so I added a hive.metastore.uris property to hive/conf/hive-site.xml; the file now looks like this:

<configuration>
   <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://127.0.0.1/metastore?createDatabaseIfNotExist=true</value>
      <description>metadata is stored in a MySQL server</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
      <description>MySQL JDBC driver class</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hiveuser</value>
      <description>user name for connecting to mysql server</description>
   </property>
   <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hivepassword</value>
      <description>password for connecting to mysql server</description>
   </property>
   <property>
      <name>hive.metastore.uris</name>
      <value>thrift://127.0.0.1:9083</value>
   </property>
</configuration>

After editing hive-site.xml as shown above, the Hive shell no longer works. Please help me correct hive-site.xml, and show me how to access Hive tables from the spark-shell with HiveContext, like this:

val hc = new org.apache.spark.sql.hive.HiveContext(sc)
hc.setConf("hive.metastore.uris", "thrift://127.0.0.1:9083")
val a = hc.sql("show databases")
a.show() // should display all my Hive databases

Please help me with this issue.
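
Since hive.metastore.uris above points at thrift://127.0.0.1:9083, one quick check is whether anything is listening on that port at all. A minimal diagnostic sketch, with the host and port taken from the hive-site.xml above:

import java.net.{InetSocketAddress, Socket}

// Probe the Thrift metastore endpoint; a failure here means the
// metastore service has not been started yet.
val socket = new Socket()
try {
  socket.connect(new InetSocketAddress("127.0.0.1", 9083), 2000) // 2-second timeout
  println("A metastore service is listening on 9083")
} catch {
  case e: java.io.IOException =>
    println(s"Nothing is listening on 9083: ${e.getMessage}")
} finally {
  socket.close()
}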


fae0ux8s #1

@chaithu You need to start your Hive metastore with hive --service metastore, and then create a SparkSession with Hive support enabled, like this:

import org.apache.spark.sql.SparkSession

// Build a SparkSession that talks to the running Hive metastore service.
val spark = SparkSession
  .builder()
  .master("local")
  .appName("HiveExample")
  .config("hive.metastore.uris", "thrift://hadoop-master:9083")
  .enableHiveSupport()
  .getOrCreate()
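
Once the session is created, existing Hive databases and tables can be queried directly through it. A short usage sketch (the database and table names here are placeholders, not from this thread):

// List the databases registered in the Hive metastore.
spark.sql("show databases").show()

// Query an existing Hive table; replace some_db.some_table with a real name.
spark.sql("select * from some_db.some_table limit 10").show()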
