Hadoop Hive: creating an external table pointing to a MySQL table fails with java.lang.NullPointerException

cxfofazt · posted 8 months ago in Hadoop

Hive version: 3.1.0.3.1.4.0-315. Ambari version: HDP-3.1.4.0. Description: I want to create a Hive external table that points to an existing MySQL table on another server. My Hive script is:

sudo beeline -u 'jdbc:hive2://hadoop1.4482.interconnect-hy2:2181,hadoop2.4482.interconnect-hy2:2181,hadoop3.4482.interconnect-hy2:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2' --showHeader=false --silent=true --verbose=false -e"
CREATE EXTERNAL TABLE dpyy_test.dim_order_info(
     id bigint,
     order_id string,
     order_status int,
     external_product_id string,
     product_code string,
     product_type int,
     product_name string,
     open_id string,
     custom_id string,
     charge_phone string,
     order_price int,
     order_count int,
     total_order_price int,
     freight int,
     sync_status int,
     remark string,
     app_id string,
     biz_code string,
     province_code string,
     external_order_id string,
     order_type string,
     transact_channel string,
     pay_type string,
     order_date string,
     cancel_date string,
     expand_receiver string,
     order_effect_time string,
     order_expire_time string,
     create_time timestamp,
     update_time timestamp
) STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler' 
LOCATION 'hdfs://hdp5/dpyy/test/hive/dpyy_test.db/dim_order_info' 
TBLPROPERTIES (
  'hive.sql.databsae.type' = 'MYSQL',
  'hive.sql.jdbc.driver' = 'com.mysql.jdbc.Driver',
  'hive.sql.jdbc.url' = 'jdbc:mysql://172.20.XXX.XXX/dpyy_vas',
  'hive.sql.dbcp.username' = 'dpyy_vas',
  'hive.sql.dbcp.passowrd' = 'XXXXXX',
  'hive.sql.table' = 'order_info',
  'hive.sql.query' = 'select id,order_id,order_status,external_product_id,product_code,product_type,product_name,open_id,custom_id,charge_phone,order_price,order_count,total_order_price,freight,sync_status,remark,app_id,biz_code,province_code,external_order_id,order_type,transact_channel,pay_type,order_date,cancel_date,expand_receiver,order_effect_time,order_expire_time,create_time,update_time from order_info',
  'hive.sql.dbcp.maxActive' = '1'
) 
"

However, the result is:

23/09/15 15:37:21 [main]: INFO jdbc.HiveConnection: Connected to hadoop2.4482.interconnect-hy2:10000
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.4.0-315/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.4.0-315/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
23/09/15 15:37:26 [main]: INFO jdbc.HiveConnection: Connected to hadoop1.4482.interconnect-hy2:10000
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.NullPointerException (state=08S01,code=1)

I found some relevant messages in hiveserver2.log:

2023-09-16T21:49:15,629 INFO  [HiveServer2-Background-Pool: Thread-634534]: plan.CreateTableDesc (:()) - Use StorageHandler-supplied org.apache.hive.storage.jdbc.JdbcSerDe for table dim_dpyy_stbinfo_kafka_offsets
2023-09-16T21:49:15,630 ERROR [HiveServer2-Background-Pool: Thread-634534]: metadata.Table (:()) - Unable to get field from serde: org.apache.hive.storage.jdbc.JdbcSerDe
java.lang.NullPointerException: null
2023-09-16T21:49:15,630 ERROR [HiveServer2-Background-Pool: Thread-634534]: metadata.Table (:()) - Unable to get field from serde: org.apache.hive.storage.jdbc.JdbcSerDe
java.lang.NullPointerException: null
2023-09-16T21:49:15,630 ERROR [HiveServer2-Background-Pool: Thread-634534]: exec.DDLTask (:()) - Failed
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException

I think the key message is:

ERROR [HiveServer2-Background-Pool: Thread-634534]: metadata.Table (:()) - Unable to get field from serde: org.apache.hive.storage.jdbc.JdbcSerDe

I still don't know how to solve this. My DDL was copied from the Hive JDBC storage handler documentation, but it doesn't work. Can anyone give me some advice? Do I need to import any dependency jars?

fdbelqdn1#

I finally found out why the table could not be created in the Hive metastore: in the TBLPROPERTIES of the DDL above, I made two spelling mistakes. One is 'hive.sql.databsae.type', where the correct property name is 'hive.sql.database.type'; the other is 'hive.sql.dbcp.passowrd', where the correct property name is 'hive.sql.dbcp.password'. I wasted a lot of time digging through the Hive JDBC storage handler source code, and it turned out to be my own silly mistake.
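For reference, this is what the TBLPROPERTIES block from the DDL in the question would look like with both property names corrected (all connection values are taken unchanged from the question; the masked host and password stay masked):

```sql
-- Corrected TBLPROPERTIES for the JdbcStorageHandler DDL above.
-- Fixes: 'hive.sql.databsae.type' -> 'hive.sql.database.type'
--        'hive.sql.dbcp.passowrd' -> 'hive.sql.dbcp.password'
TBLPROPERTIES (
  'hive.sql.database.type' = 'MYSQL',
  'hive.sql.jdbc.driver' = 'com.mysql.jdbc.Driver',
  'hive.sql.jdbc.url' = 'jdbc:mysql://172.20.XXX.XXX/dpyy_vas',
  'hive.sql.dbcp.username' = 'dpyy_vas',
  'hive.sql.dbcp.password' = 'XXXXXX',
  'hive.sql.table' = 'order_info',
  'hive.sql.dbcp.maxActive' = '1'
)
```

Because table properties are free-form key/value strings, Hive does not validate the keys at parse time; a misspelled key is silently ignored, and the storage handler later fails with an unhelpful NullPointerException when the expected property is missing.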
