Unable to load data into an HBase table from Hive

mzaanser, posted 2021-05-29 in Hadoop

I am using Hadoop 2.7.0, Hive 1.1.0, and HBase 0.98.14-hadoop2.
I have successfully created an HBase table from Hive:

hive (Koushik)> CREATE TABLE hive_hbase_emp_test(eid int, ename string, esal double) 
              > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
              > WITH SERDEPROPERTIES 
              > ("hbase.columns.mapping" = ":key,cfstr:enm,cfsal:esl")
              > TBLPROPERTIES ("hbase.table.name" = "hive_hbase_emp_test");
OK
Time taken: 0.874 seconds

hbase(main):004:0> describe 'hive_hbase_emp_test'
Table hive_hbase_emp_test is ENABLED                                                                                                            
hive_hbase_emp_test                                                                                                                             
COLUMN FAMILIES DESCRIPTION                                                                                                                     
{NAME => 'cfsal', DATA_BLOCK_ENCODING => 'NONE', BLOOMFILTER => 'ROW', REPLICATION_SCOPE => '0', VERSIONS => '1', COMPRESSION => 'NONE', MIN_VER
SIONS => '0', TTL => 'FOREVER', KEEP_DELETED_CELLS => 'FALSE', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}                
{NAME => 'cfstr', DATA_BLOCK_ENCODING => 'NONE', BLOOMFILTER => 'ROW', REPLICATION_SCOPE => '0', VERSIONS => '1', COMPRESSION => 'NONE', MIN_VER
SIONS => '0', TTL => 'FOREVER', KEEP_DELETED_CELLS => 'FALSE', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}                
2 row(s) in 3.0650 seconds
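
The failing INSERT below reads from an ordinary Hive table named hive_employee, whose definition is not shown in the question. Purely for context, here is a minimal sketch of how such a source table might be created and populated; the column types are inferred from the target table, and the CSV path is hypothetical.

# minimal sketch only: the database "Koushik" comes from the prompt above,
# the schema and the data file path are assumptions
hive -e "
USE Koushik;
CREATE TABLE IF NOT EXISTS hive_employee (empid INT, empname STRING, empsal DOUBLE)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';
LOAD DATA LOCAL INPATH '/home/hduser/employees.csv' INTO TABLE hive_employee;
"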

But when I try to load data into the table from Hive, it fails:

hive (Koushik)> INSERT OVERWRITE TABLE hive_hbase_emp_test SELECT empid,empname,empsal FROM hive_employee;
Query ID = hduser_20150921110000_249675d5-9da7-49fe-b03e-3a2d813ac898
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1442836788507_0011, Tracking URL = http://localhost:8088/proxy/application_1442836788507_0011/
Kill Command = /usr/local/hadoop/bin/hadoop job  -kill job_1442836788507_0011
Hadoop job information for Stage-0: number of mappers: 1; number of reducers: 0
2015-09-21 11:01:39,041 Stage-0 map = 0%,  reduce = 0%
2015-09-21 11:02:39,429 Stage-0 map = 0%,  reduce = 0%
2015-09-21 11:02:45,814 Stage-0 map = 100%,  reduce = 0%
Ended Job = job_1442836788507_0011 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1442836788507_0011_m_000000 (and more) from job job_1442836788507_0011

Task with the most failures(4): 
-----
Task ID:
  task_1442836788507_0011_m_000000

URL:
  http://0.0.0.0:8088/taskdetails.jsp?jobid=job_1442836788507_0011&tipid=task_1442836788507_0011_m_000000
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: Error in configuring object
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:449)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
	... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
	at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
	... 14 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
	... 17 more
Caused by: java.lang.RuntimeException: Map operator initialization failed
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:147)
	... 22 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hive.serde2.lazy.LazyUtils.getByte(Ljava/lang/String;B)B
	at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.collectSeparators(LazySerDeParameters.java:223)
	at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.<init>(LazySerDeParameters.java:90)
	at org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:95)
	at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.initializeOp(FileSinkOperator.java:344)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:469)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:425)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:65)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:469)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:425)
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:193)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
	at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:427)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:126)
	... 22 more

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched: 
Stage-Stage-0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

The contents of Hive's auxlib folder are as follows:

hduser@ubuntu:/usr/lib/hive/auxlib$ ls
activation-1.1.jar
aopalliance-1.0.jar
apacheds-i18n-2.0.0-M15.jar
apacheds-kerberos-codec-2.0.0-M15.jar
api-asn1-api-1.0.0-M20.jar
api-util-1.0.0-M20.jar
asm-3.1.jar
avro-1.7.4.jar
aws-java-sdk-1.7.4.jar
azure-storage-2.0.0.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.8.0.jar
commons-cli-1.2.jar
commons-codec-1.7.jar
commons-collections-3.2.1.jar
commons-compress-1.4.1.jar
commons-configuration-1.6.jar
commons-daemon-1.0.13.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.1.jar
commons-io-2.4.jar
commons-lang-2.6.jar
commons-lang3-3.3.2.jar
commons-logging-1.1.1.jar
commons-math-2.1.jar
commons-math3-3.1.1.jar
commons-net-3.1.jar
curator-client-2.7.1.jar
curator-framework-2.7.1.jar
curator-recipes-2.7.1.jar
findbugs-annotations-1.3.9-1.jar
gmbal-api-only-3.0.0-b023.jar
grizzly-framework-2.1.2.jar
grizzly-http-2.1.2.jar
grizzly-http-server-2.1.2.jar
grizzly-http-servlet-2.1.2.jar
grizzly-rcm-2.1.2.jar
gson-2.2.4.jar
guava-12.0.1.jar
guice-3.0.jar
guice-servlet-3.0.jar
hadoop-annotations-2.7.0.jar
hadoop-ant-2.7.0.jar
hadoop-archives-2.7.0.jar
hadoop-auth-2.7.0.jar
hadoop-aws-2.7.0.jar
hadoop-azure-2.7.0.jar
hadoop-client-2.2.0.jar
hadoop-common-2.2.0.jar
hadoop-datajoin-2.7.0.jar
hadoop-distcp-2.7.0.jar
hadoop-extras-2.7.0.jar
hadoop-gridmix-2.7.0.jar
hadoop-hdfs-2.7.0.jar
hadoop-hdfs-2.7.0-tests.jar
hadoop-hdfs-nfs-2.7.0.jar
hadoop-mapreduce-client-app-2.7.0.jar
hadoop-mapreduce-client-common-2.7.0.jar
hadoop-mapreduce-client-core-2.7.0.jar
hadoop-mapreduce-client-hs-2.7.0.jar
hadoop-mapreduce-client-hs-plugins-2.7.0.jar
hadoop-mapreduce-client-jobclient-2.7.0.jar
hadoop-mapreduce-client-jobclient-2.7.0-tests.jar
hadoop-mapreduce-client-shuffle-2.7.0.jar
hadoop-mapreduce-examples-2.7.0.jar
hadoop-openstack-2.7.0.jar
hadoop-rumen-2.7.0.jar
hadoop-sls-2.7.0.jar
hadoop-streaming-2.7.0.jar
hadoop-yarn-api-2.7.0.jar
hadoop-yarn-applications-distributedshell-2.7.0.jar
hadoop-yarn-applications-unmanaged-am-launcher-2.7.0.jar
hadoop-yarn-client-2.7.0.jar
hadoop-yarn-common-2.7.0.jar
hadoop-yarn-registry-2.7.0.jar
hadoop-yarn-server-applicationhistoryservice-2.7.0.jar
hadoop-yarn-server-common-2.7.0.jar
hadoop-yarn-server-nodemanager-2.7.0.jar
hadoop-yarn-server-resourcemanager-2.7.0.jar
hadoop-yarn-server-sharedcachemanager-2.7.0.jar
hadoop-yarn-server-tests-2.7.0.jar
hadoop-yarn-server-web-proxy-2.7.0.jar
hamcrest-core-1.3.jar
hbase-annotations-0.98.14-hadoop2.jar
hbase-checkstyle-0.98.14-hadoop2.jar
hbase-client-0.98.14-hadoop2.jar
hbase-common-0.98.14-hadoop2.jar
hbase-common-0.98.14-hadoop2-tests.jar
hbase-examples-0.98.14-hadoop2.jar
hbase-hadoop2-compat-0.98.14-hadoop2.jar
hbase-hadoop-compat-0.98.14-hadoop2.jar
hbase-it-0.98.14-hadoop2.jar
hbase-it-0.98.14-hadoop2-tests.jar
hbase-prefix-tree-0.98.14-hadoop2.jar
hbase-protocol-0.98.14-hadoop2.jar
hbase-resource-bundle-0.98.14-hadoop2.jar
hbase-rest-0.98.14-hadoop2.jar
hbase-server-0.98.14-hadoop2.jar
hbase-server-0.98.14-hadoop2-tests.jar
hbase-shell-0.98.14-hadoop2.jar
hbase-testing-util-0.98.14-hadoop2.jar
hbase-thrift-0.98.14-hadoop2.jar
high-scale-lib-1.1.1.jar
hive-hbase-handler-1.2.1.jar
hive-serde-1.2.1.jar
htrace-core-2.04.jar
htrace-core-3.1.0-incubating.jar
httpclient-4.1.3.jar
httpclient-4.2.5.jar
httpcore-4.1.3.jar
httpcore-4.2.5.jar
jackson-annotations-2.2.3.jar
jackson-core-2.2.3.jar
jackson-core-asl-1.8.8.jar
jackson-core-asl-1.9.13.jar
jackson-databind-2.2.3.jar
jackson-jaxrs-1.8.8.jar
jackson-jaxrs-1.9.13.jar
jackson-mapper-asl-1.8.8.jar
jackson-mapper-asl-1.9.13.jar
jackson-xc-1.9.13.jar
jamon-runtime-2.3.1.jar
jasper-compiler-5.5.23.jar
jasper-runtime-5.5.23.jar
javax.inject-1.jar
java-xmlbuilder-0.4.jar
javax.servlet-3.1.jar
javax.servlet-api-3.0.1.jar
jaxb-api-2.2.2.jar
jaxb-impl-2.2.3-1.jar
jcodings-1.0.8.jar
jersey-client-1.8.jar
jersey-core-1.8.jar
jersey-core-1.9.jar
jersey-grizzly2-1.9.jar
jersey-guice-1.9.jar
jersey-json-1.9.jar
jersey-server-1.9.jar
jersey-test-framework-core-1.9.jar
jersey-test-framework-grizzly2-1.9.jar
jets3t-0.9.0.jar
jettison-1.1.jar
jettison-1.3.1.jar
jetty-6.1.26.jar
jetty-sslengine-6.1.26.jar
jetty-util-6.1.26.jar
joda-time-2.7.jar
joni-2.1.2.jar
jruby-complete-1.6.8.jar
jsch-0.1.42.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
jsp-api-2.1.jar
jsr305-3.0.0.jar
junit-4.11.jar
leveldbjni-all-1.8.jar
libthrift-0.9.0.jar
log4j-1.2.17.jar
management-api-3.0.0-b012.jar
metrics-core-3.0.1.jar
mockito-all-1.8.5.jar
netty-3.6.6.Final.jar
paranamer-2.3.jar
protobuf-java-2.5.0.jar
servlet-api-2.5-6.1.14.jar
servlet-api-2.5.jar
slf4j-api-1.6.4.jar
slf4j-log4j12-1.6.4.jar
snappy-java-1.0.4.1.jar
stax-api-1.0-2.jar
xmlenc-0.52.jar
xz-1.0.jar
zookeeper-3.4.6.jar

What am I missing here?

s4chpxco (answer 1)

It looks like a version-compatibility issue. The method org.apache.hadoop.hive.serde2.lazy.LazyUtils.getByte(String, byte) was only added to that class in a commit that shipped with Hive 1.2; see the commit history of LazyUtils in the Hive source.
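
One way to confirm which serde jar actually contains that method is to inspect the class with javap. A quick sketch, assuming the auxlib path from the listing above and that Hive 1.1.0 keeps its own copy under /usr/lib/hive/lib (both paths are assumptions):

# the 1.2.1 jar from auxlib should list the method ...
javap -classpath /usr/lib/hive/auxlib/hive-serde-1.2.1.jar \
    org.apache.hadoop.hive.serde2.lazy.LazyUtils | grep getByte
# ... while the serde jar shipped with Hive 1.1.0 should not (grep prints nothing)
javap -classpath /usr/lib/hive/lib/hive-serde-1.1.0.jar \
    org.apache.hadoop.hive.serde2.lazy.LazyUtils | grep getByte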

nszi6y05 (answer 2)

NoSuchMethodError means the JVM was able to find the class but not the method on it, so the class loaded at runtime is probably from a different version than your Hive release. You can start the Hive CLI in debug mode (bin/hive --hiveconf hive.root.logger=DEBUG,console); it will print all the jars it picks up, so you can find the jar versions in the log.
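
Following that suggestion, one way to filter the debug output for the serde jar could look roughly like this; the exact log format differs between versions, so treat it as an illustration only:

# run one harmless statement in debug mode and search the output for the serde jar that gets loaded
bin/hive --hiveconf hive.root.logger=DEBUG,console -e "SHOW TABLES;" 2>&1 | grep -i 'hive-serde'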

mzmfm0qo (answer 3)

Actually, the mistake was mine: I had left hive-hbase-handler-1.2.1.jar and hive-serde-1.2.1.jar in the auxlib path, and that is what caused the problem. After removing the 1.2.1 versions of those jars and using hive-hbase-handler-1.1.0.jar and hive-serde-1.1.0.jar instead, it works fine. So the issue is resolved with the Hive 1.1.0 jars only (together with HBase 0.98.14 and Hadoop 2.7.0).
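
A rough sketch of the cleanup described in that answer, assuming the auxlib path from the listing above and that the matching 1.1.0 jars are available under Hive's own lib directory:

cd /usr/lib/hive/auxlib
# remove the conflicting Hive 1.2.1 jars
rm hive-hbase-handler-1.2.1.jar hive-serde-1.2.1.jar
# copy in the jars that match the installed Hive 1.1.0 release (source path assumed)
cp /usr/lib/hive/lib/hive-hbase-handler-1.1.0.jar /usr/lib/hive/lib/hive-serde-1.1.0.jar .
# restart the Hive CLI and re-run the INSERT OVERWRITE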
