HiveException: Unexpected exception: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable

ut6juiuv · posted 2021-05-29 · in Hadoop

I keep running into cast errors in my Hive queries. A sample error log is given below. The error reports the following details:
The complete row data

HiveException: Hive Runtime Error while processing row    {"company_id":"3M","oper_carrier":"3M","flt_num":"4062","equip_id":"0331","equip_type":"SF3","act_fleet_type_cd":"U","shares_ship_id":"0331","report_dt":"2013-12-12","origin":"CKB","destination":"IAD"}

- The cast exception

Caused by: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable

I don't know how to debug this kind of exception: it reports the complete row rather than pointing at a specific column value. I have already checked the values and data types in both tables.
Please help me identify the problem and resolve it.
The query that was run:

SELECT
  vss.company_id, vss.shares_ship_id, vss.seatmap_cd, vss.cabin,
  vss.seat, vss.seat_loc_dscr, vss.ep_seat AS EPlus_Seat,
  vss.ep_win_seat, vss.ep_asle_seat
FROM rvsed11 zz
LEFT OUTER JOIN rvsed22 vss
 ON zz.company_id = vss.company_id
 AND zz.shares_ship_id = vss.shares_ship_id
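
Since the stack trace shows the failure during serialization of the joined rows (LazySimpleSerDe via FileSinkOperator, reached from MapJoinOperator), one hedged line of attack is to take the map-join path out of play and make the output serialization type-safe with explicit casts. This is only a diagnostic sketch, not a confirmed fix; the table and column names come from the query above, and `hive.auto.convert.join` is a standard Hive setting:

```sql
-- 1) Rule out the map-join path by forcing a common (shuffle) join.
SET hive.auto.convert.join=false;

-- 2) Cast the int columns explicitly before serialization. If this
--    version succeeds while the original fails, the metastore column
--    types and the data actually flowing through the join disagree.
SELECT
  vss.company_id, vss.shares_ship_id, vss.seatmap_cd, vss.cabin,
  vss.seat, vss.seat_loc_dscr,
  CAST(vss.ep_seat      AS STRING) AS EPlus_Seat,
  CAST(vss.ep_win_seat  AS STRING) AS ep_win_seat,
  CAST(vss.ep_asle_seat AS STRING) AS ep_asle_seat
FROM rvsed11 zz
LEFT OUTER JOIN rvsed22 vss
  ON  zz.company_id     = vss.company_id
  AND zz.shares_ship_id = vss.shares_ship_id;
```

If the casts make the query pass, the next step would be to check whether the int columns of rvsed22 were changed with ALTER TABLE after the data was loaded.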

The complete log:

Query ID = root_20160111201839_bd6220b3-cca8-48ee-8d32-4e26fdaab8bf
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1450202704586_0448, Tracking URL = http://server-D912:8088/proxy/application_1450202704586_0448/
Kill Command = /usr/share/hadoop_ecosystem/hadoop-2.6.0/bin/hadoop job  -kill job_1450202704586_0448
Hadoop job information for Stage-3: number of mappers: 1; number of    reducers: 0
2016-01-11 20:18:43,960 Stage-3 map = 0%,  reduce = 0%
2016-01-11 20:19:02,202 Stage-3 map = 100%,  reduce = 0%
Ended Job = job_1450202704586_0448 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1450202704586_0448_m_000000 (and more) from job job_1450202704586_0448

Task with the most failures(4): 
-----
Task ID:
  task_1450202704586_0448_m_000000

URL:
  http://server-D912:8088/taskdetails.jsp?jobid=job_1450202704586_0448&tipid=task_1450202704586_0448_m_000000
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"company_id":"3M","oper_carrier":"3M","flt_num":"4062","equip_id":"0331","equip_type":"SF3","act_fleet_type_cd":"U","shares_ship_id":"0331","report_dt":"2013-12-12","origin":"CKB","destination":"IAD"}

at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:172)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"company_id":"3M","oper_carrier":"3M","flt_num":"4062","equip_id":"0331","equip_type":"SF3","act_fleet_type_cd":"U","shares_ship_id":"0331","report_dt":"2013-12-12","origin":"CKB","destination":"IAD"}
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:518)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
... 8 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unexpected exception: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable
at org.apache.hadoop.hive.ql.exec.MapJoinOperator.process(MapJoinOperator.java:426)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:97)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:162)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:508)
... 9 more
Caused by: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable
at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableIntObjectInspector.get(WritableIntObjectInspector.java:36)
at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitiveUTF8(LazyUtils.java:202)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serialize(LazySimpleSerDe.java:307)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serializeField(LazySimpleSerDe.java:262)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.doSerialize(LazySimpleSerDe.java:246)
at org.apache.hadoop.hive.serde2.AbstractEncodingAwareSerDe.serialize(AbstractEncodingAwareSerDe.java:50)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:720)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.internalForward(CommonJoinOperator.java:644)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:657)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:660)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:756)
at org.apache.hadoop.hive.ql.exec.MapJoinOperator.process(MapJoinOperator.java:414)
... 13 more

hive> DESCRIBE EXTENDED rvsed11;

OK
company_id              string                                      
oper_carrier            string                                      
flt_num                 string                                      
equip_id                string                                      
equip_type              string                                      
act_fleet_type_cd       string                                      
shares_ship_id          string                                      
report_dt               date                                        
origin                  string                                      
destination             string                                      

Detailed Table Information  Table(tableName:rvsed11, dbName:DB_123, owner:root, createTime:1452081461, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:company_id, type:string, comment:null), FieldSchema(name:oper_carrier, type:string, comment:null), FieldSchema(name:flt_num, type:string, comment:null), FieldSchema(name:equip_id, type:string, comment:null), FieldSchema(name:equip_type, type:string, comment:null), FieldSchema(name:act_fleet_type_cd, type:string, comment:null), FieldSchema(name:shares_ship_id, type:string, comment:null), FieldSchema(name:report_dt, type:date, comment:null), FieldSchema(name:origin, type:string, comment:null), FieldSchema(name:destination, type:string, comment:null)], location:hdfs://server-D912:9090/user/hive/warehouse/DB_123.db/rvsed11, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=,, field.delim=,}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), storedAsSubDirectories:false), partitionKeys:[], parameters:{numFiles=1, transient_lastDdlTime=1452081474, COLUMN_STATS_ACCURATE=true, totalSize=1883, numRows=0, rawDataSize=0}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE)

hive> DESCRIBE EXTENDED rvsed22;

OK
company_id              string                                      
equip_id                string                                      
shares_ship_id          string                                      
seatmap_cd              string                                      
cabin                   string                                      
seat                    string                                      
seat_loc_dscr           string                                      
ep_win_seat             int                                         
ep_mid_seat             int                                         
ep_asle_seat            int                                         
em_win_seat             int                                         
em_mid_seat             int                                         
em_asle_seat            int                                         
ep_seat                 int                                         
y_win_seat              int                                         
y_mid_seat              int                                         
y_asle_seat             int                                         
fj_win_seat             int                                         
fj_mid_seat             int                                         
fj_asle_seat            int                                         
exit_row                int                                         
bulkhead_row            int                                         
eff_dt                  date                                        
disc_dt                 date                                        

Detailed Table Information  Table(tableName:rvsed22, dbName:DB_123, owner:root, createTime:1452081506, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:company_id, type:string, comment:null), FieldSchema(name:equip_id, type:string, comment:null), FieldSchema(name:shares_ship_id, type:string, comment:null), FieldSchema(name:seatmap_cd, type:string, comment:null), FieldSchema(name:cabin, type:string, comment:null), FieldSchema(name:seat, type:string, comment:null), FieldSchema(name:seat_loc_dscr, type:string, comment:null), FieldSchema(name:ep_win_seat, type:int, comment:null), FieldSchema(name:ep_mid_seat, type:int, comment:null), FieldSchema(name:ep_asle_seat, type:int, comment:null), FieldSchema(name:em_win_seat, type:int, comment:null), FieldSchema(name:em_mid_seat, type:int, comment:null), FieldSchema(name:em_asle_seat, type:int, comment:null), FieldSchema(name:ep_seat, type:int, comment:null), FieldSchema(name:y_win_seat, type:int, comment:null), FieldSchema(name:y_mid_seat, type:int, comment:null), FieldSchema(name:y_asle_seat, type:int, comment:null), FieldSchema(name:fj_win_seat, type:int, comment:null), FieldSchema(name:fj_mid_seat, type:int, comment:null), FieldSchema(name:fj_asle_seat, type:int, comment:null), FieldSchema(name:exit_row, type:int, comment:null), FieldSchema(name:bulkhead_row, type:int, comment:null), FieldSchema(name:eff_dt, type:date, comment:null), FieldSchema(name:disc_dt, type:date, comment:null)], location:hdfs://server-D912:9090/user/hive/warehouse/DB_123.db/rvsed22, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=,, field.delim=,}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{}), 
storedAsSubDirectories:false), partitionKeys:[], parameters:{numFiles=1, transient_lastDdlTime=1452081534, COLUMN_STATS_ACCURATE=true, totalSize=14506, numRows=0, rawDataSize=0}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE)
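
Given the two schemas above, a quick way to localize the mismatch is to serialize each side on its own. If both single-table queries below succeed while the join fails, the stored data matches the declared types and the join machinery (the map-join small-table serialization) becomes the main suspect; if the rvsed22 query itself fails, the text files behind it contain values that disagree with the int columns. A sketch, using only column names from the DESCRIBE output:

```sql
-- Each table alone, forcing the same int columns through the serializer.
SELECT company_id, shares_ship_id, report_dt
FROM rvsed11 LIMIT 10;

SELECT company_id, shares_ship_id, ep_seat, ep_win_seat, ep_asle_seat
FROM rvsed22 LIMIT 10;
```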
