Sqoop is trying to import a string data type as BIGINT - Error converting nvarchar value to JDBC data type BIGINT

Posted by q3aa0525 on 2021-06-03 in Sqoop

I am trying to import my SQL table using Sqoop. While importing the data, Sqoop tries to convert the nvarchar data type to the JDBC data type BIGINT, which it should not do; the column should be read as a string. I tried using --map-column-java col1=String, but that did not help and Sqoop still tries to convert the data type.
The Sqoop command used to import the data:

sqoop import  --connection-manager org.apache.sqoop.manager.SQLServerManager --driver com.microsoft.sqlserver.jdbc.SQLServerDriver --connect 'jdbc:sqlserver://fodb-ew1a-05-poc.ceqaqbxumwfk.eu-west-1.rds.amazonaws.com:1433;DatabaseName=FOTDUB-ECM16' --username 'trendappadmins' --password 'Trend!@App1' --map-column-java "pset_name=String,pset_id=Integer,pset_descrip=String,pset_color=String" --boundary-query "Select CAST(pset_id as int) as psetid, CAST(pset_descrip as nvarchar) as pset_descrip  ,[pset_color] from propset" --table 'propset' --target-dir $S3_output_dir

Error:
2020-12-21T09:37:35,289 INFO [main] org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM [propset] AS t WHERE 1=0
2020-12-21T09:37:35,408 INFO [main] org.apache.sqoop.orm.CompilationManager - HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
2020-12-21T09:37:39,266 INFO [main] org.apache.sqoop.orm.CompilationManager - Writing jar file: /tmp/sqoop-hadoop/compile/fae3dc20752e2c9bfc8125c9cf782315/propset.jar
2020-12-21T09:37:39,308 INFO [main] org.apache.sqoop.mapreduce.ImportJobBase - Beginning import of propset
2020-12-21T09:37:39,544 INFO [main] org.apache.hadoop.conf.Configuration.deprecation - mapred.jar is deprecated. Instead, use mapreduce.job.jar
2020-12-21T09:37:39,570 WARN [main] org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat - Could not find $CONDITIONS token in query: Select CAST(pset_id as int) as psetid, CAST(pset_descrip as nvarchar) as pset_descrip, [pset_color] from propset; splits may not partition data.
2020-12-21T09:37:42,229 INFO [main] org.apache.hadoop.conf.Configuration.deprecation - fs.s3a.server-side-encryption-key is deprecated. Instead, use fs.s3a.server-side-encryption.key
2020-12-21T09:37:42,290 INFO [main] org.apache.hadoop.conf.Configuration.deprecation - mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2020-12-21T09:37:42,462 INFO [main] org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at ip-172-31-6-251.eu-west-1.compute.internal/172.31.6.251:8032
2020-12-21T09:37:42,660 INFO [main] org.apache.hadoop.yarn.client.AHSProxy - Connecting to Application History server at ip-172-31-6-251.eu-west-1.compute.internal/172.31.6.251:10200
2020-12-21T09:37:49,168 INFO [main] org.apache.sqoop.mapreduce.db.DBInputFormat - Using read committed transaction isolation
2020-12-21T09:37:49,169 INFO [main] org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat - BoundingValsQuery: Select CAST(pset_id as int) as psetid, CAST(pset_descrip as nvarchar) as pset_descrip, [pset_color] from propset
2020-12-21T09:37:49,177 INFO [main] org.apache.hadoop.mapreduce.JobSubmitter - Cleaning up the staging area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1608543028860_0002
2020-12-21T09:37:49,185 ERROR [main] org.apache.sqoop.tool.ImportTool - Import failed: java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Error converting nvarchar value to JDBC data type BIGINT.
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Error converting varchar value to JDBC data type BIGINT.
    at com.microsoft.sqlserver.jdbc.DDC.convertStreamToObject(DDC.java:689)
    at com.microsoft.sqlserver.jdbc.ServerDTVImpl.getValue(dtv.java:3849)
    at com.microsoft.sqlserver.jdbc.DTV.getValue(dtv.java:286)
    .....
    at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:201)
    ... 23 more
Caused by: java.lang.NumberFormatException: For input string: "inventory"
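
For context on the trace above: the Sqoop user guide describes --boundary-query as an arbitrary query returning two numeric columns, the lower and upper boundary of the split column, which Sqoop then uses in DataDrivenDBInputFormat.getSplits to partition the import. The query passed here returns full rows, so Sqoop ends up parsing nvarchar values (such as "inventory") where it expects numeric boundaries. Below is a minimal sketch of the same import with that contract applied; the connection details are reduced to placeholders, and treating pset_id as the split column is an assumption about the table.

# Sketch only: <host> and <user> are placeholders; -P prompts for the password interactively.
sqoop import \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --connect 'jdbc:sqlserver://<host>:1433;DatabaseName=FOTDUB-ECM16' \
  --username '<user>' -P \
  --table 'propset' \
  --split-by pset_id \
  --boundary-query 'SELECT MIN(pset_id), MAX(pset_id) FROM propset' \
  --map-column-java "pset_name=String,pset_id=Integer,pset_descrip=String,pset_color=String" \
  --target-dir $S3_output_dir
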
Sample table:

id  |name   |  Project  |Address
2   |Vicky  |           |NULL
3   |Vikram |           |NULL
4   |Preeti |  Testing  |NULL
5   |Harry  |           |NULL
6   |Gaurav |           |NULL
7   |Mani   |           |NULL
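
Since the NumberFormatException is raised for the string value "inventory", it can also help to confirm how the propset columns are actually declared on the SQL Server side before choosing a split column. One way to do that without leaving Sqoop is sqoop eval; the sketch below reuses placeholder connection details and reads SQL Server's sys.columns and sys.types catalog views.

# Sketch only: <host> and <user> are placeholders.
sqoop eval \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --connect 'jdbc:sqlserver://<host>:1433;DatabaseName=FOTDUB-ECM16' \
  --username '<user>' -P \
  --query "SELECT c.name, t.name AS type_name FROM sys.columns c JOIN sys.types t ON c.user_type_id = t.user_type_id WHERE c.object_id = OBJECT_ID('propset')"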

Is there any way to force Sqoop to treat this column as a string instead of an integer?
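
One note on that: --map-column-java only overrides the Java type used in the record class that Sqoop generates; it does not change the split/boundary computation where this import is failing. A sketch that sidesteps split generation altogether (untested here, same placeholder assumptions as above) is to run the import with a single mapper, which removes the need for a split column and a boundary query:

# Sketch only: <host> and <user> are placeholders; a single mapper means no parallel splits.
sqoop import \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --connect 'jdbc:sqlserver://<host>:1433;DatabaseName=FOTDUB-ECM16' \
  --username '<user>' -P \
  --table 'propset' \
  --num-mappers 1 \
  --map-column-java "pset_name=String,pset_id=Integer,pset_descrip=String,pset_color=String" \
  --target-dir $S3_output_dir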
