Hive failure: ParseException line 2:0 cannot recognize input near ''macaddress'' 'CHAR' '(' in column specification

vtwuwzda asked on 2021-06-03 in Hadoop

I tried running `hive -v -f sqlfile.sql`. Here is the content of the file:

CREATE TABLE UpStreamParam (
'macaddress' CHAR(50),
'datats' BIGINT,
'cmtstimestamp' BIGINT,
'modulation' INT,
'chnlidx'   INT,
'severity' BIGINT,
'rxpower'  FLOAT,
'sigqnoise' FLOAT,
'noisedeviation'  FLOAT,
'prefecber'  FLOAT,
'postfecber'  FLOAT,
'txpower'  FLOAT,
'txpowerdrop' FLOAT,
'nmter'  FLOAT,
'premtter'  FLOAT,
'postmtter'  FLOAT,
'unerroreds'  BIGINT,
'corrected'  BIGINT,
'uncorrectables'  BIGINT)
STORED AS ORC TBLPROPERTIES ("orc.compress"="SNAPPY","orc.bloom.filters.columns"="macaddress")
PARTITIONED BY ('cmtsid' CHAR(50),' date' INT)
LOCATION '/usr/hive/warehouse/UpStreamParam' ;

I get the following error:
FAILED: ParseException line 2:0 cannot recognize input near ''macaddress'' 'CHAR' '(' in column specification

Answer 1 (sd2nnvve):

First, column names must be quoted with ` (backticks), not ' (single quotes).
So you must replace 'macaddress' with `macaddress`, and do the same for all the other column names.
Second, the order of STORED AS, TBLPROPERTIES, PARTITIONED BY, and LOCATION is wrong. The correct order is PARTITIONED BY, STORED AS, LOCATION, TBLPROPERTIES.
For details, see the Hive Language Manual: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-CreateTable
So the correct code is:

CREATE TABLE UpStreamParam (
`macaddress` CHAR(50),
`datats` BIGINT,
`cmtstimestamp` BIGINT,
`modulation` INT,
`chnlidx`   INT,
`severity` BIGINT,
`rxpower`  FLOAT,
`sigqnoise` FLOAT,
`noisedeviation`  FLOAT,
`prefecber`  FLOAT,
`postfecber`  FLOAT,
`txpower`  FLOAT,
`txpowerdrop` FLOAT,
`nmter`  FLOAT,
`premtter`  FLOAT,
`postmtter`  FLOAT,
`unerroreds`  BIGINT,
`corrected`  BIGINT,
`uncorrectables`  BIGINT)
PARTITIONED BY (`cmtsid` CHAR(50), `date` INT)
STORED AS ORC
LOCATION '/usr/hive/warehouse/UpStreamParam'
TBLPROPERTIES ("orc.compress"="SNAPPY","orc.bloom.filters.columns"="macaddress");
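As a sanity check (a sketch using the table name from the question), you can confirm the DDL parsed as intended after running the corrected file:

```sql
-- Show the effective table definition, including the partition
-- columns and the TBLPROPERTIES that were set.
SHOW CREATE TABLE UpStreamParam;

-- DESCRIBE FORMATTED also lists the storage format (ORC), the
-- table location, and the partition columns cmtsid and `date`.
DESCRIBE FORMATTED UpStreamParam;
```

If the CREATE TABLE still fails, the -v flag already passed to hive echoes each statement, which makes it easier to match the ParseException line number against the file.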
