Hive update: NullPointerException when using an IF statement

xoefb8l8 · published 2021-05-29 in Hadoop

I have a table in Hive in which I need to update certain records. I am using Hive version 0.13. After some searching I found that this can be done with an IF/ELSE expression inside an INSERT OVERWRITE statement, but when I run such a query it throws a NullPointerException.
Here is my employee table:

1       emp1
2       emp2
3       emp3
4       emp4
5       emp5

I created another table, employee_incr, with the same schema as employee, and ran this query to produce the updated records:

insert overwrite table employee_incr select employee.ename,if(employee.id="1",12,employee.id ) as employee.id from employee;
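For reference, the general pattern from the examples I found looks roughly like the sketch below (hypothetical query, not mine verbatim). In those examples the alias after AS is always a bare identifier rather than a qualified name like employee.id, and the integer column is compared against a number rather than a quoted string:

```sql
-- Sketch of the insert-overwrite-with-IF pattern (illustrative only).
-- IF(condition, value_if_true, value_if_false) rewrites a column value in flight.
INSERT OVERWRITE TABLE employee_incr
SELECT
  IF(e.id = 1, 12, e.id) AS id,   -- replace id 1 with 12, keep all other ids
  e.ename AS ename
FROM employee e;
```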

Here is the Hive stack trace:

Concurrency mode is disabled, not creating a lock manager
2015-07-27 16:26:42,086 INFO  [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(108)) - <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
2015-07-27 16:26:42,111 INFO  [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(108)) - <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
2015-07-27 16:26:42,115 INFO  [main]: parse.ParseDriver (ParseDriver.java:parse(185)) - Parsing command: insert overwrite table employee_incr select employee.ename,if(employee.id="1",12,employee.id ) as employee.id from employee
2015-07-27 16:26:42,332 ERROR [main]: ql.Driver (SessionState.java:printError(547)) - FAILED: NullPointerException null
java.lang.NullPointerException
        at org.apache.hadoop.hive.ql.parse.HiveParser.regularBody(HiveParser.java:37646)
        at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpressionBody(HiveParser.java:36884)
        at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpression(HiveParser.java:36760)
        at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1338)
        at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1036)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:199)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:409)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

2015-07-27 16:26:42,333 INFO  [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=compile start=1437994602086 end=1437994602333 duration=247 from=org.apache.hadoop.hive.ql.Driver>
2015-07-27 16:26:42,333 INFO  [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(108)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2015-07-27 16:26:42,333 INFO  [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=releaseLocks start=1437994602333 end=1437994602333 duration=0 from=org.apache.hadoop.hive.ql.Driver>
2015-07-27 16:26:42,337 INFO  [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(108)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2015-07-27 16:26:42,337 INFO  [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=releaseLocks start=1437994602337 end=1437994602337 duration=0 from=org.apache.hadoop.hive.ql.Driver>

Any help would be appreciated.

No answers yet.

