How do I connect to a Kerberized Hive using a custom (overridden) Spark JDBC source?

bkkx9g8r  posted 8 months ago in Hive
Follow(0)|Answers(1)|Views(66)

Environment: vanilla Hadoop; Kerberized Hive; deploy-mode: yarn-client; the Kerberos credentials (keytab) are placed on every Hadoop node. Process: I overrode the Spark JDBC source, and Spark uses this source to connect to Hive. Authentication is performed before connecting and succeeds, but when the executors open the connection the following exception is thrown:

SQLException: could not open client for any of server uri is zookeeper: null

Question: how can I fix this error? I have already set up UserGroupInformation (the login succeeds) and set the Spark extraJavaOptions configuration, but it has no effect.
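The `zookeeper: null` in the message suggests the JDBC URL may be using HiveServer2's ZooKeeper service discovery without all the parameters it needs. As a point of comparison, here is a minimal sketch of how such a URL is typically assembled; the quorum hosts, the `hiveserver2` namespace, and the principal are placeholder assumptions, not values taken from this question:

```java
// Hypothetical helper showing the pieces a Kerberized, ZooKeeper-discovered
// HiveServer2 JDBC URL is normally built from. All concrete values below
// (hosts, namespace, principal) are illustrative placeholders.
public class HiveJdbcUrl {
    static String build(String zkQuorum, String namespace, String principal) {
        return "jdbc:hive2://" + zkQuorum
            + "/;serviceDiscoveryMode=zooKeeper"   // discover HS2 via ZooKeeper
            + ";zooKeeperNamespace=" + namespace   // znode the servers register under
            + ";principal=" + principal;           // HiveServer2 Kerberos principal
    }

    public static void main(String[] args) {
        String url = build("zk1:2181,zk2:2181,zk3:2181",
                           "hiveserver2",
                           "hive/_HOST@EXAMPLE.COM");
        System.out.println(url);
    }
}
```

If either `zooKeeperNamespace` or `principal` is missing or wrong, the client cannot resolve a server from ZooKeeper, which matches the "could not open client for any of server uri" symptom.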


56lgkhnf1#

Here is the authentication code:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    // Kerberos login helper. krbConfig, krbKeytab, krbUser and logger
    // are class fields defined elsewhere.
    public static void initkerberos() {
        try {
            // Paths to the krb5.conf and keytab placed on this node
            String configPath = "/opt/hbaseConfig/tx/" + krbConfig;
            String keytabPath = "/opt/hbaseConfig/tx/" + krbKeytab;

            // Point the JVM at the Kerberos configuration file
            System.setProperty("java.security.krb5.conf", configPath);

            // Switch Hadoop security to Kerberos
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Log in from the keytab; this authenticates the current JVM only
            UserGroupInformation.loginUserFromKeytab(krbUser, keytabPath);
        } catch (Exception e) {
            logger.error("Kerberos authentication failed", e);
        }
    }
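Note that `loginUserFromKeytab` only authenticates the JVM it runs in. In yarn-client mode that is the driver, which is why authentication "succeeds" yet the executors still fail to open the connection. One common approach is to let Spark distribute and renew the Kerberos credentials itself via `--principal`/`--keytab`, and ship `krb5.conf` to the executors explicitly. A sketch of the submit command, assuming placeholder paths and principal (the real values come from your environment):

```shell
# Sketch only: the principal, keytab, krb5.conf paths and jar name
# are placeholder assumptions, not values from the original question.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --principal krbUser@EXAMPLE.COM \
  --keytab /opt/hbaseConfig/tx/user.keytab \
  --files /opt/hbaseConfig/tx/krb5.conf \
  --conf "spark.driver.extraJavaOptions=-Djava.security.krb5.conf=/opt/hbaseConfig/tx/krb5.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.krb5.conf=krb5.conf" \
  your-app.jar
```

Files passed with `--files` land in each executor's working directory, so the executor-side `extraJavaOptions` can reference `krb5.conf` by its bare name.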
