hbase.mapreduce.TableOutputFormat not found

nsc4cvqm · asked 2021-05-31 in Hadoop
Follow (0) | Answers (1) | Views (700)

I am using HDP 2.6 and HBase 1.1.2.
When I submit a MapReduce job from a server that is not part of the HDP cluster,
I get the following exception:

Class org.apache.hadoop.hbase.mapreduce.TableOutputFormat not found.

My mapper finishes correctly in local job mode, I am sure the class is on my classpath, and the lib directory on the cluster contains this jar as well.
I googled and found several steps other people have tried:
1. Use -libjars (a sketch of a correctly ordered submit command follows this list)
2. Add HBASE_CLASSPATH to hadoop-env.sh and restart the cluster
3. Add hbase-master/lib to yarn-site.xml and restart the cluster
But none of these worked. Please help me, because I can only do this task with MR.
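To illustrate step 1, this is roughly the shape of the submit command I tried (a sketch with a made-up jar name, main class, table name, and input path; hbase mapredcp prints the jars HBase needs for MapReduce, and the generic options have to come before the positional arguments):

HBASE_JARS=$(hbase mapredcp | tr ':' ',')
hadoop jar my-importer.jar com.test.Main \
    -libjars "$HBASE_JARS" \
    -Dimporttsv.mapper.class=com.test.web.GRZHXXTsvImporterMapper \
    MY_TABLE /data/input.csv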
Here is my code:

package com.test.service;

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.ImportTsv;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.util.ToolRunner;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

@Service
public class Tsv2HBaseService {
    private org.apache.hadoop.conf.Configuration config;
    private String principal;
    private String keytab;

    Tsv2HBaseService(@Value("${hbase.kerberos.REALM}") String principal,
                     @Value("${hbase.kerberos.keytab}") String keytab) {
        this.config = HBaseConfiguration.create();
        this.principal = principal;
        this.keytab = keytab;
    }

    public int importMultiKeyWithTableTSV(String table, String tsvfile, String className) throws Exception {
        // Generic options (-libjars, -D...) must precede the positional
        // arguments, and -libjars takes its jar list as a separate argument.
        String[] args = {
                "-libjars",
                "./hbase-client-1.1.2.jar,./hbase-server-1.1.2.jar,./hbase-protocol-1.1.2.jar,./hbase-common-1.1.2.jar",
                "-Dimporttsv.mapper.class=" + className,
                "-DtableStructure.file=" + tsvfile.replace(".csv", ".xml"),
                table,
                tsvfile
        };

        config.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(this.config);
        UserGroupInformation.loginUserFromKeytab(this.principal, this.keytab);
        return ToolRunner.run(config, new ImportTsv(), args);
    }

}
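I also know that HBase can ship its own jars with a job via TableMapReduceUtil. A minimal sketch of that mechanism (not my actual code, since ImportTsv builds its own job internally; the class name here is made up):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class AddHBaseJarsSketch {
    public static Job newJob() throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "tsv-import");
        // Puts the HBase jars (hbase-client, hbase-server, hbase-protocol,
        // hbase-common and their dependencies) on the job's distributed
        // cache, so classes like TableOutputFormat resolve on the cluster.
        TableMapReduceUtil.addDependencyJars(job);
        return job;
    }
}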

package com.test.web;

import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.IOException;

public class GRZHXXTsvImporterMapper extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {

    private static final Logger logger = LoggerFactory.getLogger(GRZHXXTsvImporterMapper.class);

    @Override
    public void map(LongWritable offset, Text value, Context context)
            throws IOException {
        try {
            String[] row = value.toString().split(",");
            // Row key is the concatenation of the first three columns.
            ImmutableBytesWritable rowKey = new ImmutableBytesWritable((row[0] + row[1] + row[2]).getBytes());
            Put put = new Put(rowKey.copyBytes());
            KeyValue kv3 = new KeyValue(rowKey.copyBytes(), "PUBLIC".getBytes(), "GRJCJS".getBytes(), row[3].getBytes());
            put.add(kv3);
            // Write the Put for the user table.
            context.write(rowKey, put);
        } catch (Exception e) {
            logger.error("Failed to map line: " + value, e);
        }
    }
}

hadoop-env.sh:

...lots of settings...
HADOOP_CLASSPATH=${HADOOP_CLASSPATH}${JAVA_JDBC_LIBS}${HBASE_CLASSPATH}
...lots of settings...
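For completeness, this only helps if HBASE_CLASSPATH actually expands to something. A hedged example of setting it in hadoop-env.sh (the HDP client path is an assumption for my install; since the line above concatenates the variables directly, the value carries its own leading separator):

export HBASE_CLASSPATH=:/usr/hdp/current/hbase-client/lib/*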

yarn-site.xml:

...lots of settings...
yarn.application.classpath=.......,/usr/hdp/current/hbase-master/lib/*
...lots of settings...
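In property form, the same yarn-site.xml entry looks roughly like this (a sketch; the elided entries are whatever the cluster already had):

<property>
  <name>yarn.application.classpath</name>
  <value>.......,/usr/hdp/current/hbase-master/lib/*</value>
</property>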
Answer from fbcarpbf (#1):

Finally I found the solution.
Add the HBase master lib directory to the classpath in both yarn-site.xml and mapred-site.xml.
Both the xml files and your own jar resources have to be modified on the cluster itself. A sketch of the mapred-site.xml change follows.
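A hedged sketch of the mapred-site.xml side of the change (keep your cluster's existing entries and separator; the elided part stands for them):

<property>
  <name>mapreduce.application.classpath</name>
  <value>.......:/usr/hdp/current/hbase-master/lib/*</value>
</property>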
