User-defined HBase coprocessor implementation for a secondary server

1tu0hz3e · posted 2021-05-27 in Hadoop

I am trying to add a user-defined coprocessor for secondary storage locally on my system, following https://www.3pillarglobal.com/insights/hbase-coprocessors. When I attempt static coprocessor loading, I get RegionCoprocessor-related errors saying that the class in question is not of type RegionCoprocessor. Please help me get this working.
DatabaseCrudCoprocessor.java implementation >>>

@InterfaceAudience.LimitedPrivate(HBaseInterfaceAudience.COPROC)
@InterfaceStability.Evolving
public class DatabaseCrudCoprocessor  implements RegionObserver {
    private static final String className = "DatabaseCrudCoprocessor";
    private static JsonObject object = new JsonObject();
    static final CloudLogger logger = CloudLogger.getLogger("DatabaseCrudCoprocessor");

    public void postPut(ObserverContext<RegionCoprocessorEnvironment> c, Put put, WALEdit edit, Durability durability)
            throws IOException {
        try {
            Connection con = c.getEnvironment().getConnection();
            logger.info("---------------------This Code Is Excecute---------------------------");
        }catch(Exception e) {
            logger.error("In "+className+" postPut : Exception : "+e);
        }
    }
}
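
For context, the static-loading registration presumably looks roughly like the following entry in hbase-site.xml on the region servers (a sketch; the fully qualified class name demo.DatabaseCrudCoprocessor is taken from the RegionServer log below, and the jar already sits in HBase's lib directory according to the SLF4J output). HBase has to be restarted for a change to this property to take effect.

<property>
    <name>hbase.coprocessor.region.classes</name>
    <value>demo.DatabaseCrudCoprocessor</value>
</property>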

Errors in the HBase logs >>>
On the Hadoop master >>>>>>>

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/home/hadoop/hbase/hbase/lib/demo-0.0.2-SNAPSHOT-jar-with-dependencies.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hbase/hbase/lib/phoenix-5.0.0-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hbase/hbase/lib/phoenix-5.0.0-HBase-2.0-hive.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hbase/hbase/lib/phoenix-5.0.0-HBase-2.0-pig.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hbase/hbase/lib/phoenix-5.0.0-HBase-2.0-thin-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hbase/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
0    [RS-EventLoopGroup-1-2] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1:45903, version=2.2.2, sasl=false, ugi=hadoop (auth:SIMPLE), service=RegionServerStatusService
1266 [master/localhost:60000:becomeActiveMaster] ERROR org.apache.hadoop.hbase.master.RegionServerTracker  - localhost,60020,1576052978374 has no matching ServerCrashProcedure
1266 [master/localhost:60000:becomeActiveMaster] ERROR org.apache.hadoop.hbase.master.RegionServerTracker  - localhost,60020,1539250172019 has no matching ServerCrashProcedure
1266 [master/localhost:60000:becomeActiveMaster] ERROR org.apache.hadoop.hbase.master.RegionServerTracker  - localhost,60020,1576071410923 has no matching ServerCrashProcedure
1266 [master/localhost:60000:becomeActiveMaster] ERROR org.apache.hadoop.hbase.master.RegionServerTracker  - localhost,60020,157612772238 has no matching ServerCrashProcedure
On the RegionServer >>>>>>>>>

0    [RS-EventLoopGroup-1-3] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1:33828, version=2.2.2, sasl=false, ugi=hadoop (auth:SIMPLE), service=AdminService

167  [RS_CLOSE_META-regionserver/localhost:60020-0] ERROR org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost  - demo.DatabaseCrudCoprocessor is not of type RegionCoprocessor. Check the configuration of hbase.coprocessor.region.classes
167  [RS_CLOSE_META-regionserver/localhost:60020-0] ERROR org.apache.hadoop.hbase.coprocessor.CoprocessorHost  - Cannot load coprocessor DatabaseCrudCoprocessor
1302 [RS-EventLoopGroup-1-4] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1:33830, version=2.2.2, sasl=false, ugi=hadoop (auth:SIMPLE), service=ClientService


1cosmwyk · #1

If you are on HBase 2.0 or later, implement both RegionCoprocessor and RegionObserver and override the getRegionObserver() method:

public class RegionObserverExample implements RegionCoprocessor, RegionObserver {
    @Override
    public Optional<RegionObserver> getRegionObserver() {
        return Optional.of(this);
    }
    ...
}
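
Applied to the class from the question, a minimal end-to-end sketch could look like the following. It assumes HBase 2.x (the logs show 2.2.2); the package name demo is taken from the error log, and the custom CloudLogger is swapped for plain SLF4J so the example compiles on its own.

package demo;

import java.io.IOException;
import java.util.Optional;

import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Durability;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.coprocessor.ObserverContext;
import org.apache.hadoop.hbase.coprocessor.RegionCoprocessor;
import org.apache.hadoop.hbase.coprocessor.RegionCoprocessorEnvironment;
import org.apache.hadoop.hbase.coprocessor.RegionObserver;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.hbase.wal.WALEdit;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class DatabaseCrudCoprocessor implements RegionCoprocessor, RegionObserver {

    private static final Logger LOG = LoggerFactory.getLogger(DatabaseCrudCoprocessor.class);

    // RegionCoprocessorHost obtains the observer through this method; the default
    // implementation returns Optional.empty(), so without this override no hooks fire.
    @Override
    public Optional<RegionObserver> getRegionObserver() {
        return Optional.of(this);
    }

    // Runs after each successful Put on the regions this coprocessor is attached to.
    @Override
    public void postPut(ObserverContext<RegionCoprocessorEnvironment> c, Put put,
                        WALEdit edit, Durability durability) throws IOException {
        try {
            // The environment exposes a cluster connection that secondary-storage
            // logic could use to write to another table.
            Connection connection = c.getEnvironment().getConnection();
            LOG.info("postPut fired for row {}", Bytes.toStringBinary(put.getRow()));
        } catch (Exception e) {
            LOG.error("DatabaseCrudCoprocessor postPut failed", e);
        }
    }
}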

egdjgwm8 · #2

To be a RegionCoprocessor, your class should also implement org.apache.hadoop.hbase.coprocessor.RegionCoprocessor.
