Running a custom Kafka Streams DSL application returns java.lang.ClassNotFoundException

sirbozc5 · posted 2021-06-07 in Kafka

I am trying to read from a Kafka topic containing JSON data and to write each record to a new topic based on the value of its "entity" field. I am using the code below to read from and write to Kafka.

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import java.util.Properties;

public class entityDataLoader {
    public static void main(final String[] args) throws Exception {
        final Properties streamsConfiguration = new Properties();
        streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "map-function-lambda-example");
        streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.ByteArray().getClass().getName());
        streamsConfiguration.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

        // Set up serializers and deserializers, which we will use for overriding the default serdes
        // specified above.
        final Serde<String> stringSerde = Serdes.String();
        final Serde<byte[]> byteArraySerde = Serdes.ByteArray();

        // In the subsequent lines we define the processing topology of the Streams application.
        final KStreamBuilder builder = new KStreamBuilder();

        // Read the input Kafka topic into a KStream instance.
        final KStream<byte[], String> textLines = builder.stream(byteArraySerde, stringSerde, "postilion-events");

        String content = textLines.toString();
        String entity = JSONExtractor.returnJSONValue(content, "entity");
        System.out.println(entity);

        textLines.to(entity);

        final KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
        streams.cleanUp();
        streams.start();

        // Add shutdown hook to respond to SIGTERM and gracefully close Kafka Streams
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

Any idea how I can get this application to run successfully?
Using NetBeans, I built the jar file with its dependencies and placed it at /home/kafka, then tried to run it by putting the jar on the classpath and specifying the class I created (using the command java -cp mavenproject.jar postilionkafka.entityDataLoader). I get the following error:

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/kafka/streams/processor/TopologyBuilder
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.streams.processor.TopologyBuilder
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more

Thanks to @James, I have resolved that problem. However, I am unable to extract the entity data from the records in the topic. The records in the topic are JSON; a sample record is

{"date":{"string":"2017-03-20"},"time":{"string":"20:04:13:563"},"event_nr":1572470,"interface":"Transaction Manager","event_id":5001,"date_time":1490040253563,"entity":"Transaction Manager","state":0,"msg_param_1":{"string":"ISWSnk"},"msg_param_2":{"string":"Application startup"},"msg_param_3":null,"msg_param_4":null,"msg_param_5":null,"msg_param_6":null,"msg_param_7":null,"msg_param_8":null,"msg_param_9":null,"long_msg_param_1":null,"long_msg_param_2":null,"long_msg_param_3":null,"long_msg_param_4":null,"long_msg_param_5":null,"long_msg_param_6":null,"long_msg_param_7":null,"long_msg_param_8":null,"long_msg_param_9":null,"last_sent":{"long":1490040253563},"transmit_count":{"int":1},"team_id":null,"app_id":{"int":4},"logged_by_app_id":{"int":4},"entity_type":{"int":3},"binary_data":null}

I want to write each record to a topic based on the value of its entity field (for the JSON sample above, the record should be written to the topic Transaction Manager). If I run my current code, I get the error below:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
org.apache.kafka.streams.kstream.internals.KStreamImpl@568db2f2
No Object found
Unexpected character (o) at position 0.
null
Exception in thread "main" java.lang.NullPointerException: topic can't be null
    at java.util.Objects.requireNonNull(Objects.java:228)
    at org.apache.kafka.streams.kstream.internals.KStreamImpl.to(KStreamImpl.java:353)
    at org.apache.kafka.streams.kstream.internals.KStreamImpl.to(KStreamImpl.java:337)
    at postilionkafka.DataLoad.main(DataLoad.java:35)
The JSONExtractor class is defined as

import org.json.simple.JSONObject;
import org.json.simple.parser.ParseException;
import org.json.simple.parser.JSONParser;
class JSONExtractor {

    public static String returnJSONValue(String args, String value) {
        JSONParser parser = new JSONParser();
        String app = null;
        System.out.println(args);
        try {
            Object obj = parser.parse(args);
            JSONObject JObj = (JSONObject) obj;
            app = (String) JObj.get(value);
            return app;
        }
        catch (ParseException pe) {
            System.out.println("No Object found");
            System.out.println(pe);
        }
        return app;
    }
}
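For what it's worth, the "topic can't be null" error above originates in the topology code rather than in JSONExtractor: textLines.toString() returns the KStream object's identity (the KStreamImpl@568db2f2 line in the log), not a record payload, so the JSON parse fails and entity is null by the time it reaches to(). The entity value has to be extracted per record inside the topology. Below is a minimal sketch of that kind of per-record routing; it assumes a newer Kafka Streams release (2.0 or later), where StreamsBuilder replaces KStreamBuilder and to() accepts a TopicNameExtractor, and it reuses the JSONExtractor helper from the question. The class name EntityRouter is illustrative only.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import java.util.Properties;

public class EntityRouter {
    public static void main(final String[] args) {
        final Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "entity-router-example");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        final StreamsBuilder builder = new StreamsBuilder();

        // Read the input topic with explicit serdes, as in the question.
        final KStream<byte[], String> textLines = builder.stream(
                "postilion-events",
                Consumed.with(Serdes.ByteArray(), Serdes.String()));

        textLines
            // Skip records whose JSON has no "entity" field, so to() never sees a null topic.
            .filter((key, value) -> JSONExtractor.returnJSONValue(value, "entity") != null)
            // Route each record to the topic named by its own "entity" field.
            .to((key, value, recordContext) -> JSONExtractor.returnJSONValue(value, "entity"),
                Produced.with(Serdes.ByteArray(), Serdes.String()));

        final KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.cleanUp();
        streams.start();

        // Close the Streams instance cleanly on SIGTERM.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

One caveat: Kafka topic names may only contain alphanumerics, '.', '_' and '-', so an entity value such as "Transaction Manager" would need to be sanitized (for example by replacing the spaces) before it can be used as a topic name.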

r8xiu3jd1#

This looks like a simple classpath issue. Try adding all the jars that are not part of standard Java to the classpath argument, for example:

java -cp kafka-stream.jar:mavenproject.jar postilionkafka.entityDataLoader

This tends to get overly complicated very quickly, which is one of the reasons we use Maven to manage dependencies. I usually run whatever application I am working on directly from the IDE, which is also an easier way to debug. If I have to launch it outside the IDE, I still try it from the IDE first: IntelliJ logs the execution command, including the required dependencies, which saves me the time of working out those dependencies again and how to pull them from the local Maven repo.
If running from the IDE does not work for you, another approach is to use Maven exec; see this answer on running a project from Maven.
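For instance, the exec-maven-plugin resolves the project's dependencies for you; assuming the project builds with Maven, an invocation along these lines should start the class from the question:

mvn compile exec:java -Dexec.mainClass="postilionkafka.entityDataLoader"

exec:java runs the main class inside the Maven JVM with the full runtime classpath, so the kafka-streams jars no longer have to be listed by hand.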
