Usage and code examples for org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream()

x33g5p2x · reposted 2022-01-23 · category: Other

This article collects Java code examples for the org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream() method. The examples were extracted from selected projects on GitHub, Stack Overflow, Maven, and similar platforms, and should serve as useful references. Method details:
Package: org.apache.spark.streaming.kafka
Class: KafkaUtils
Method: createDirectStream

About KafkaUtils.createDirectStream

createDirectStream creates an input DStream that reads messages directly from the Kafka brokers, without receivers: at each batch interval, Spark queries the brokers for the latest offsets of every topic partition and consumes the resulting offset ranges, giving a one-to-one mapping between Kafka partitions and RDD partitions. The method belongs to the old spark-streaming-kafka (Kafka 0.8) integration, which was deprecated as of Spark 2.3.0.
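Before the project snippets below, here is a minimal plain-Java sketch of the two collections that every overload of createDirectStream() takes besides the streaming context and the key/value class and decoder tokens: the Kafka parameter map and the topic set. The broker address and topic name are placeholder values.

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

public class DirectStreamParams {
    public static void main(String[] args) {
        // Kafka parameters: the direct stream talks to the brokers directly,
        // so it needs "metadata.broker.list" (not a ZooKeeper quorum).
        // "localhost:9092" and "mytopic" are placeholder values.
        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("metadata.broker.list", "localhost:9092");
        // Where to start when no offsets are stored: "smallest" = earliest.
        kafkaParams.put("auto.offset.reset", "smallest");

        // The set of topics to subscribe to.
        Set<String> topics = Collections.singleton("mytopic");

        System.out.println(kafkaParams.size() + " kafka params, "
            + topics.size() + " topic: " + topics.iterator().next());
    }
}
```

These two objects are then passed as the last two arguments of createDirectStream(), as in the examples below.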

Code examples

Code example source: origin: io.zipkin.sparkstreaming/zipkin-sparkstreaming-stream-kafka

@Override public JavaDStream<byte[]> create(JavaStreamingContext jsc) {
 return KafkaUtils.createDirectStream(
   jsc,
   byte[].class,
   byte[].class,
   DefaultDecoder.class,
   DefaultDecoder.class,
   kafkaParams(),
   Collections.singleton(topic()))
   .map(m -> m._2); // get value
}

Code example source: origin: aseigneurin/kafka-sandbox

public static void main(String[] args) {
  SparkConf conf = new SparkConf()
      .setAppName("kafka-sandbox")
      .setMaster("local[*]");
  JavaSparkContext sc = new JavaSparkContext(conf);
  JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(2000));
  Set<String> topics = Collections.singleton("mytopic");
  Map<String, String> kafkaParams = new HashMap<>();
  kafkaParams.put("metadata.broker.list", "localhost:9092");
  JavaPairInputDStream<String, String> directKafkaStream = KafkaUtils.createDirectStream(ssc,
      String.class, String.class, StringDecoder.class, StringDecoder.class, kafkaParams, topics);
  directKafkaStream.foreachRDD(rdd -> {
    System.out.println("--- New RDD with " + rdd.partitions().size()
        + " partitions and " + rdd.count() + " records");
    rdd.foreach(record -> System.out.println(record._2));
  });
  ssc.start();
  ssc.awaitTermination();
}

Code example source: origin: aseigneurin/kafka-sandbox

public static void main(String[] args) {
  SparkConf conf = new SparkConf()
      .setAppName("kafka-sandbox")
      .setMaster("local[*]");
  JavaSparkContext sc = new JavaSparkContext(conf);
  JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(2000));
  Set<String> topics = Collections.singleton("mytopic");
  Map<String, String> kafkaParams = new HashMap<>();
  kafkaParams.put("metadata.broker.list", "localhost:9092");
  JavaPairInputDStream<String, byte[]> directKafkaStream = KafkaUtils.createDirectStream(ssc,
      String.class, byte[].class, StringDecoder.class, DefaultDecoder.class, kafkaParams, topics);
  directKafkaStream
      // recordInjection (an Avro Injection<GenericRecord, byte[]> defined
      // elsewhere in the original project) decodes the value bytes back
      // into an Avro record
      .map(message -> recordInjection.invert(message._2).get())
      .foreachRDD(rdd -> {
        rdd.foreach(record -> {
          System.out.println("str1= " + record.get("str1")
              + ", str2= " + record.get("str2")
              + ", int1=" + record.get("int1"));
        });
      });
  ssc.start();
  ssc.awaitTermination();
}

Code example source: origin: baghelamit/iot-traffic-monitor

topicsSet.add(topic);
// The original listing is truncated here; the remaining arguments (in
// particular the IoTData value-decoder class name) are reconstructed from
// the standard call pattern and may differ from the actual project code.
JavaPairInputDStream<String, IoTData> directKafkaStream = KafkaUtils.createDirectStream(
     jssc,
     String.class,
     IoTData.class,
     StringDecoder.class,
     IoTDataDecoder.class,
     kafkaParams,
     topicsSet);
Code example source: origin: org.apache.spark/spark-streaming-kafka-0-8 (the same snippet appears verbatim in the spark-streaming-kafka, spark-streaming-kafka_2.10, spark-streaming-kafka_2.11, and spark-streaming-kafka-0-8_2.11 artifacts)

kafkaParams.put("auto.offset.reset", "smallest");
// The original listing is truncated here; the trailing decoder, params and
// topic-set arguments are reconstructed from the standard call signature,
// with topic1/topic2 standing in for topic names defined earlier in the test.
JavaDStream<String> stream1 = KafkaUtils.createDirectStream(
    ssc, String.class, String.class, StringDecoder.class, StringDecoder.class,
    kafkaParams, Collections.singleton(topic1)).map(t -> t._2);
JavaDStream<String> stream2 = KafkaUtils.createDirectStream(
    ssc, String.class, String.class, StringDecoder.class, StringDecoder.class,
    kafkaParams, Collections.singleton(topic2)).map(t -> t._2);
Code example source: origin: sectong/SparkToParquet

JavaPairInputDStream<String, String> messages = KafkaUtils.createDirectStream(jssc, String.class, String.class,
    StringDecoder.class, StringDecoder.class, kafkaParams, topicsSet);

