ConstantInputDStream.print() does nothing

wf82jlnq · posted 2021-05-27 · in Spark

I am trying to get a simple DStream to print, without success. See the code below. I am running this in a Databricks notebook on Azure.

import org.apache.spark.streaming.{ StreamingContext, Seconds }
val ssc = new StreamingContext(sc, batchDuration = Seconds(5))

ssc.checkpoint(".")

val rdd = sc.parallelize(0 to 3)
import org.apache.spark.streaming.dstream.ConstantInputDStream
val stream = new ConstantInputDStream(ssc, rdd)

println("start")

stream.print()

ssc.start()

The output is:

start

warning: there was one feature warning; re-run with -feature for details
import org.apache.spark.streaming.{StreamingContext, Seconds}
ssc: org.apache.spark.streaming.StreamingContext = org.apache.spark.streaming.StreamingContext@4d01c7b1
rdd: org.apache.spark.rdd.RDD[Int] = MapPartitionsRDD[1] at map at command-3696830887613521:7
import org.apache.spark.streaming.dstream.ConstantInputDStream
stream: org.apache.spark.streaming.dstream.ConstantInputDStream[Int] = org.apache.spark.streaming.dstream.ConstantInputDStream@12b9db22

I was expecting to see 0, 1, 2 appear in one way or another.
I have also tried adding

ssc.awaitTermination()

but it never terminates.
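For reference, here is a variant I might try as a sketch. It assumes that print() writes to the driver's stdout (which in Databricks may end up in the driver logs rather than the cell output), and that awaitTermination blocks until the streaming context is explicitly stopped, which is why the cell never returns. The timeout value below is just for illustration.

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.ConstantInputDStream

val ssc = new StreamingContext(sc, Seconds(5))
val rdd = sc.parallelize(0 to 3)
val stream = new ConstantInputDStream(ssc, rdd)

// Collect each (tiny) batch back to the driver and print it there,
// so the values appear in the same process that runs the notebook cell.
stream.foreachRDD { batch =>
  println(batch.collect().mkString(", "))
}

ssc.start()
// Wait roughly four batch intervals instead of blocking forever, then stop
// the streaming context while keeping the underlying SparkContext alive.
ssc.awaitTerminationOrTimeout(20000)
ssc.stop(stopSparkContext = false)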

No answers yet.

