Scala: "could not find implicit" when creating a Flink DataStream from a collection of case classes

0g0grzrc · published 2021-06-24 in Flink

I want to write a Flink Scala API "hello world", but I cannot reproduce the getting-started snippet shown here: https://www.slideshare.net/dataartisans/apache-flink-datastream-api-basics/20

My attempt at copying it is:

import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}

case class Order(user: String, product: String, amount: Double, proctime: Int, rowtime: Int)

val env = StreamExecutionEnvironment.getExecutionEnvironment

def basic() = {
  val seq = (1 to 50).map { i =>
    Order("User" + (i % 10), "Product" + (i % 20),
      2.0 * (4 * i + 0.5 * i * i - 0.1 * i * i * i), i * 10, i * 3)
  }
  val ds: DataStream[Order] = env.fromElements(seq: _*)
}

However, the implicit is not resolved, giving:

Error:(21, 30) could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[com.blazedb.spark.flinkdemo.Order]

What needs to change here?


enyaitl3 · answer 1

Simply add

import org.apache.flink.streaming.api.scala._

in order to bring the implicit TypeInformation[T] derivation into scope: https://github.com/apache/flink/blob/master/flink-scala/src/main/scala/org/apache/flink/api/scala/package.scala#l49
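A minimal sketch of the corrected program, assuming flink-streaming-scala is on the classpath (the object name OrdersDemo is hypothetical). The wildcard import provides the implicit createTypeInformation macro, which derives the TypeInformation[Order] evidence that fromElements requires; the case class is kept top-level so the macro can see it as an ordinary case class:

```scala
// The wildcard import brings in StreamExecutionEnvironment, DataStream,
// and the implicit createTypeInformation[T] macro for case classes.
import org.apache.flink.streaming.api.scala._

// Top-level case class, visible to the TypeInformation derivation.
case class Order(user: String, product: String, amount: Double, proctime: Int, rowtime: Int)

object OrdersDemo {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val seq = (1 to 50).map { i =>
      Order("User" + (i % 10), "Product" + (i % 20),
        2.0 * (4 * i + 0.5 * i * i - 0.1 * i * i * i), i * 10, i * 3)
    }

    // Compiles now: TypeInformation[Order] is supplied implicitly.
    val ds: DataStream[Order] = env.fromElements(seq: _*)
    ds.print()

    env.execute("orders-demo")
  }
}
```

Importing only the class names (as in the question's snippet) is not enough, because the implicit lives in the org.apache.flink.streaming.api.scala package object and is only pulled in by the wildcard import.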
