FlinkKafkaConsumer011

mrwjdhj3 posted on 2021-06-21 in Flink

I am running a Flink job on a cluster. The job runs fine in my development (local) environment, but when I deploy it to the cluster with the following command:

./bin/flink run -c org.example.CointegrationOfPairs ../coint.jar

it fails with the error below:

java.lang.NoClassDefFoundError: org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumer011
    at org.example.CointegrationOfPairs$.main(CointegrationOfPairs.scala:38)
    at org.example.CointegrationOfPairs.main(CointegrationOfPairs.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:420)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:404)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:785)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:279)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:214)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1025)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1101)
    at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1101)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

I have also added the required dependencies:

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-ml" % flinkVersion % "provided"
)

I am building the jar with sbt clean assembly.


xhv8bpkk1#

Connectors are not included in Flink's binary distribution in order to avoid version conflicts between their dependencies and user code. Consequently, the corresponding classes are not on the classpath of the Flink processes by default.
There are two ways to fix this:
1. Do not declare the flink-connector-kafka dependency as provided. Instead, build a fat jar that includes the connector and its dependencies, so the connector ships with the application (see the sketch below). This is the preferred approach.
2. Add the flink-connector-kafka jar to the ./lib folder of your Flink setup. It will then be distributed to the Flink processes and included in their classpath.
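For the first option, the only change needed in the build shown in the question is to drop the provided scope from the Kafka connector so that sbt-assembly bundles it into the fat jar. A minimal sketch, assuming the same flinkVersion variable and build.sbt layout as above:

val flinkDependencies = Seq(
  // Core Flink APIs are supplied by the cluster at runtime, so they stay "provided"
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  // The Kafka connector is not part of the Flink distribution:
  // no "provided" scope here, so sbt-assembly packs it into the fat jar
  "org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion,
  "org.apache.flink" %% "flink-ml" % flinkVersion % "provided"
)

After rebuilding with sbt clean assembly, FlinkKafkaConsumer011 is packaged inside coint.jar, and ./bin/flink run should no longer throw the NoClassDefFoundError.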
