I'm running into a problem when pushing data from a Spark DataFrame to Kafka.
Let me explain my scenario with an example. I want to load data into Spark and send the Spark output to Kafka. I am using Gradle 3.5, Spark 2.3.1, and Kafka 1.0.1.
Here is my build.gradle:
buildscript {
    ext {
        springBootVersion = '1.5.15.RELEASE'
    }
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath("org.springframework.boot:spring-boot-gradle-plugin:${springBootVersion}")
    }
}

apply plugin: 'scala'
apply plugin: 'java'
apply plugin: 'eclipse'
apply plugin: 'org.springframework.boot'

group = 'com.sample'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = 1.8

repositories {
    mavenCentral()
}

dependencies {
    compile('org.springframework.boot:spring-boot-starter')
    compile('org.apache.spark:spark-core_2.11:2.3.1')
    compile('org.apache.spark:spark-sql_2.11:2.3.1')
    compile('org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.1')
    compile('org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1')
    testCompile('org.springframework.boot:spring-boot-starter-test')
}
And here is my code:
package com.sample

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions._

object SparkConnection {
  case class emp(empid: Integer, empname: String, empsal: Float)

  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("Spark Connection").setMaster("local[*]")
    val sc = new SparkContext(sparkConf)
    val dataRdd = sc.textFile("/home/sample/data/sample.txt")
    val mapRdd = dataRdd.map(row => row.split(","))
    val empRdd = mapRdd.map(row => emp(row(0).toInt, row(1), row(2).toFloat))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._
    val empDF = empRdd.toDF()
    empDF
      .select(to_json(struct(empDF.columns.map(column): _*)).alias("value"))
      .write.format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("topic", "my-kafka-topic")
      .save()
  }
}
Please ignore the Spring Boot framework APIs in build.gradle.
After building the package with Gradle, I can see all the dependency classes mentioned in build.gradle.
But when I run the code with spark-submit:
spark-submit --class com.sample.SparkConnection spark_kafka_integration.jar
I get the following error:
Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: kafka. Please find packages at http://spark.apache.org/third-party-projects.html
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:241)
at com.iniste.SparkConnection$.main(SparkConnection.scala:29)
at com.iniste.SparkConnection.main(SparkConnection.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: kafka.DefaultSource
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23$$anonfun$apply$15.apply(DataSource.scala:618)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$23.apply(DataSource.scala:618)
at scala.util.Try.orElse(Try.scala:84)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:618)
... 13 more
2018-09-05 17:41:17 INFO SparkContext:54 - Invoking stop() from shutdown hook
2018-09-05 17:41:17 INFO AbstractConnector:318 - Stopped Spark@51684e4a{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-09-05 17:41:17 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-09-05 17:41:17 INFO MemoryStore:54 - MemoryStore cleared
2018-09-05 17:41:17 INFO BlockManager:54 - BlockManager stopped
2018-09-05 17:41:17 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2018-09-05 17:41:17 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-09-05 17:41:17 INFO SparkContext:54 - Successfully stopped SparkContext
2018-09-05 17:41:17 INFO ShutdownHookManager:54 - Shutdown hook called
2018-09-05 17:41:17 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-bd4cb4ef-3883-4c26-a93f-f355b13ef306
2018-09-05 17:41:17 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-156dfdbd-cff4-4c70-943f-35ef403a01ed
Please help me get rid of this error. Some blogs also suggest using the --packages option of spark-submit, but I am behind a proxy that prevents me from downloading those packages. And I don't understand why spark-submit can't pick up the jars it already has. Please point out where I went wrong.
1 Answer
As with any Spark application, spark-submit is used to launch your application. spark-sql-kafka-0-10_2.11 and its dependencies can be added directly to spark-submit using --packages, as shown below.
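A sketch of the full command, reusing the Spark 2.3.1 / Scala 2.11 coordinate from the build.gradle above and the jar name from the question (note that --packages needs network access to Maven Central unless a local mirror is configured):

spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1 --class com.sample.SparkConnection spark_kafka_integration.jar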
This package can be found here.
However, per cricket_007's suggestion, I have added the Shadow plugin to your build.gradle, so the new build.gradle might look similar to the one below.
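A sketch of the modified build.gradle, under stated assumptions: the Shadow plugin id com.github.johnrengelman.shadow is real, but the 2.0.4 version (and pulling it from jcenter) is an assumption chosen for compatibility with Gradle 3.5. The mergeServiceFiles() call is the important part: the kafka source is registered via META-INF/services/org.apache.spark.sql.sources.DataSourceRegister, and a fat jar built without merging service files loses that entry, which is exactly what produces "Failed to find data source: kafka".

buildscript {
    ext {
        springBootVersion = '1.5.15.RELEASE'
    }
    repositories {
        mavenCentral()
        jcenter() // assumption: where Shadow plugin releases of this era were published
    }
    dependencies {
        classpath("org.springframework.boot:spring-boot-gradle-plugin:${springBootVersion}")
        // assumption: Shadow 2.0.4; use any version compatible with Gradle 3.5
        classpath('com.github.jengelman.gradle.plugins:shadow:2.0.4')
    }
}

apply plugin: 'scala'
apply plugin: 'java'
apply plugin: 'eclipse'
apply plugin: 'org.springframework.boot'
apply plugin: 'com.github.johnrengelman.shadow'

group = 'com.sample'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = 1.8

repositories {
    mavenCentral()
}

dependencies {
    compile('org.springframework.boot:spring-boot-starter')
    compile('org.apache.spark:spark-core_2.11:2.3.1')
    compile('org.apache.spark:spark-sql_2.11:2.3.1')
    compile('org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.1')
    compile('org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1')
    testCompile('org.springframework.boot:spring-boot-starter-test')
}

shadowJar {
    zip64 true // a fat jar with all Spark dependencies can exceed the plain zip entry limit
    // merge META-INF/services files so Spark's DataSourceRegister entry
    // for the kafka source survives in the shaded jar
    mergeServiceFiles()
}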
So, to create the jar, the command is gradle shadowJar.
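A possible end-to-end sequence follows; the shaded jar name under build/libs follows Shadow's default <name>-<version>-all.jar pattern, so the exact path below is an assumption:

gradle shadowJar
spark-submit --class com.sample.SparkConnection build/libs/spark_kafka_integration-0.0.1-SNAPSHOT-all.jar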