How do I fix the error "value toDS is not a member of org.apache.spark.rdd.RDD"?

jei2mxaa · asked on 2021-05-27 · in Spark
Follow (0) | Answers (1) | Views (457)

I wrote this code to send streaming data from Twitter to Elasticsearch. I added all the necessary dependencies, but I am getting errors on the two calls `toDS` and `saveToEs`. Please help me fix this. Here is my code:

```scala
package org.lansrod.visualization

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.twitter.TwitterUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}
import twitter4j.auth.OAuthAuthorization
import twitter4j.conf.ConfigurationBuilder

object twitter {

  def main(args: Array[String]) {
    val conf = new SparkConf().setMaster("local[*]").setAppName("twitter")
    val ssc = new StreamingContext(conf, Seconds(5)) // Spark streaming context

    val ACCESS_TOKEN = "my access token"
    val ACCESS_SECRET = "my access secret"
    val CONSUMER_KEY = "my consumer key"
    val CONSUMER_SECRET = "my consumer secret"

    val cb = new ConfigurationBuilder
    cb.setDebugEnabled(true)
      .setOAuthConsumerKey(CONSUMER_KEY)
      .setOAuthConsumerSecret(CONSUMER_SECRET)
      .setOAuthAccessToken(ACCESS_TOKEN)
      .setOAuthAccessTokenSecret(ACCESS_SECRET)

    val auth = new OAuthAuthorization(cb.build) // get the authorization
    val tweets = TwitterUtils.createStream(ssc, Some(auth))

    tweets.foreachRDD { rdd =>
      val spark = SparkSession.builder.config(rdd.sparkContext.getConf).getOrCreate()
      import spark.implicits._
      val caseClassDS = rdd.toDS()       // error: value toDS is not a member of RDD
      caseClassDS.saveToEs("spark/docs") // error: value saveToEs is not a member
    }

    // start() and awaitTermination() must come after the output operations
    // are declared, otherwise foreachRDD is never registered
    ssc.start()
    ssc.awaitTermination()
  }
}
```
My build.sbt is as follows:

```scala
scalaVersion := "2.11.0"

val sparkVersion = "2.4.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.elasticsearch" %% "elasticsearch-spark-20" % "7.6.1",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "org.apache.spark" % "spark-streaming-twitter_2.11" % "1.6.1" exclude("org.twitter4j", "twitter4j"),
  "org.twitter4j" % "twitter4j-core" % "2.2.0",
  "org.twitter4j" % "twitter4j-stream" % "2.2.0",
  "org.apache.spark" %% "spark-mllib" % sparkVersion
)
```
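As an aside on the second error mentioned above: `saveToEs` is not a method on `Dataset` itself; the elasticsearch-spark connector adds it through an implicit conversion, so it only resolves after importing the connector's package object. A minimal sketch under that assumption (the host/port values below are placeholders, not from the original post):

```scala
// saveToEs is added to Datasets/DataFrames by an implicit conversion from the
// elasticsearch-spark connector; without one of these imports it won't compile:
//   import org.elasticsearch.spark.sql._   // for Datasets/DataFrames
//   import org.elasticsearch.spark._       // for plain RDDs

// Hypothetical connection settings, passed via saveToEs's Map overload:
val esConf = Map(
  "es.nodes" -> "localhost", // placeholder host
  "es.port"  -> "9200"       // placeholder port
)
// ds.saveToEs("spark/docs", esConf)
```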

lsmd5eda1#

For Spark versions < 2.x, `toDS` is available through `sqlContext.implicits._`:

```scala
import sqlContext.implicits._
val myrdd = testRDD.toDS()
```

For Spark versions >= 2.x:

```scala
val spark: SparkSession = SparkSession.builder.config(conf).getOrCreate
import spark.implicits._
val myrdd = testRDD.toDS()
```
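One caveat to add: even with `spark.implicits._` in scope, `toDS()` still needs an implicit `Encoder` for the RDD's element type, and the encoders that import provides only cover primitives, tuples, and case classes (Products). `twitter4j.Status` is none of those, so a common fix is to project each status into a small case class first. A sketch under that assumption (`Tweet` and its field choices are made up for illustration):

```scala
// Encoders exist for case classes, so project Status into one before toDS():
case class Tweet(user: String, text: String, lang: String)

// Inside foreachRDD, assuming rdd: RDD[twitter4j.Status]:
//   val spark = SparkSession.builder.config(rdd.sparkContext.getConf).getOrCreate()
//   import spark.implicits._
//   val ds = rdd.map(s => Tweet(s.getUser.getScreenName, s.getText, s.getLang)).toDS()
//   ds.saveToEs("spark/docs")  // requires import org.elasticsearch.spark.sql._

val sample = Tweet("alice", "hello world", "en") // plain-Scala check of the shape
```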
