sbt error: object spark is not a member of package org.apache

o7jaxewo posted on 2021-05-27 in Spark

I installed sbt-1.3.4.msi. When I try to build a sample SparkPi.scala application, I get the following errors:

C:\myapps\sbt\sparksample>sbt
[info] Loading project definition from C:\myapps\sbt\sparksample\project
[info] Compiling 1 Scala source to C:\myapps\sbt\sparksample\project\target\scala-2.12\sbt-1.0\classes ...
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:3:19: object spark is not a member of package org.apache
[error] import org.apache.spark._
[error]                   ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:8:20: not found: type SparkConf
[error]     val conf = new SparkConf().setAppName("Spark Pi")
[error]                    ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:9:21: not found: type SparkContext
[error]     val spark = new SparkContext(conf)
[error]                     ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed

The SparkPi.scala file is located at C:\myapps\sbt\sparksample\project\src\main\scala (as the error messages above show).
What am I missing?
The C:\myapps\sbt\sparksample\sparksample.sbt file is as follows:

name := "Spark Sample"

version := "1.0"

scalaVersion := "2.12.10"

libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"
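
For reference, the %% operator appends the Scala binary version to the artifact name, so with scalaVersion := "2.12.10" this dependency resolves to the spark-core_2.12 artifact, which is published for Spark 3.0.0. Written with a plain %, the equivalent line would be:

libraryDependencies += "org.apache.spark" % "spark-core_2.12" % "3.0.0"

so the dependency declaration itself is consistent with the Scala version.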

Answer 1 (rn0zuynd):

The C:\myapps\sbt\sparksample\project\src\main\scala directory holds the SparkPi.scala file, and that is exactly the problem. You have placed your Scala sources under the project directory, which sbt reserves for its own build definition (the meta-build), not for the sources of the project being built.
Move SparkPi.scala and any other Scala source files to C:\myapps\sbt\sparksample\src\main\scala, as sketched below.
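
The expected sbt layout keeps the build definition and the application sources apart; only the second src tree below is compiled as your application:

C:\myapps\sbt\sparksample\sparksample.sbt        (build definition)
C:\myapps\sbt\sparksample\project\               (sbt's own meta-build; no application sources here)
C:\myapps\sbt\sparksample\src\main\scala\SparkPi.scala

Once the file is moved, here is a minimal SparkPi.scala sketch that compiles against the build file above (the classic Monte Carlo Pi estimate; the local[*] master is an assumption for running locally from sbt and would normally be omitted when launching via spark-submit):

import scala.math.random
import org.apache.spark._

object SparkPi {
  def main(args: Array[String]): Unit = {
    // local[*] is assumed here for local testing; spark-submit supplies its own master.
    val conf = new SparkConf().setAppName("Spark Pi").setMaster("local[*]")
    val spark = new SparkContext(conf)
    val n = 100000
    // Sample points in the unit square; the fraction landing inside the
    // unit circle approximates pi / 4.
    val count = spark.parallelize(1 to n).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * count / n}")
    spark.stop()
  }
}

After the move, running sbt compile from C:\myapps\sbt\sparksample should succeed.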
