sbt gives an error when running Spark hello world code?
I got the same error. Here is my build.sbt:
name := "Simple Project"
version := "1.0"
scalaVersion := "2.12.3"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
Just change scalaVersion to 2.11.8 or lower and it works. Spark 2.2.0 is built against Scala 2.11, so with scalaVersion set to 2.12.3 the %% operator looks for spark-sql_2.12, which does not exist.
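For reference, here is a minimal sketch of the fixed project. The corrected build.sbt follows straight from the answer above; the SimpleApp object name and the local[*] master in the accompanying hello world are my own illustrative assumptions, not from the question:

build.sbt:

name := "Simple Project"
version := "1.0"
// must be a 2.11.x release, matching the Scala version Spark 2.2.0 is built for
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"

src/main/scala/SimpleApp.scala:

import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // SparkSession is the Spark 2.x entry point; it comes from the spark-sql artifact
    val spark = SparkSession.builder
      .appName("Simple Application")
      .master("local[*]")
      .getOrCreate()
    println(s"Running Spark ${spark.version}")
    spark.stop()
  }
}

Run it with sbt run: if the dependency resolves, the Spark version prints and the session shuts down cleanly.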
You have an error in your build.sbt file: if you change %% to a single %, you must also spell out the Scala binary version in the artifact name:
name := "Mpa"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1"
%% asks sbt to append the current Scala binary version to the artifact name. With a single % you can pin the artifact explicitly, e.g. spark-core_2.11, to get the issue solved.
// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
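Equivalently, since scalaVersion is set to 2.11.8 here, %% appends the suffix for you and resolves the same artifact:

// identical to the % line above: %% appends _2.11 taken from scalaVersion
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"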
Hope this helps!
I got the same error and resolved it with the steps below. Basically, the Scala version in the filename of the bundled Spark jars did not match the scalaVersion in my sbt configuration.
- Check the filename of the Spark core jar in $SPARK_HOME/jars (here it is spark-core_2.11-2.1.1.jar).
- Install Scala 2.11.11.
- Edit build.sbt to set scalaVersion := "2.11.11".
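Putting those steps together, a sketch of the resulting build.sbt (the project name and version are assumed):

name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.11"  // matches the _2.11 suffix of spark-core_2.11-2.1.1.jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"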