ClassNotFoundException scala.runtime.LambdaDeserialize when spark-submit
I had a similar issue while following the instructions provided at https://spark.apache.org/docs/2.4.3/quick-start.html
My setup details: Spark version 2.4.3, Scala version 2.12.8
However, when I changed my sbt file to the configuration below, everything worked fine (both compilation and running the application jar).
name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.11"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3"
It looks like the pre-built Spark 2.4.3 distribution is built against Scala 2.11, so the application jar has to be compiled with Scala 2.11 as well. While compiling the sample project, sbt downloaded the Scala 2.11 library from "https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.11.11"
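For context, this build file compiles the SimpleApp from the quick-start guide, which looks roughly like the sketch below (paths and names follow the guide; adjust logFile to any text file on your machine). The lambdas passed to filter are what end up needing scala/runtime/LambdaDeserialize when the jar is built with Scala 2.12 but run on a Scala 2.11 Spark:
import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // Point this at any text file; the quick-start guide uses the Spark README.
    val logFile = "YOUR_SPARK_HOME/README.md"
    val spark = SparkSession.builder.appName("Simple Application").getOrCreate()
    val logData = spark.read.textFile(logFile).cache()
    // These lambdas are serialized by Spark; compiled with Scala 2.12 they
    // deserialize via scala/runtime/LambdaDeserialize, which is missing from
    // a Scala 2.11 runtime.
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    spark.stop()
  }
}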
As @Alexey said, changing the Scala version to 2.11 fixed the problem.
build.sbt
name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.11"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
Note that the Scala version MUST MATCH the one Spark was built against. Look at the artifactId: spark-core_2.11 means it is compatible with Scala 2.11 only (no backward or forward compatibility).
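A small sketch of the same dependency written with %% instead of spelling out the suffix: sbt then appends the Scala binary version from scalaVersion automatically, so the artifact name cannot drift out of sync with the build's Scala version.
scalaVersion := "2.11.11"
// %% makes sbt resolve this to spark-core_2.11, taken from scalaVersion above.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"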
The following are the build.sbt entries for the Spark 2.4.1 release sample shown in the Spark/Scala online guide:
name := "SimpleApp"
version := "1.0"
scalaVersion := "2.12.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.1"
Though everything works fine inside the IntelliJ IDE, the application still fails with the following exception,
Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
after creating the package with the 'sbt package' command and running spark-submit from the command line as follows:
spark-submit -v --class SimpleApp --master local[*] target\scala-2.12\simpleapp_2.12-1.0.jar
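One way to check which Scala version the Spark installation behind spark-submit was built with is the snippet below, run inside spark-shell (this assumes spark-shell and spark-submit come from the same installation). If it prints a 2.11.x version, a jar built with scalaVersion 2.12.x will fail exactly like this.
// Run inside spark-shell from the same Spark installation used by spark-submit.
// Prints something like "version 2.11.12"; a 2.11.x result means the
// application should be compiled with a matching scalaVersion := "2.11.x".
println(scala.util.Properties.versionString)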