Spark fails to start in local mode when disconnected from the network [possible bug in Spark's IPv6 handling?]
OK, I seem to be able to work around it by setting the driver host directly on the command line with --conf spark.driver.host=localhost.
So I run:
./bin/spark-shell --conf spark.driver.host=localhost
Still, if there is a better solution, please let me know.
[UPDATE]
Jacek Laskowski confirmed this is probably the only available solution for now.
For those who are working with Spark through sbt and hitting the same issue: just add .set("spark.driver.host", "localhost") to your SparkConf(), so that the initialisation of the Spark context looks like this:
import org.apache.spark.{SparkConf, SparkContext}

val conf =
  new SparkConf()
    .setAppName("temp1")
    .setMaster("local")
    .set("spark.driver.host", "localhost")

val sc =
  SparkContext.getOrCreate(conf)
This configuration must be set before any other call to SparkContext.getOrCreate: if a context already exists, getOrCreate returns it and silently ignores the new configuration.
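If you are on Spark 2.x or later, the same workaround can be expressed through the SparkSession builder instead of constructing a SparkConf by hand. This is just a sketch of the equivalent setup (the app name "temp1" is carried over from the example above); the same ordering caveat applies, since the builder's getOrCreate also reuses any existing session.

```scala
import org.apache.spark.sql.SparkSession

// Equivalent workaround via the SparkSession API (Spark 2.x+).
// config() must be called before the first getOrCreate, otherwise
// an already-existing session is returned and the setting is ignored.
val spark =
  SparkSession
    .builder()
    .appName("temp1")
    .master("local")
    .config("spark.driver.host", "localhost")
    .getOrCreate()

// The underlying SparkContext is still available if you need it.
val sc = spark.sparkContext
```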