How to set the heap size for Spark within the Eclipse environment?

In Eclipse, go to Run > Run Configurations... > Arguments > VM arguments and set the maximum heap size, e.g. -Xmx512m.
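Depending on the Spark version, a local-mode driver may refuse to start if the heap is too small, with an error along the lines of "System memory ... must be at least 471859200". If you hit that, simply raise the value in the VM arguments field, for example:

-Xmx1g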


I had this issue as well and this is how I solved it. Thought it might be helpful.

val conf: SparkConf = new SparkConf().setMaster("local[4]").setAppName("TestJsonReader").set("spark.driver.host", "localhost")
conf.set("spark.testing.memory", "2147480000")

It worked fine for me once I modified the script with conf.set("spark.testing.memory", "2147480000"). The value is in bytes (2147480000 is just under 2 GB); spark.testing.memory overrides how much memory Spark thinks is available, so the startup check that demands at least 471859200 bytes passes.

Step-1: Complete code below:

import scala.math.random
import org.apache.spark._

object SparkPi {
  def main(args: Array[String]) {
    val conf: SparkConf = new SparkConf().setMaster("local").setAppName("Spark Pi").set("spark.driver.host", "localhost")

     conf.set("spark.testing.memory", "2147480000")         // if you face any memory issues


    val spark = new SparkContext(conf)
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow

    val count = spark.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)

    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}

Step-2: Run it as a “Scala Application” in Eclipse.

Step-3: Creating the JAR file and execution:
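If the project is built with sbt (an assumption; any build tool that produces a JAR containing the SparkPi class works), packaging is one command, and the JAR it drops under target/ is what you pass to spark-submit, renamed or referenced as SparkPi.jar below:

sbt package

Then submit it: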

bin/spark-submit --class SparkPi --master local SparkPi.jar
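If the driver still runs out of memory when submitted this way, spark-submit can also size the driver heap directly with --driver-memory (2g here is only an illustrative value):

bin/spark-submit --class SparkPi --master local --driver-memory 2g SparkPi.jar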