Multiple SparkContext detected in the same JVM
Are you sure you need the JavaSparkContext as a separate context? The previous question you refer to doesn't say so. If you already have a SparkContext, you can create a new JavaSparkContext from it rather than creating a separate context:
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SQLContext;

SparkConf conf = new SparkConf();
conf.setAppName("Spark MultipleContext Test");
// only needed if you really do want more than one context in the same JVM
conf.set("spark.driver.allowMultipleContexts", "true");
conf.setMaster("local");
SparkContext sc = new SparkContext(conf);
SQLContext sqlContext = new SQLContext(sc);

// Create a Java context that wraps the same Scala SparkContext under the hood
JavaSparkContext jsc = JavaSparkContext.fromSparkContext(sc);
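For illustration, a minimal sketch of using the wrapped context afterwards could look like the following; the sample data and variable names are just placeholders, and jsc is the JavaSparkContext created above:

import java.util.Arrays;
import org.apache.spark.api.java.JavaRDD;

// Hypothetical usage: parallelize a small list through the Java wrapper
JavaRDD<Integer> numbers = jsc.parallelize(Arrays.asList(1, 2, 3, 4, 5));
System.out.println("count = " + numbers.count());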
The SparkContext is already running by default, so you have to stop that context first with sc.stop(); after that you can continue without any problem.
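As a rough sketch (the variable names here are just for illustration), that would look like:

// If a SparkContext named sc is already running in this JVM, stop it first;
// otherwise constructing another one triggers the multiple-context error
// from the question.
sc.stop();

SparkConf newConf = new SparkConf().setAppName("Fresh Context").setMaster("local");
JavaSparkContext newContext = new JavaSparkContext(newConf);
// ... do your work with newContext ...
newContext.stop();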