Connecting to a remote Spark master - Java / Scala
To bind the master to a host-name/IP, go to your Spark installation's conf directory (e.g. spark-2.0.2-bin-hadoop2.7/conf) and create a spark-env.sh file from the template:
cp spark-env.sh.template spark-env.sh
Open spark-env.sh in an editor and add the following line, using the host-name/IP of your master:
SPARK_MASTER_HOST=ec2-54-245-111-320.compute-1.amazonaws.com
Restart Spark by running stop-all.sh followed by start-all.sh. You can then connect to the remote master with:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SparkSample")
  .master("spark://ec2-54-245-111-320.compute-1.amazonaws.com:7077")
  .getOrCreate()
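Since the heading also covers Java, here is a sketch of the Java equivalent. It assumes the same master URL as above, a Spark SQL dependency on the classpath, and a running standalone master; the class name SparkSample is just illustrative.

```java
import org.apache.spark.sql.SparkSession;

public class SparkSample {
    public static void main(String[] args) {
        // Connect to the standalone master started above (port 7077 is the
        // default for a Spark standalone master).
        SparkSession spark = SparkSession.builder()
            .appName("SparkSample")
            .master("spark://ec2-54-245-111-320.compute-1.amazonaws.com:7077")
            .getOrCreate();

        // ... run your jobs here ...

        spark.stop();
    }
}
```

Note that getOrCreate() will block and eventually fail if the master is unreachable, so make sure port 7077 is open in the EC2 security group.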
For more information on setting environment variables, see http://spark.apache.org/docs/latest/spark-standalone.html#cluster-launch-scripts