How to use SparkContext and SparkSession in PySpark (code example)
Example: get an existing SparkSession or, if there is none, create a new one.
from pyspark.sql import SparkSession

# Gets the existing SparkSession or, if there is none, creates a new one
s1 = SparkSession.builder.config("k1", "v1").getOrCreate()
s1.conf.get("k1") == s1.sparkContext.getConf().get("k1") == "v1"
# True
# getOrCreate() returns the same underlying session; the new option is applied to it
s2 = SparkSession.builder.config("k2", "v2").getOrCreate()
s1.conf.get("k1") == s2.conf.get("k1")
# True
s1.conf.get("k2") == s2.conf.get("k2")
# True