How to access SparkContext from SparkSession instance?
Assuming you have a SparkSession:
from pyspark.sql import SparkSession

spark_session = SparkSession \
    .builder \
    .enableHiveSupport() \
    .getOrCreate()
the SparkContext can then be obtained with

spark_context = spark_session._sc

or, preferably (since _sc is a private attribute), via the public property

spark_context = spark_session.sparkContext
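As a minimal end-to-end sketch (the master and appName values here are just illustrative assumptions), you can grab the context from the session and use it to build an RDD:

from pyspark.sql import SparkSession

# Build (or reuse) a session; master/appName are placeholder choices
spark_session = SparkSession.builder \
    .master("local[*]") \
    .appName("sc-from-session-demo") \
    .getOrCreate()

# Public property that exposes the underlying SparkContext
spark_context = spark_session.sparkContext

# Use the context directly, e.g. to create an RDD
rdd = spark_context.parallelize([1, 2, 3, 4])
print(rdd.sum())  # 10

spark_session.stop()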
You almost got it right; it starts with a lowercase s:
>>> spark.sparkContext
<SparkContext master=local[*] appName=PySparkShell>