Is it possible to get the current spark context settings in PySpark?
Yes: sc.getConf().getAll()
Which uses the method:
SparkConf.getAll()
as accessed by:
SparkContext.getConf()
See it in action:
In [4]: sc.getConf().getAll()
Out[4]:
[(u'spark.master', u'local'),
 (u'spark.rdd.compress', u'True'),
 (u'spark.serializer.objectStreamReset', u'100'),
 (u'spark.app.name', u'PySparkShell')]
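If you only need a single setting, the same SparkConf object also has get(); a small sketch continuing the session above (the fallback default here is just illustrative):
In [5]: sc.getConf().get(u'spark.master', u'local[*]')   # look up one key, with a fallback default
Out[5]: u'local'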
Spark 2.1+
spark.sparkContext.getConf().getAll()
where spark is your SparkSession (this returns a list of (key, value) tuples with all configured settings)
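A minimal end-to-end sketch of the Spark 2.x approach; the app name "conf-demo" and the variable names are placeholders, and in the PySpark shell you can simply reuse the existing spark session:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("conf-demo").getOrCreate()  # or reuse the shell's `spark`
pairs = spark.sparkContext.getConf().getAll()  # list of (key, value) tuples
settings = dict(pairs)                         # turn the pairs into a dict for easy lookup
print(settings.get("spark.app.name"))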
Spark 1.6+ (Scala, e.g. in spark-shell)
sc.getConf.getAll.foreach(println)
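A rough PySpark equivalent of that Scala one-liner, assuming an existing SparkContext named sc:

for key, value in sc.getConf().getAll():   # iterate over every configured (key, value) pair
    print(key, value)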