ValueError: Cannot run multiple SparkContexts at once in spark with pyspark
Your previous SparkContext is still active; only one can run per JVM. Stop it first:
sc.stop()
Alternatively, use getOrCreate, which returns the already-running context instead of raising the error:
sc = SparkContext.getOrCreate(conf=conf)