How to detect Databricks environment programmatically
You can simply check for the existence of an environment variable e.g.:
def isRunningInDatabricks(): Boolean =
  sys.env.contains("DATABRICKS_RUNTIME_VERSION")
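The same environment-variable check can be written in Python. A minimal sketch, assuming Databricks sets DATABRICKS_RUNTIME_VERSION in the driver's environment (as the Scala answer above relies on):

```python
import os

def is_running_in_databricks() -> bool:
    # Databricks Runtime exports this variable on the driver node;
    # it is absent when running Spark locally or on a plain cluster.
    return "DATABRICKS_RUNTIME_VERSION" in os.environ
```

This avoids needing a SparkSession at all, so it works in plain Python scripts too.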
You can also look for a Spark configuration property such as "spark.home", which on Databricks is set to /databricks/spark:

python: sc._conf.get("spark.home")
result: '/databricks/spark'
How about checking whether Spark is running in local mode? On Databricks, spark.master points at the cluster rather than a local URL:
Python:
def isLocal():
    setting = spark.conf.get("spark.master")
    return "local" in setting
Scala:
def isLocal(): Boolean = {
  val setting = spark.conf.get("spark.master")
  setting.contains("local")
}
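The string check itself can be isolated so it is testable without a running SparkSession. A sketch, assuming the master URL is whatever spark.conf.get("spark.master") returns, e.g. "local[*]" for a local session or a "spark://..." URL on a cluster:

```python
def is_local_master(master: str) -> bool:
    # Local sessions use master URLs like "local", "local[4]" or "local[*]";
    # cluster deployments use URLs such as "spark://host:7077" or "yarn".
    return "local" in master
```

Note that this detects local mode rather than Databricks specifically; the environment-variable check above is the more direct test for Databricks.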