How can I set the default Spark logging level?
You can also update the log level programmatically: use the SparkSession to set the context's log level and to get hold of a log4j logger from the JVM, as shown below.
def update_spark_log_level(spark, log_level='info'):
    # Set the log level on the SparkContext (overrides the log4j root level)
    spark.sparkContext.setLogLevel(log_level)
    # Reach into the JVM to get a log4j logger for your own messages
    log4j = spark._jvm.org.apache.log4j
    logger = log4j.LogManager.getLogger("my custom Log Level")
    return logger
use:
logger = update_spark_log_level(spark, 'debug')
logger.info('your log message')
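For completeness, a minimal end-to-end sketch in a standalone PySpark script (the app name here is only illustrative):

    from pyspark.sql import SparkSession

    # Build (or reuse) a SparkSession to pass into the helper above
    spark = SparkSession.builder.appName("logging-demo").getOrCreate()

    # Switch the Spark context to DEBUG and log a message of our own
    logger = update_spark_log_level(spark, 'debug')
    logger.info('your log message')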
feel free to comment if you need more details
http://spark.apache.org/docs/latest/configuration.html#configuring-logging
Configuring Logging
Spark uses log4j for logging. You can configure it by adding a log4j.properties file in the conf directory. One way to start is to copy the existing log4j.properties.template located there.
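As a sketch, assuming the stock log4j 1.x properties format used by Spark's template (the exact template contents vary by Spark version), lowering the root level from INFO to WARN in conf/log4j.properties looks roughly like this:

    # conf/log4j.properties -- reduce console noise by raising the root level
    log4j.rootCategory=WARN, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n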
The following blog post about "How to log in Apache Spark" (https://www.mapr.com/blog/how-log-apache-spark) suggests a way to configure log4j and offers several suggestions, including directing INFO-level logs into a file.
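A rough sketch of that idea, again assuming log4j 1.x and a hypothetical log path of /tmp/spark-app.log (adjust the path and sizes to your environment):

    # Send INFO-level messages to a rolling file instead of the console
    log4j.rootLogger=INFO, file
    log4j.appender.file=org.apache.log4j.RollingFileAppender
    log4j.appender.file.File=/tmp/spark-app.log
    log4j.appender.file.MaxFileSize=10MB
    log4j.appender.file.MaxBackupIndex=5
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n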