How to get rid of "Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties" message?
Even simpler: cd into SPARK_HOME/conf, then mv log4j.properties.template log4j.properties, open log4j.properties, and change every INFO to ERROR. Here SPARK_HOME is the root directory of your Spark installation.
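The steps above can be sketched as shell commands. This is a demo in a scratch directory so it runs anywhere; in practice you would run the same mv and sed inside "$SPARK_HOME/conf", and the template there contains more lines than the single one created here (GNU sed's -i flag is assumed).

```shell
# Demo in a scratch directory; in practice use "$SPARK_HOME/conf" instead.
mkdir -p /tmp/spark-conf-demo && cd /tmp/spark-conf-demo

# Stand-in for Spark's shipped template (the real file has more entries).
printf 'log4j.rootCategory=INFO, console\n' > log4j.properties.template

# Activate the template, then switch every INFO to ERROR.
mv log4j.properties.template log4j.properties
sed -i 's/INFO/ERROR/g' log4j.properties

cat log4j.properties   # log4j.rootCategory=ERROR, console
```

After restarting your Spark shell or application, only ERROR-level messages should reach the console.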
Some may be using hdfs as their Spark storage backend and will find that the logging messages are actually generated by hdfs. To change this, edit the HADOOP_HOME/etc/hadoop/log4j.properties file: simply change hadoop.root.logger=INFO,console to hadoop.root.logger=ERROR,console. Once again, HADOOP_HOME is the root of your Hadoop installation; for me this was /usr/local/hadoop.
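Concretely, the one-line change in HADOOP_HOME/etc/hadoop/log4j.properties looks like this (the surrounding file is left untouched):

```
# HADOOP_HOME/etc/hadoop/log4j.properties
# before:
#   hadoop.root.logger=INFO,console
# after:
hadoop.root.logger=ERROR,console
```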
Okay, so I've figured out a way to do this. Basically, I initially had my own log4j.xml that was being picked up, which is why we were seeing this message. Once I supplied my own log4j.properties file, the message went away.
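For reference, a minimal log4j.properties you could place on your application's classpath might look like the sketch below. The exact contents are illustrative (modeled on the standard log4j 1.x console-appender setup); any valid log4j.properties on the classpath prevents Spark from falling back to its bundled defaults.

```
# e.g. src/main/resources/log4j.properties (illustrative minimal example)
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```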
This also occurs if you put a log4j.properties file under both main/resources and test/resources. In that case, deleting the file from test/resources and using only the one from main/resources fixes the issue.
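A quick sketch of that fix, using a scratch directory to simulate a standard Maven/sbt layout (the /tmp/res-demo path is just for the demo; in a real project you would delete src/test/resources/log4j.properties directly):

```shell
# Simulate a project with the file duplicated under both resource roots.
mkdir -p /tmp/res-demo/src/main/resources /tmp/res-demo/src/test/resources
touch /tmp/res-demo/src/main/resources/log4j.properties \
      /tmp/res-demo/src/test/resources/log4j.properties

# Keep only the main copy so a single configuration wins on the classpath.
rm /tmp/res-demo/src/test/resources/log4j.properties
```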