Why does pyspark fail with "Unable to locate hive jars to connect to metastore. Please set spark.sql.hive.metastore.jars."?
IllegalArgumentException: u'Unable to locate hive jars to connect to metastore. Please set spark.sql.hive.metastore.jars.'
I had the same issue and fixed it by using Java 8. Make sure you install JDK 8 and set the environment variables accordingly.
Do not use Java 11 with Spark / PySpark 2.4; Java 11 is only supported from Spark 3.0 onward.
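As a minimal sketch, you can also force PySpark onto JDK 8 from Python itself before the JVM is launched (the JDK path below is an assumption; adjust it to wherever JDK 8 is installed on your machine):

import os

# Point the Spark launcher at JDK 8 before the JVM starts (assumed path)
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"

from pyspark.sql import SparkSession

# enableHiveSupport() is what makes Spark look for the Hive/metastore jars,
# which is where the original error is raised
spark = (SparkSession.builder
         .appName("hive-metastore-check")
         .enableHiveSupport()
         .getOrCreate())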
If you have several Java versions installed, you'll have to figure out which one Spark is using (I did this by trial and error, starting with
JAVA_HOME="/usr/lib/jvm/java-11-openjdk-amd64"
and ending with
JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64").
On macOS, exporting the JDK 8 home did the trick:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home
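If you would rather verify than guess, you can ask the running driver JVM which Java it actually picked up. This goes through PySpark's private _jvm gateway, so treat it as a debugging aid rather than a supported API:

# Print the Java home and version of the Spark driver JVM (uses the private _jvm handle)
print(spark.sparkContext._jvm.java.lang.System.getProperty("java.home"))
print(spark.sparkContext._jvm.java.lang.System.getProperty("java.version"))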