Spark 1.6 - Failed to locate the winutils binary in the hadoop binary path

Download the bin folder from here: Hadoop Bin. Then, before creating the Spark context, set the Hadoop home directory: `System.setProperty("hadoop.home.dir", "Desktop\\bin")` (note that the backslash must be doubled in a Java/Scala string literal).
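As a minimal sketch (the class name is mine, and the `Desktop\bin` path is the placeholder from the answer above -- substitute wherever you unpacked the downloaded bin folder), setting the property in code looks like this:

```java
// Minimal sketch: set hadoop.home.dir programmatically, before any
// Spark/Hadoop classes are touched. Replace the path with your own.
public class HadoopHomeSetup {
    // Sets the property and returns the stored value, so callers can
    // verify the backslash escaping came out right.
    static String configureHadoopHome() {
        System.setProperty("hadoop.home.dir", "Desktop\\bin");
        return System.getProperty("hadoop.home.dir");
    }

    public static void main(String[] args) {
        // Prints the single-backslash path: Desktop\bin
        System.out.println(configureHadoopHome());
    }
}
```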


You can try setting the HADOOP_HOME environment variable to:

C:\Users\GERAL\Desktop\hadoop-2.6.0

instead of

C:\Users\GERAL\Desktop\hadoop-2.6.0\bin  
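The reason the first form works and the second does not: Hadoop looks for winutils at `<HADOOP_HOME>\bin\winutils.exe`, so if HADOOP_HOME already ends in `\bin` the probe goes one directory too deep. An illustrative sketch (the class and helper names are mine, not a Hadoop API):

```java
import java.io.File;

// Illustration only: mimics how Hadoop builds the winutils path from
// HADOOP_HOME, showing why the variable must point at the install root.
public class WinutilsPath {
    // Returns the file that would be probed for the given HADOOP_HOME value.
    static File winutilsFor(String hadoopHome) {
        return new File(new File(hadoopHome, "bin"), "winutils.exe");
    }

    public static void main(String[] args) {
        // Correct: resolves to ...\hadoop-2.6.0\bin\winutils.exe
        System.out.println(winutilsFor("C:\\Users\\GERAL\\Desktop\\hadoop-2.6.0"));
        // Wrong: resolves to ...\hadoop-2.6.0\bin\bin\winutils.exe, which does not exist
        System.out.println(winutilsFor("C:\\Users\\GERAL\\Desktop\\hadoop-2.6.0\\bin"));
    }
}
```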

Install JDK 1.8, download the Spark binary from Apache Spark, and download winutils from the Git repo.

Set the user environment variables for the JDK, the Spark binary, and winutils:

JAVA_HOME
C:\Program Files\Java\jdk1.8.0_73

HADOOP_HOME
C:\Hadoop

SPARK_HOME
C:\spark-2.3.1-bin-hadoop2.7

PATH
C:\Program Files\Java\jdk1.8.0_73\bin;%HADOOP_HOME%\bin;%SPARK_HOME%\bin;

Open a command prompt and run spark-shell.



If you are running Spark on Windows with Hadoop, you need to ensure your Windows Hadoop installation is set up properly. To run Spark you need winutils.exe and winutils.dll in the bin folder of your Hadoop home directory.

I would ask you to try this first:

1) Download the .dll and .exe files from the bundle at the link below.

https://codeload.github.com/sardetushar/hadooponwindows/zip/master

2) Copy winutils.exe and winutils.dll from that folder to your %HADOOP_HOME%\bin.

3) Set HADOOP_HOME either in your spark-env.sh or at the command line, and add %HADOOP_HOME%\bin to PATH.

Then try running Spark again.
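The copy in step 2 can be sanity-checked with a small sketch (the class and helper names are mine): it lists which of the two required files is still missing from the Hadoop bin folder.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Sketch: after copying the files in step 2, verify that both winutils
// binaries are actually present under <HADOOP_HOME>\bin.
public class WinutilsFiles {
    // Returns the names of the required files not found in binDir.
    static List<String> missing(File binDir) {
        List<String> out = new ArrayList<>();
        for (String name : new String[] {"winutils.exe", "winutils.dll"}) {
            if (!new File(binDir, name).isFile()) out.add(name);
        }
        return out;
    }

    public static void main(String[] args) {
        String home = System.getenv("HADOOP_HOME");
        if (home == null) {
            System.out.println("HADOOP_HOME is not set");
        } else {
            System.out.println("missing from bin: " + missing(new File(home, "bin")));
        }
    }
}
```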

If you need help with the Hadoop installation itself, there is a nice guide here:

http://toodey.com/2015/08/10/hadoop-installation-on-windows-without-cygwin-in-10-mints/

But that can wait; try the first few steps above.