Hadoop installation error, "error : cannot execute hdfs-config.sh."

@BIKI I just ran into the same problem. The Hadoop 3.0.0 release has an unusual directory layout, so setting the Hadoop home directory the way you would expect does not work.

I am on macOS High Sierra (10.13) and installed via Homebrew, but I think you would see something similar on Ubuntu or any other UNIX-like system.

Bottom line: if you want to track down the errors, check HADOOP_HOME in your shell profile (.bash_profile) and the scripts that run when you start Hadoop. In my case, I have an alias in my profile called hstart, and it calls the following scripts:

start-dfs.sh
start-yarn.sh

Both of these scripts call hdfs-config.sh, which cannot be found with the Hadoop home directory set the way it usually is.
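For reference, the alias setup in my .bash_profile looks roughly like this. This is a sketch of my own setup, not something Homebrew creates for you; the hstop alias is just the matching shutdown counterpart, shown as an example:

alias hstart="$HADOOP_HOME/sbin/start-dfs.sh;$HADOOP_HOME/sbin/start-yarn.sh"
alias hstop="$HADOOP_HOME/sbin/stop-yarn.sh;$HADOOP_HOME/sbin/stop-dfs.sh"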

My Hadoop home directory was set to:

export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0

And I changed it to:

export HADOOP_HOME=/usr/local/Cellar/hadoop/3.0.0/libexec

Of course you have to source your profile again; in my case that was:

source .bash_profile
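As a quick sanity check (this assumes the Homebrew layout described above, where the real Hadoop tree sits under libexec), you can confirm that the script the start scripts are looking for actually exists under the new home:

ls "$HADOOP_HOME/libexec/hdfs-config.sh"

If that file is listed, start-dfs.sh and start-yarn.sh should be able to find it.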

For me, this did the trick. Hope that helps!


It looks like the latest version has issues with Homebrew. I downloaded Hadoop 2.8.1 directly from here instead.

Follow the same installation instructions; it works.
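If you go the manual route, the steps are roughly as follows. This is a sketch only: the URL assumes the standard Apache archive layout and the install location under /usr/local is just an example, so use whatever mirror and directory you prefer:

curl -O https://archive.apache.org/dist/hadoop/common/hadoop-2.8.1/hadoop-2.8.1.tar.gz
tar -xzf hadoop-2.8.1.tar.gz -C /usr/local
export HADOOP_HOME=/usr/local/hadoop-2.8.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin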


Same issue for Hadoop 3.1.1 and above installed via Homebrew: HADOOP_HOME was not set up properly. Run:

$ echo $HADOOP_HOME

If it prints "/usr/local/Cellar/hadoop", you have to add your specific Hadoop version:

$ export HADOOP_HOME=/usr/local/Cellar/hadoop/3.1.1
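If you would rather not hard-code the version each time Homebrew upgrades Hadoop, you can derive the path instead; brew --prefix hadoop is a standard Homebrew command that resolves to the currently linked install. Put the line in your .bash_profile so it persists across shells, and if you still hit the hdfs-config.sh error, append /libexec as described in the first answer:

$ export HADOOP_HOME="$(brew --prefix hadoop)"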