How to find the HADOOP_HOME path on Linux?

Navigate to the directory where Hadoop is installed and locate ${HADOOP_HOME}/etc/hadoop, e.g.

/usr/lib/hadoop-2.2.0/etc/hadoop
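If you're not sure where Hadoop is installed in the first place, a few standard shell commands can usually track it down (the directories below are common defaults, not guarantees):

```shell
# 1. Check whether the variable is already set in your environment:
echo "${HADOOP_HOME:-HADOOP_HOME is not set}"

# 2. If the `hadoop` command is on your PATH, resolve its real location
#    (readlink -f follows symlinks) and strip the trailing /bin:
hadoop_bin=$(command -v hadoop || true)
if [ -n "$hadoop_bin" ]; then
    dirname "$(dirname "$(readlink -f "$hadoop_bin")")"
fi

# 3. Otherwise, look in the usual install locations:
ls -d /usr/lib/hadoop* /opt/hadoop* /usr/local/hadoop* 2>/dev/null || true
```

The second approach is the most reliable, since it asks the shell where the hadoop binary actually lives instead of guessing directories.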

Running `ls` in that folder should show files like these:

capacity-scheduler.xml      httpfs-site.xml
configuration.xsl           log4j.properties
container-executor.cfg      mapred-env.cmd
core-site.xml               mapred-env.sh
core-site.xml~              mapred-queues.xml.template
hadoop-env.cmd              mapred-site.xml
hadoop-env.sh               mapred-site.xml~
hadoop-env.sh~              mapred-site.xml.template
hadoop-metrics2.properties  slaves
hadoop-metrics.properties   ssl-client.xml.example
hadoop-policy.xml           ssl-server.xml.example
hdfs-site.xml               yarn-env.cmd
hdfs-site.xml~              yarn-env.sh
httpfs-env.sh               yarn-site.xml
httpfs-log4j.properties     yarn-site.xml~
httpfs-signature.secret

Environment settings, such as JAVA_HOME and the classpath, are configured in hadoop-env.sh.

You can see the classpath settings in this file; I have copied a sample here for your reference.

# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_67

# The jsvc implementation to use. Jsvc is required to run secure datanodes.
#export JSVC_HOME=${JSVC_HOME}

export HADOOP_CONF_DIR=${HADOOP_CONF_DIR}

# Extra Java CLASSPATH elements.  Automatically insert capacity-scheduler.
for f in $HADOOP_HOME/contrib/capacity-scheduler/*.jar; do
    export HADOOP_CLASSPATH=${HADOOP_CLASSPATH+$HADOOP_CLASSPATH:}$f
done
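If you'd rather not hard-code a JDK path in hadoop-env.sh, one common alternative (my addition, not part of the stock file) is to derive JAVA_HOME from whatever `java` binary is on the PATH:

```shell
# Resolve the real java binary (following symlinks such as
# /usr/bin/java -> /etc/alternatives/java) and strip /bin/java
# to get the JDK root directory:
export JAVA_HOME=$(dirname "$(dirname "$(readlink -f "$(command -v java)")")")
echo "$JAVA_HOME"
```

This keeps hadoop-env.sh working after a JDK upgrade, since it always follows the current `java` on the PATH.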

Hope this helps!


The hadoop-core jar file is in the ${HADOOP_HOME}/share/hadoop/common directory, not directly in the ${HADOOP_HOME} directory.
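To confirm the jar is there, you can list or search for it; the exact file name depends on your Hadoop version (Hadoop 2.x names it hadoop-common-&lt;version&gt;.jar), so the patterns below are a sketch:

```shell
# Hadoop 2.x ships the core classes as hadoop-common-<version>.jar:
ls "${HADOOP_HOME}/share/hadoop/common/"hadoop-common-*.jar 2>/dev/null || true

# If in doubt, search the whole installation tree for it:
find "${HADOOP_HOME:-.}" -name 'hadoop-common-*.jar' 2>/dev/null || true
```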

You can set the environment variable in your .bashrc file.

vim ~/.bashrc

Then add the following line to the end of the .bashrc file.

export HADOOP_HOME=/your/hadoop/installation/directory

Just replace the path with your actual Hadoop installation path.
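After saving the file, reload it in your current shell and sanity-check the variable (a quick check I'd suggest, not part of the original steps):

```shell
# Reload .bashrc so the new variable takes effect in this shell:
source ~/.bashrc

# Confirm it is set and points at a real Hadoop tree:
echo "$HADOOP_HOME"
[ -d "$HADOOP_HOME/etc/hadoop" ] && echo "HADOOP_HOME looks correct"
```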

Tags:

Linux

Hadoop