How to Setup Hadoop 1.2.1 on CentOS/RHEL 6/5 - TecAdmin.net
http://tecadmin.net/steps-to-install-hadoop-on-centosrhel-6/
Retrieved 10/04/2014 11:20 AM
How to Setup Hadoop 1.2.1 on CentOS/RHEL 6/5


Rahul | June 4, 2013 | Linux Tutorials, Opensource | 73 Comments
The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets
across clusters of computers using simple programming models.

This article will help you install and configure a single-node Hadoop cluster using Hadoop 1.2.1, step by step.

Step 1. Install Java


Before installing Hadoop, make sure you have Java installed on your system. If you do not have Java installed, use the following article to install it:
Steps to Install JAVA 7 on CentOS and RHEL 5/6


Step 2. Create User Account


Create a system user account to use for the Hadoop installation.


# useradd hadoop
# passwd hadoop

Changing password for user hadoop.
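Hadoop's start and stop scripts, used in later steps, log in to localhost over SSH as the hadoop user, so passwordless key-based SSH is typically configured next. A minimal sketch (not shown in the listing above), run as the hadoop user:

```shell
# create the key pair and authorize it for logins to localhost
mkdir -p "$HOME/.ssh"
ssh-keygen -t rsa -N "" -f "$HOME/.ssh/id_rsa"
cat "$HOME/.ssh/id_rsa.pub" >> "$HOME/.ssh/authorized_keys"
chmod 0700 "$HOME/.ssh"
chmod 0600 "$HOME/.ssh/authorized_keys"
ssh localhost exit    # should log in without prompting for a password
```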


Step 4. Download and Extract Hadoop Source


Download the latest available Hadoop version from its official site, and follow the steps below.

# mkdir /opt/hadoop
# cd /opt/hadoop/
# wget http://apache.mesi.com.ar/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
# tar -xzf hadoop-1.2.1.tar.gz
# mv hadoop-1.2.1 hadoop
# chown -R hadoop /opt/hadoop
# cd /opt/hadoop/hadoop/

Step 5: Configure Hadoop


First, edit the Hadoop configuration files and make the following changes.
5.1 Edit core-site.xml

# vim conf/core-site.xml


# Add the following inside the configuration tag


<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000/</value>
</property>
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
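These <property> blocks go inside the <configuration> element that is already present in the stock file; if that wrapper is removed, the XML is no longer well-formed and namenode -format aborts with a SAXParseException (the error reported in a comment below). The complete conf/core-site.xml then reads:

```xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000/</value>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>
```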
5.2 Edit hdfs-site.xml

# vim conf/hdfs-site.xml

# Add the following inside the configuration tag


<property>
  <name>dfs.data.dir</name>
  <value>/opt/hadoop/hadoop/dfs/name/data</value>
  <final>true</final>
</property>
<property>
  <name>dfs.name.dir</name>
  <value>/opt/hadoop/hadoop/dfs/name</value>
  <final>true</final>
</property>
<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>
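The NameNode format step creates these directories itself, provided the hadoop user can write to them; a "Cannot create directory /opt/hadoop/hadoop/dfs/name/current" error (also reported in the comments below) usually means ownership is wrong. To pre-create them explicitly, with the paths taken from the properties above:

```shell
# create the HDFS name/data directories and hand them to the hadoop user
mkdir -p /opt/hadoop/hadoop/dfs/name/data
chown -R hadoop /opt/hadoop/hadoop/dfs
```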
5.3 Edit mapred-site.xml

# vim conf/mapred-site.xml

# Add the following inside the configuration tag

<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
5.4 Edit hadoop-env.sh

# vim conf/hadoop-env.sh

export JAVA_HOME=/opt/jdk1.7.0_55
export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true

Set the JAVA_HOME path as per your system's Java configuration.
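If you are unsure what to use for JAVA_HOME, it can usually be derived from whichever java binary is on the PATH; a sketch (readlink -f resolves the /usr/bin/java symlink chain that the alternatives system sets up):

```shell
# resolve the real java binary, then strip the trailing /bin/java
real_java=$(readlink -f "$(which java)")
echo "JAVA_HOME should be: ${real_java%/bin/java}"
```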


Next, format the NameNode:

# su - hadoop
$ cd /opt/hadoop/hadoop
$ bin/hadoop namenode -format

13/06/02 22:53:48 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = srv1.tecadmin.net/192.168.1.90
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 1.2.1
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1479473; compi
STARTUP_MSG:   java = 1.7.0_55
************************************************************/
13/06/02 22:53:48 INFO util.GSet: Computing capacity for map BlocksMap
13/06/02 22:53:48 INFO util.GSet: VM type = 32-bit
13/06/02 22:53:48 INFO util.GSet: 2.0% max memory = 1013645312
13/06/02 22:53:48 INFO util.GSet: capacity = 2^22 = 4194304 entries
13/06/02 22:53:48 INFO util.GSet: recommended=4194304, actual=4194304
13/06/02 22:53:49 INFO namenode.FSNamesystem: fsOwner=hadoop


Step 6: Start Hadoop Services

Use the following command to start all Hadoop services.

$ bin/start-all.sh

[sample output]

starting namenode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-namenode-ns1.tecadmin.net.ou


localhost: starting datanode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-datanode-ns1.teca
localhost: starting secondarynamenode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-secondar
starting jobtracker, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-jobtracker-ns1.tecadmin.ne
localhost: starting tasktracker, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-tasktracker-ns
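If one of the daemons fails to stay up, the reason is normally in its log file; the paths are printed by start-all.sh as shown above. For example, to inspect the NameNode log (filename pattern assumed from that output; the .out files hold stdout, the matching .log files hold the daemon log):

```shell
# list the log directory, then show the end of the NameNode log
ls /opt/hadoop/hadoop/logs/
tail -n 50 /opt/hadoop/hadoop/logs/hadoop-hadoop-namenode-*.log
```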

Step 7: Test and Access Hadoop Services


Use the jps command to check whether all services started correctly.

$ jps
or
$ $JAVA_HOME/bin/jps

26049 SecondaryNameNode
25929 DataNode
26399 Jps
26129 JobTracker
26249 TaskTracker
25807 NameNode
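The same check can be scripted; this loop (a sketch, run as the hadoop user with jps on the PATH) flags any daemon missing from the jps output:

```shell
# compare jps output against the five expected Hadoop 1.x daemons
for d in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
  if jps | awk '{print $2}' | grep -qx "$d"; then
    echo "$d: running"
  else
    echo "$d: NOT running"
  fi
done
```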

Web Access URLs for Services

http://srv1.tecadmin.net:50070/ for the Namenode
http://srv1.tecadmin.net:50060/ for the Tasktracker

Hadoop JobTracker:

Hadoop Namenode:


Hadoop TaskTracker:
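The same web UIs can be probed from the shell without a browser (a sketch assuming curl is installed; 50030 is the stock JobTracker UI port in Hadoop 1.x, and localhost works when you are on the node itself):

```shell
# prints the HTTP status per UI port; 000 means nothing answered
for port in 50070 50030 50060; do
  code=$(curl -s -o /dev/null -m 5 -w '%{http_code}' "http://localhost:$port/" || true)
  echo "port $port -> HTTP $code"
done
```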

Step 8: Stop Hadoop Services


If you no longer need Hadoop, stop all Hadoop services using the following command.

# bin/stop-all.sh

Article by Rahul Kumar


Founder of TecAdmin.net. I love to work with Linux and other open source applications. I also like to share knowledge with my friends and other Linux users. Follow me on Twitter, add me on Google+, or like my Facebook page.



Comments (73)

maillot de foot 2013-2014 | June 29, 2013


whoah this blog is fantastic i like reading your articles. Keep up the good paintings! You
understand, a lot of people are hunting round for this info, you could aid them greatly.
Reply

arun | July 20, 2013


How to add a new node to this cluster?

Reply

Raja | September 19, 2013


This is really great article for entry level person who like to see how hadoop looks like and play
around. great job!
Also it would be great if you add how to stop hadoop at the end of your article.
Thank you once again.


Reply

Rahul | September 20, 2013


Hi Raja,
Thanks for appreciation.
We have added a last step (Step 8) on how to stop all hadoop services.
Reply

Girish N Iyer | September 20, 2013


This article is really helpful and grear work..Thank You.
I step 4,when i type, tar -xzf hadoop-1.2.0.tar.gz
The following error is displaying..could u pleaSe help
tar (child): hadoop-1.2.0.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
Reply

Rahul | September 20, 2013


Hi Girish,
Now hadoop 1.2.1, the latest stable version, is available, so the downloaded archive name has
changed.
We have updated article step 4. Kindly try now.
Reply

Baahu | September 21, 2013


Good article !!

Reply

girish iyer | September 22, 2013


also i cant able to format the name node. it says no such a directory. please help.
is it due to giving the java path ?
starting namenode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-namenode-girishniyer.out
/opt/hadoop/hadoop/libexec/../bin/hadoop: line 350: /usr/bin/java/jdk1.7.0_25/bin/java: Not a
directory
/opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: /usr/bin/java/jdk1.7.0_25/bin/java: Not a
directory
/opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: exec: /usr/bin/java/jdk1.7.0_25/bin/java:
cannot execute: Not a directory
localhost: starting datanode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-datanode-girishniyer.out
localhost: /opt/hadoop/hadoop/libexec/../bin/hadoop: line 350: /usr/bin/java/jdk1.7.0_25
/bin/java: Not a directory
localhost: /opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: /usr/bin/java/jdk1.7.0_25
/bin/java: Not a directory
localhost: /opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: exec: /usr/bin/java/jdk1.7.0_25
/bin/java: cannot execute: Not a directory
localhost: starting secondarynamenode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-secondarynamenode-girishniyer.out
localhost: /opt/hadoop/hadoop/libexec/../bin/hadoop: line 350: /usr/bin/java/jdk1.7.0_25
/bin/java: Not a directory
localhost: /opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: /usr/bin/java/jdk1.7.0_25
/bin/java: Not a directory
localhost: /opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: exec: /usr/bin/java/jdk1.7.0_25
/bin/java: cannot execute: Not a directory
starting jobtracker, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-jobtracker-girishniyer.out
/opt/hadoop/hadoop/libexec/../bin/hadoop: line 350: /usr/bin/java/jdk1.7.0_25/bin/java: Not a
directory
/opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: /usr/bin/java/jdk1.7.0_25/bin/java: Not a
directory
/opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: exec: /usr/bin/java/jdk1.7.0_25/bin/java:
cannot execute: Not a directory
localhost: starting tasktracker, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-tasktracker-girishniyer.out
localhost: /opt/hadoop/hadoop/libexec/../bin/hadoop: line 350: /usr/bin/java/jdk1.7.0_25
/bin/java: Not a directory
localhost: /opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: /usr/bin/java/jdk1.7.0_25
/bin/java: Not a directory
localhost: /opt/hadoop/hadoop/libexec/../bin/hadoop: line 434: exec: /usr/bin/java/jdk1.7.0_25
/bin/java: cannot execute: Not a directory
Reply

Rahul | September 23, 2013


Did you configure JAVA_HOME correctly?


Check this article and make sure java is properly configured as in step 5.4 Edit hadoop-env.sh
http://tecadmin.net/steps-to-install-java-on-centos-5-6-or-rhel-5-6/
Reply

girish iyer | September 25, 2013


Thank you rahul for ur quick reply.
i configured java and export the java_home path as per ur blog
now when i give bin/hadoop namenode -format
the following is coming
bin/hadoop: line 350: /opt/jdk1.7.0_25/bin/java: No such file or directory
bin/hadoop: line 434: /opt/jdk1.7.0_25/bin/java: No such file or directory
bin/hadoop: line 434: exec: /opt/jdk1.7.0_25/bin/java:
cannot execute: No such file or directory
when i give which java command,my path displayed is as follows.
/usr/bin/java
when i typed java version,its as follows
[hadoop@girishniyer hadoop]$ java -version
java version 1.7.0_25
OpenJDK Runtime Environment (rhel-2.3.10.4.el6_4-x86_64)
OpenJDK 64-Bit Server VM (build 23.7-b01, mixed mode)
i am little confused hence.
please help mewaiting for ur reply
thanks in advance
Reply

Rahul | September 26, 2013


Hi Girish,
Step #5.4 is still not configured properly. How did you install Java on your
system, source or RPM?
Reply

Girish | September 29, 2013


i reinstalled the centos6.4 again


nw the java version is as follows

[girishniyer@girishniyer Desktop]$ which java


/usr/bin/java
[girishniyer@girishniyer Desktop]$ java -version
java version 1.7.0_09-icedtea
OpenJDK Runtime Environment (rhel-2.3.4.1.el6_3-x86_64)
OpenJDK 64-Bit Server VM (build 23.2-b09, mixed mode)

did i need to re install the java after removing this.


if so can u please help me

Rahul | September 30, 2013


Hi Girish,
OpenJDK default installation path should be /usr/lib/jvm/java-<java
version>-openjdk-<java version>.x86_64/. Please check, if this path
exists add it in configuration
else do a fresh java install http://tecadmin.net/steps-to-install-java-on-centos-5-6-or-rhel-5-6/

Danish | September 27, 2013


Hello,
I had the same error, it was because my jave home was /opt/jdk1.7.0.25/ .
There was no underscore in the jdk1.7.0_25. May be this helps you!
Reply

kedar | September 27, 2013


Excellent article . I could configure hadoop in 20 mins do you have any more learning article for
begineers like first wordcount program and all
Reply


girish | September 30, 2013


[hadoop@girishniyer hadoop]$ bin/hadoop namenode -format
13/09/30 04:53:34 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = java.net.UnknownHostException: girishniyer: girishniyer
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 1.2.1
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r
1503152; compiled by mattf on Mon Jul 22 15:23:09 PDT 2013
STARTUP_MSG: java = 1.7.0_40
************************************************************/
13/09/30 04:53:34 INFO util.GSet: Computing capacity for map BlocksMap
13/09/30 04:53:34 INFO util.GSet: VM type = 64-bit
13/09/30 04:53:34 INFO util.GSet: 2.0% max memory = 932184064
13/09/30 04:53:34 INFO util.GSet: capacity = 2^21 = 2097152 entries
13/09/30 04:53:34 INFO util.GSet: recommended=2097152, actual=2097152
13/09/30 04:53:34 INFO namenode.FSNamesystem: fsOwner=hadoop
13/09/30 04:53:34 INFO namenode.FSNamesystem: supergroup=supergroup
13/09/30 04:53:34 INFO namenode.FSNamesystem: isPermissionEnabled=true
13/09/30 04:53:34 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
13/09/30 04:53:34 INFO namenode.FSNamesystem: isAccessTokenEnabled=false
accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
13/09/30 04:53:34 INFO namenode.FSEditLog: dfs.namenode.edits.toleration.length = 0
13/09/30 04:53:34 INFO namenode.NameNode: Caching file names occuring more than 10 times
13/09/30 04:53:34 ERROR namenode.NameNode: java.io.IOException: Cannot create directory
/opt/hadoop/hadoop/dfs/name/current
at
org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:294)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1337)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1356)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1261)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1467)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
13/09/30 04:53:34 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at java.net.UnknownHostException: girishniyer:
girishniyer
************************************************************/
[hadoop@girishniyer hadoop]$
i configured java as per ur article.
then till formatting name node no issues
bt here iam stuck again.
could u please help me?
Reply


Rahul | September 30, 2013


Hi Girish,
Your system's hostname is girishniyer, which is not resolving to any ip. Add an entry in the
/etc/hosts file like below
<system_ip> girishniyer
Reply

Girish S | October 7, 2013


I just began learning hadoop. Your article was superbly simple and straight forward.
The patience with which you helped Girish Iyer is commendable.
Reply

Rahul | October 10, 2013


Thanks Girish,

Reply

Rodrigo Bittencourt | October 7, 2013


Hello,
Rahul I have a question, why the step 7 does not work in my cluster ?
Reply

Rahul | October 8, 2013


Hi Rodrigo,
What issue are you getting ?
Reply


Rodrigo Bittencourt | October 8, 2013


I can not open the URL of hadoop in my cluster.
where was configured the URL ?
Reply

tosh | October 9, 2013


A lot of factors play into the urls not showing the result. If you are on a local
environment ( working on the same machine that you are installing the
cluster on) then this tutorial should have worked.
To the best of my knowledge, the localhost in
hdfs://localhost:9000/ and
localhost:9001
defined inside the core-site.xml and mapred-site.xml should be changed to
the hostname of your system.
i.e. if the hostname of your system is linuxuser, then change the
localhost to linuxuser in these line, then add the following line in
/etc/hosts file:
127.0.0.1 linuxuser
If you are working on a remote server, then same as above except the host
file should have the static ip instead of 127.0.0.1 , and you will be able to
see the web services, but there is another problem I ran into. See my
question below .
It could be the firewall setup as well. try disabling iptables to check if it is
firewall issue. If you do this, make sure you re start iptables when you are
done
Reply

tosh | October 8, 2013


Hi,
I have successfully installed hadoop cluster, and it seems to run fine. I was able to access the
webservices after changing the localhost to the hosts ip address. However, when clicking the
browse file system in namenode webaccess, Browser does not find the webpage. the url shows as
http://master:50075/browseDirectory.jsp?namenodeInfoPort=50070&dir=/
(I get the same error at a few other places, but manually replacing the master with ip address
seems to solve the issue)
master is the hostname I chose for this machine. My guess is this is a DNS issue, but I was


wondering if we can change any config files so the links resolve to an ip address instead of master.

I also see that your mapred-site.xml and core-site.xml have hdfs://localhost:port_num in them,
but you are able to access the web interface using your domain name (in the screenshots you
provided). I am installing hadoop on remote system, and had to replace the localhost with the
servers ip address to access the web interface. Is this accepted practice? Or should I leave them as
localhost and make changes elsewhere?
Third, for hdfs-site.xml, the only property I defined is the replication value ( I was following a
separate tutorial until about halfway). So I was wondering if you could explain what leaving the
dfs.data.dir and dfs.name.dir would do? If we do define those values, Can it be any directory, or
does it have to be inside the hadoop installation?
And last,
when I do get to the file browser ( by manually replacing master in the url with the ip address), I
see a tmp directory there as well that contains the mapred folder and its subfolders . Is this
normal?
Sincerely,
Ashutosh
Reply

tosh | October 8, 2013


Clarification:
By I was able to access the web services after changing the localhost to the hosts ip
address, I meant changed the localhost in the hadoop/conf/*.xml files to a static ip
address that I am now using to access the web services.
Reply

Rakesh | October 9, 2013


Hi
I ran the format step and got below error message
bin/hadoop namenode -format
13/10/09 15:33:33 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = hadoop-01/10.49.14.42
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 1.2.1
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r
1503152; compiled by mattf on Mon Jul 22 15:23:09 PDT 2013
STARTUP_MSG: java = 1.6.0_20
************************************************************/


[Fatal Error] core-site.xml:5:2: The markup in the document following the root element must be well-formed.

13/10/09 15:33:33 FATAL conf.Configuration: error parsing conf file:


org.xml.sax.SAXParseException; systemId: file:/opt/hadoop/hadoop/conf/core-site.xml;
lineNumber: 5; columnNumber: 2; The markup in the document following the root element must be
well-formed.
13/10/09 15:33:33 ERROR namenode.NameNode: java.lang.RuntimeException:
org.xml.sax.SAXParseException; systemId: file:/opt/hadoop/hadoop/conf/core-site.xml;
lineNumber: 5; columnNumber: 2; The markup in the document following the root element must be
well-formed.
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1249)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1107)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1053)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:420)
at org.apache.hadoop.hdfs.server.namenode.NameNode.setStartupOption(NameNode.java:1374)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1463)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
Caused by: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop/hadoop/conf/core-site.xml; lineNumber: 5; columnNumber: 2; The markup in the document following the root
element must be well-formed.
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:253)
at
com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:288)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:177)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1156)
6 more
13/10/09 15:33:33 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at hadoop-01/10.49.14.42
************************************************************/
Reply

Rakesh | October 9, 2013


Core xml file for reference
fs.default.name
hdfs://hadoop-01:9000/
dfs.permissions
false
Reply


Rahul | October 10, 2013


Hi Rakesh,
TecAdmin.net comments do not support HTML tags properly. Can you please show a
screenshot of your file?
Reply

Rakesh | October 10, 2013


Hi Rahul,
sorry for my delay response.
I was able to resolve the above error ,By mistake i deleted the wrapper in the xml
which caused the error , I have now kept the right data in xml and found below
output after format.
[hadoop@hadoop-01 hadoop]$ /opt/jdk1.7.0_40/bin/jps
2148 NameNode
2628 TaskTracker
2767 Jps
2503 JobTracker
2415 SecondaryNameNode
2274 DataNode
However in the xml files i have used localhost instead of my server ipaddress is that
fine , because i am unable to launch the web urls for the same
Reply

Sudip | October 21, 2013


Hi Rahul,
I am trying to install a 3 node Hadoop Cluster and using v2.1.0-beta version.
Though I could sense there are a lot of changes in terms of directory
structure (as compared to v1.2.1) but finally after configuring most of the
steps mentioned here , when I issued the command $JAVA_HOME/bin/jps
on my master server I am seeing O/P:
16467 Resource Manager
15966 NameNode
16960 Jps
16255 SecondaryNameNode

But I can't see my DataNode or NodeManager getting started.


I can send you the ouptut of start-all.sh script output (which throws a lot
of error lines but somehow able to start the above mentioned services), if
you think that can be of any help to you.


Secondly I am not able open any pages as mentioned part of web services;

http://:50030/ for the Jobtracker -> Resource Manager


http://:50070/ for the Namenode
Would appreciate any comment from your side on my queries here.
Reply

Rahul | October 21, 2013


Hi Sudeep,
Yes, Please post the output of start-all.sh command with log files. But
first plz empty your log files and them run start-all.sh after that post
all outputs.
Also I prefer, if you post your question on our new forum, so it will be
better to communicate.

Sudip | October 22, 2013


Here is the output of start-all.sh script.
[hadoop@hadoop-master sbin]$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
13/10/20 18:45:47 WARN util.NativeCodeLoader: Unable to load
native-hadoop library for your platform using builtin-java classes
where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning:
You have loaded library /opt/hadoop/hadoop/lib/native
/libhadoop.so.1.0.0 which might have disabled stack guard. The VM
will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or
link it with '-z noexecstack'.
hadoop-master]
sed: -e expression #1, char 6: unknown option to `s
Java: ssh: Could not resolve hostname Java: Temporary failure in
name resolution
64-Bit: ssh: Could not resolve hostname 64-Bit: Temporary failure in
name resolution
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM):
Temporary failure in name resolution
Server: ssh: Could not resolve hostname Server: Temporary failure in
name resolution
VM: ssh: Could not resolve hostname VM: Temporary failure in name
resolution
warning:: ssh: Could not resolve hostname warning:: Temporary


You: ssh: Could not resolve hostname You: Temporary failure in name
resolution
have: ssh: Could not resolve hostname have: Temporary failure in
name resolution
library: ssh: Could not resolve hostname library: Temporary failure in
name resolution
have: ssh: Could not resolve hostname have: Temporary failure in
name resolution
which: ssh: Could not resolve hostname which: Temporary failure in
name resolution
disabled: ssh: Could not resolve hostname disabled: Temporary failure
in name resolution
will: ssh: Could not resolve hostname will: Temporary failure in name
resolution
stack: ssh: Could not resolve hostname stack: Temporary failure in
name resolution
guard.: ssh: Could not resolve hostname guard.: Temporary failure in
name resolution
might: ssh: Could not resolve hostname might: Temporary failure in
name resolution
stack: ssh: Could not resolve hostname stack: Temporary failure in
name resolution
the: ssh: Could not resolve hostname the: Temporary failure in name
resolution
loaded: ssh: Could not resolve hostname loaded: Temporary failure in
name resolution
VM: ssh: Could not resolve hostname VM: Temporary failure in name
resolution
execstack: ssh: Could not resolve hostname execstack: Temporary
failure in name resolution
to: ssh: Could not resolve hostname to: Temporary failure in name
resolution
that: ssh: Could not resolve hostname that: Temporary failure in name
resolution
try: ssh: Could not resolve hostname try: Temporary failure in name
resolution
highly: ssh: Could not resolve hostname highly: Temporary failure in
name resolution
link: ssh: Could not resolve hostname link: Temporary failure in name
resolution
fix: ssh: Could not resolve hostname fix: Temporary failure in name
resolution
you: ssh: Could not resolve hostname you: Temporary failure in name
resolution
guard: ssh: Could not resolve hostname guard: Temporary failure in
name resolution
fix: ssh: Could not resolve hostname fix: Temporary failure in name
resolution
or: ssh: Could not resolve hostname or: Temporary failure in name
resolution


Its: ssh: Could not resolve hostname Its: Temporary failure in name
resolution
recommended: ssh: Could not resolve hostname recommended:


Temporary failure in name resolution
the: ssh: Could not resolve hostname the: Temporary failure in name
resolution
-c: Unknown cipher type cd
-z: ssh: Could not resolve hostname -z: Temporary failure in name
resolution
with: ssh: Could not resolve hostname with: Temporary failure in
name resolution
with: ssh: Could not resolve hostname with: Temporary failure in
name resolution
now.: ssh: Could not resolve hostname now.: Temporary failure in
name resolution
noexecstack.: ssh: Could not resolve hostname noexecstack.:
Temporary failure in name resolution
,: ssh: Could not resolve hostname ,: Temporary failure in name
resolution
The: ssh: Could not resolve hostname The: Temporary failure in name
resolution
it: ssh: Could not resolve hostname it: Temporary failure in name
resolution
library: ssh: Could not resolve hostname library: Temporary failure in
name resolution
hadoop-master: starting namenode, logging to /opt/hadoop/hadoop
/logs/hadoop-hadoop-namenode-hadoop-master.out
hadoop-master: Java HotSpot(TM) 64-Bit Server VM warning: You
have loaded library /opt/hadoop/hadoop/lib/native/libhadoop.so.1.0.0
which might have disabled stack guard. The VM will try to fix the stack
guard now.
hadoop-master: Its highly recommended that you fix the library with
execstack -c , or link it with -z noexecstack.
hadoop-slave-1: bash: line 0: cd: /opt/hadoop/hadoop: No such file or
directory
hadoop-slave-1: bash: /opt/hadoop/hadoop/sbin/hadoop-daemon.sh:
No such file or directory
hadoop-slave-2: bash: line 0: cd: /opt/hadoop/hadoop: No such file or
directory
hadoop-slave-2: bash: /opt/hadoop/hadoop/sbin/hadoop-daemon.sh:
No such file or directory
Starting secondary namenodes [Java HotSpot(TM) 64-Bit Server VM
warning: You have loaded library /opt/hadoop/hadoop/lib/native
/libhadoop.so.1.0.0 which might have disabled stack guard. The VM
will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or
link it with '-z noexecstack'.
0.0.0.0]
sed: -e expression #1, char 6: unknown option to `s
64-Bit: ssh: Could not resolve hostname 64-Bit: Temporary failure in
name resolution


Server: ssh: Could not resolve hostname Server: Temporary failure in name resolution
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Temporary failure in name resolution
VM: ssh: Could not resolve hostname VM: Temporary failure in name resolution
warning:: ssh: Could not resolve hostname warning:: Temporary failure in name resolution
have: ssh: Could not resolve hostname have: Temporary failure in name resolution
You: ssh: Could not resolve hostname You: Temporary failure in name resolution
library: ssh: Could not resolve hostname library: Temporary failure in name resolution
might: ssh: Could not resolve hostname might: Temporary failure in name resolution
loaded: ssh: Could not resolve hostname loaded: Temporary failure in name resolution
try: ssh: Could not resolve hostname try: Temporary failure in name resolution
guard.: ssh: Could not resolve hostname guard.: Temporary failure in name resolution
stack: ssh: Could not resolve hostname stack: Temporary failure in name resolution
which: ssh: Could not resolve hostname which: Temporary failure in name resolution
have: ssh: Could not resolve hostname have: Temporary failure in name resolution
now.: ssh: Could not resolve hostname now.: Temporary failure in name resolution
stack: ssh: Could not resolve hostname stack: Temporary failure in name resolution
The: ssh: Could not resolve hostname The: Temporary failure in name resolution
guard: ssh: Could not resolve hostname guard: Temporary failure in name resolution
will: ssh: Could not resolve hostname will: Temporary failure in name resolution
the: ssh: Could not resolve hostname the: Temporary failure in name resolution
recommended: ssh: Could not resolve hostname recommended: Temporary failure in name resolution
disabled: ssh: Could not resolve hostname disabled: Temporary failure in name resolution
VM: ssh: Could not resolve hostname VM: Temporary failure in name resolution
to: ssh: Could not resolve hostname to: Temporary failure in name resolution
execstack: ssh: Could not resolve hostname execstack: Temporary failure in name resolution
highly: ssh: Could not resolve hostname highly: Temporary failure in name resolution
,: ssh: Could not resolve hostname ,: Temporary failure in name resolution
that: ssh: Could not resolve hostname that: Temporary failure in name resolution
-c: Unknown cipher type cd
-z: ssh: Could not resolve hostname -z: Temporary failure in name resolution
link: ssh: Could not resolve hostname link: Temporary failure in name resolution
fix: ssh: Could not resolve hostname fix: Temporary failure in name resolution
or: ssh: Could not resolve hostname or: Temporary failure in name resolution
library: ssh: Could not resolve hostname library: Temporary failure in name resolution
it: ssh: Could not resolve hostname it: Temporary failure in name resolution
you: ssh: Could not resolve hostname you: Temporary failure in name resolution
Its: ssh: Could not resolve hostname Its: Temporary failure in name resolution
with: ssh: Could not resolve hostname with: Temporary failure in name resolution
with: ssh: Could not resolve hostname with: Temporary failure in name resolution
fix: ssh: Could not resolve hostname fix: Temporary failure in name resolution
the: ssh: Could not resolve hostname the: Temporary failure in name resolution
noexecstack.: ssh: Could not resolve hostname noexecstack.: Temporary failure in name resolution
0.0.0.0: starting secondarynamenode, logging to /opt/hadoop/hadoop/logs/hadoop-hadoop-secondarynamenode-hadoop-master.out
0.0.0.0: Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /opt/hadoop/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
0.0.0.0: Its highly recommended that you fix the library with execstack -c , or link it with -z noexecstack.
13/10/20 18:45:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform using builtin-java classes where applicable
starting yarn daemons
starting resourcemanager, logging to /opt/hadoop/hadoop/logs/yarn-hadoop-resourcemanager-hadoop-master.out
hadoop-slave-1: bash: line 0: cd: /opt/hadoop/hadoop: No such file or directory
hadoop-slave-1: bash: /opt/hadoop/hadoop/sbin/yarn-daemon.sh: No such file or directory

hadoop-slave-2: bash: line 0: cd: /opt/hadoop/hadoop: No such file or directory
hadoop-slave-2: bash: /opt/hadoop/hadoop/sbin/yarn-daemon.sh: No such file or directory
[hadoop@hadoop-master sbin]$ $JAVA_HOME/bin/jps
15130 Jps
14697 SecondaryNameNode
14872 ResourceManager
14408 NameNode
[hadoop@hadoop-master sbin]$
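For readers hitting the same wall: the flood of "Could not resolve hostname" messages above is a known Hadoop 2.x quirk, where the JVM's stack-guard warning is echoed back through the ssh-based start scripts, which then try to resolve each word of the warning as a hostname. A commonly suggested workaround (my assumption, not from the article itself) is to point Hadoop at its native libraries in hadoop-env.sh:

```shell
# Hypothetical additions to $HADOOP_HOME/etc/hadoop/hadoop-env.sh
# (paths follow the /opt/hadoop/hadoop layout used in this thread)
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```

The "No such file or directory" lines for hadoop-slave-1 and hadoop-slave-2 are a separate issue: Hadoop does not appear to be installed at /opt/hadoop/hadoop on those slave nodes.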

Rakesh | October 10, 2013


I have tried the below URLs for the job tracker but none of them worked. I also confirmed with the network
team that there is no firewall blocking.
http://ipaddress:50030/
http://localhost:50030/
http://servername:50030/
Reply

Rakesh | October 14, 2013


I have found solutions for my issues. Thanks, Rahul, your post helped me a lot!

Reply

Rahul | October 22, 2013


Sounds good, Rakesh. Thanks!

Reply

Toan Nguyen | October 15, 2013


Please help me: when I run $ bin/start-all.sh, the output has an error:
starting namenode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-namenode-hadoop-master.out

RSA key fingerprint is 7b:6d:cb:fc:48:7b:c6:42:a5:6a:64:83:ab:a8:95:95.


Are you sure you want to continue connecting (yes/no)? hadoop-slave-2: starting datanode,
logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-datanode-hadoop-slave-2.out
hadoop-slave-1: starting datanode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-datanode-hadoop-slave-1.out
Reply

Rahul | October 16, 2013


Hi Toan,
Are you configuring a Hadoop multi-node cluster?
Reply

Shannon | October 16, 2013


Can you please suggest the steps to uninstall Apache Hadoop? I am planning to test Cloudera as
well; if you can share steps for Cloudera too, that would be awesome!
Reply

Rakesh | October 24, 2013


I have configured Cloudera on a single node successfully. However, when I launch Hue
at //myaddress.8888 I see the below error message at the configuration level.

Configuration files located in /var/run/cloudera-scm-agent/process/57-hue-HUE_SERVER


Potential misconfiguration detected. Fix and restart Hue.
and my hue_safety_valve.ini looks as below
[hadoop]
[[mapred_clusters]]
[[[default]]]
jobtracker_host=servername
thrift_port=9290
jobtracker_port=8021
submit_to=True
hadoop_mapred_home={{HADOOP_MR1_HOME}}

security_enabled=false
Reply

Dheeraj | October 30, 2013


Hey,
please help.
When I am doing Step 6 it asks for a password even though I haven't set any password. Please tell me what to do.
bin/start-all.sh
starting namenode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-namenode-demo.example.com.out
hadoop@localhost's password:
Thanks
Reply

Rahul | October 31, 2013


Hi Dheeraj,
I think you have not completed Step #3 properly. Please check and complete it. Make
sure you are able to SSH without a password.
Reply
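Step 3 of the article covers key-based SSH; a minimal sketch of that setup for the hadoop user looks like the following (assuming OpenSSH and a single-node install — a sketch, not the article's exact commands):

```shell
# Create an RSA key pair with an empty passphrase for non-interactive logins
# (skipped if a key already exists)
mkdir -p "$HOME/.ssh"
[ -f "$HOME/.ssh/id_rsa" ] || ssh-keygen -q -t rsa -N "" -f "$HOME/.ssh/id_rsa"

# Authorize the public key for logins to this same machine
cat "$HOME/.ssh/id_rsa.pub" >> "$HOME/.ssh/authorized_keys"
chmod 700 "$HOME/.ssh"
chmod 600 "$HOME/.ssh/authorized_keys"

# Verify: this should now log in without prompting for a password
# ssh hadoop@localhost
```

If a password prompt still appears, the usual culprits are wrong permissions on ~/.ssh or authorized_keys.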

Rodrigo Bittencourt | November 4, 2013


Hello Rahul,
I need your help; my cluster does not work, so something must be wrong in the configuration.
When I run start-all.sh it does not start the secondarynamenode. It shows this error:
starting secondarynamenode, logging to /opt/hadoop/hadoop-1.2.1/libexec/../logs/hadoop-hadoop-secondarynamenode-lbad012.out
lbad012 : Exception in thread main java.lang.IllegalArgumentException : Does not contain a

lbad012 : at org.apache.hadoop.net.NetUtils.createSocketAddr ( NetUtils.java : 164 )


lbad012 : at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress ( NameNode.java :
212 )
lbad012 : at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress ( NameNode.java :
244 )
lbad012 : at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress (
NameNode.java : 236 )
lbad012 : at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize (
SecondaryNameNode.java : 194 )
lbad012 : at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode . (
SecondaryNameNode.java : 150 )
lbad012 : at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main (
SecondaryNameNode.java : 676 )
Reply

jk | November 9, 2013
Hi
I downloaded and installed the latest Apache Hadoop 2.2 and followed the above setup for a single node
(first-time setup) on RHEL 5.5.
NameNode, DataNode, ResourceManager and NodeManager started fine. I had some issues with the
DataNode and had to update iptables to open ports.
When I run
-bash-3.2$ sbin/mr-jobhistory-daemon.sh start historyserver
starting historyserver, logging to /hadoop/hadoop-2.2.0/logs/mapred-hduser-historyserver-server.out
and then run jps, I don't see the JobHistoryServer listed. There are no errors in the .out file above.
Can someone please assist?
Thanks
JKH
Reply

jk | November 9, 2013
Hi
I tried changing the entry in core-site.xml

instead of
hdfs://drwdt001:9000
and that helped with the startup of the HistoryServer. Now I can see it.
Although, now I am not sure if my DataNode will restart. I had issues with that startup, and that's the
reason I removed the domain and just had the server name as the value.
Any ideas? I will test before I shut and restart
Thanks.
Reply

jk | November 9, 2013
Oops, spoke too soon. I do see the JobHistoryServer start, but then it looks like it stopped.
-bash-3.2$ jps
21984 NameNode
27080 DataNode
1638 ResourceManager
1929 NodeManager
5718 JobHistoryServer
6278 Jps
-bash-3.2$ jps
21984 NameNode
27080 DataNode
11037 Jps
1638 ResourceManager
1929 NodeManager
Where can I see information on why this stopped? Can you please suggest? Sorry for the multiple
posts, but I had to update and don't seem to find any help by googling.
Thanks
Reply

jk | November 9, 2013
Fixed these errors; I checked the logs and got help from websites.
I have a question: how can I use/test the wordcount example on multiple files in a folder?
bin/hadoop dfs -copyFromLocal files /testfiles
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar wordcount /files
/testfiles
This throws an error saying testfiles is not a file. It should work for a directory too, right?
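For what it's worth, the examples jar's wordcount does accept a directory as input; its arguments are wordcount <input> <output>, and the output directory must not exist yet. A sketch, assuming a running HDFS and using /input and /output as placeholder paths of my choosing:

```shell
# Put local files into an HDFS input directory
bin/hadoop fs -mkdir -p /input
bin/hadoop fs -copyFromLocal files/* /input

# wordcount takes <input dir> <output dir>; /output must not already exist
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar \
    wordcount /input /output

# Read the aggregated counts
bin/hadoop fs -cat '/output/part-r-*'
```

The "is not a file" error in the comment above likely comes from the argument order: the second path is treated as the output directory.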

Reply

Azim | November 13, 2013


I am getting the below error when I run bin/start-all.sh. Please suggest.
hadoop@localhost hadoop]$ bin/start-all.sh
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
starting namenode, logging to /opt/hadoop/hadoop/libexec/../logs/hadoop-hadoop-namenode-localhost.localdomain.out
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
localhost: /opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
localhost: /opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
localhost: datanode running as process 3602. Stop it first.
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
localhost: /opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
localhost: /opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
localhost: secondarynamenode running as process 3715. Stop it first.
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
jobtracker running as process 3793. Stop it first.
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
/opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
localhost: /opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
localhost: /opt/hadoop/hadoop/libexec/../conf/hadoop-env.sh: line 1: i#: command not found
localhost: tasktracker running as process 3921. Stop it first.
[hadoop@localhost hadoop]$ jvs
-bash: jvs: command not found
Reply

Sravan | January 8, 2014


This explanation is second to none among those I have seen so far. Great efforts, Rahul; I've personally
installed pseudo- and fully-distributed clusters at my workplace using these tutorials.
Thanks a lot, buddy!

Reply

Rahul Kumar | January 8, 2014


Thanks Sravan

Reply

Paul | January 8, 2014


Hi Rahul!
Just to say your instructions worked like a dream.
In my hadoop-env.sh I used:
export JAVA_HOME=/usr/lib/jvm/jre-1.6.0 <- might help others; it's a vanilla CentOS 6.5 install.
Cheers, Paul
Reply

Mursaleen Javed | February 2, 2014


[hadoop@RHEL hadoop]$ bin/hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
14/02/03 03:38:23 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = java.net.UnknownHostException: RHEL: RHEL
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.2.0
STARTUP_MSG: classpath = /opt/hadoop/hadoop/etc/hadoop:/opt/hadoop/hadoop/share/hadoop
/common/lib/jsr305-1.3.9.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:
/opt/hadoop/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/opt/hadoop/hadoop/share
/hadoop/common/lib/commons-codec-1.4.jar:/opt/hadoop/hadoop/share/hadoop/common
/lib/jets3t-0.6.1.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:
/opt/hadoop/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/opt/hadoop/hadoop
/share/hadoop/common/lib/jsch-0.1.42.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/log4j1.2.17.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/activation-1.1.jar:/opt/hadoop/hadoop
/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/opt/hadoop/hadoop/share/hadoop/common
/lib/jackson-mapper-asl-1.8.8.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/jetty-

/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/opt/hadoop/hadoop/share/hadoop/common
/lib/jasper-runtime-5.5.23.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/commonscollections-3.2.1.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/slf4j-api-1.7.5.jar:
/opt/hadoop/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/opt/hadoop/hadoop
/share/hadoop/common/lib/asm-3.2.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/commonsbeanutils-1.7.0.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/jackson-core-asl-1.8.8.jar:
/opt/hadoop/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/opt/hadoop/hadoop/share/hadoop
/common/lib/stax-api-1.0.1.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/jerseyjson-1.9.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/opt/hadoop/hadoop
/share/hadoop/common/lib/commons-configuration-1.6.jar:/opt/hadoop/hadoop/share/hadoop
/common/lib/commons-math-2.1.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/jaspercompiler-5.5.23.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:
/opt/hadoop/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/opt/hadoop/hadoop/share
/hadoop/common/lib/jackson-xc-1.8.8.jar:/opt/hadoop/hadoop/share/hadoop/common
/lib/commons-beanutils-core-1.8.0.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/commonslang-2.5.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:
/opt/hadoop/hadoop/share/hadoop/common/lib/commons-logging-1.1.1.jar:/opt/hadoop/hadoop
/share/hadoop/common/lib/commons-io-2.1.jar:/opt/hadoop/hadoop/share/hadoop/common
/lib/commons-el-1.0.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:
/opt/hadoop/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/opt/hadoop/hadoop/share
/hadoop/common/lib/hadoop-annotations-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/common
/lib/junit-4.8.2.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:
/opt/hadoop/hadoop/share/hadoop/common/lib/zookeeper-3.4.5.jar:/opt/hadoop/hadoop/share
/hadoop/common/lib/commons-compress-1.4.1.jar:/opt/hadoop/hadoop/share/hadoop/common
/lib/jetty-6.1.26.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.2.0.jar:
/opt/hadoop/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/opt/hadoop/hadoop/share
/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/xz1.0.jar:/opt/hadoop/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/opt/hadoop
/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/opt/hadoop/hadoop/share/hadoop
/common/lib/guava-11.0.2.jar:/opt/hadoop/hadoop/share/hadoop/common/hadoop-common2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/common/hadoop-nfs-2.2.0.jar:/opt/hadoop/hadoop
/share/hadoop/common/hadoop-common-2.2.0-tests.jar:/opt/hadoop/hadoop/share/hadoop/hdfs:
/opt/hadoop/hadoop/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/opt/hadoop/hadoop/share/hadoop
/hdfs/lib/servlet-api-2.5.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:
/opt/hadoop/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/opt/hadoop/hadoop
/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib/jacksonmapper-asl-1.8.8.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/opt/hadoop
/hadoop/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/opt/hadoop/hadoop/share/hadoop
/hdfs/lib/asm-3.2.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.8.8.jar:
/opt/hadoop/hadoop/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/opt/hadoop/hadoop/share/hadoop
/hdfs/lib/netty-3.6.2.Final.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:
/opt/hadoop/hadoop/share/hadoop/hdfs/lib/commons-lang-2.5.jar:/opt/hadoop/hadoop/share
/hadoop/hdfs/lib/commons-logging-1.1.1.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib
/commons-io-2.1.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib/commons-el-1.0.jar:/opt/hadoop
/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib
/xmlenc-0.52.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/opt/hadoop
/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib
/protobuf-java-2.5.0.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/opt/hadoop
/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop
/hdfs/hadoop-hdfs-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.2.0-tests.jar:

/opt/hadoop/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/opt/hadoop/hadoop/share/hadoop
asl-1.8.8.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/junit-4.10.jar:/opt/hadoop/hadoop/share
/hadoop/yarn/lib/paranamer-2.3.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/hamcrestcore-1.1.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/snappy-java-1.0.4.1.jar:/opt/hadoop
/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/jacksoncore-asl-1.8.8.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/avro-1.7.4.jar:/opt/hadoop/hadoop
/share/hadoop/yarn/lib/javax.inject-1.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/netty3.6.2.Final.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/opt/hadoop
/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib
/commons-io-2.1.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/opt/hadoop
/hadoop/share/hadoop/yarn/lib/hadoop-annotations-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop
/yarn/lib/guice-servlet-3.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:
/opt/hadoop/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/opt/hadoop/hadoop
/share/hadoop/yarn/lib/xz-1.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/lib/protobufjava-2.5.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanagedam-launcher-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.2.0.jar:/opt/hadoop/hadoop
/share/hadoop/yarn/hadoop-yarn-server-common-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop
/yarn/hadoop-yarn-client-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/hadoopyarn-applications-distributedshell-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/hadoopyarn-site-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.2.0.jar:
/opt/hadoop/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar:/opt/hadoop
/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop
/mapreduce/lib/aopalliance-1.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/log4j1.2.17.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.8.8.jar:
/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/junit-4.10.jar:/opt/hadoop/hadoop/share
/hadoop/mapreduce/lib/paranamer-2.3.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce
/lib/hamcrest-core-1.1.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/snappyjava-1.0.4.1.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/opt/hadoop
/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.8.8.jar:/opt/hadoop/hadoop/share
/hadoop/mapreduce/lib/avro-1.7.4.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce
/lib/javax.inject-1.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:
/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/opt/hadoop/hadoop/share
/hadoop/mapreduce/lib/guice-3.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/commonsio-2.1.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/opt/hadoop
/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.2.0.jar:/opt/hadoop/hadoop/share
/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce
/lib/jersey-server-1.9.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/commons-compress1.4.1.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/opt/hadoop/hadoop/share
/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce
/hadoop-mapreduce-client-hs-plugins-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce
/hadoop-mapreduce-client-common-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce
/hadoop-mapreduce-client-jobclient-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce
/hadoop-mapreduce-client-shuffle-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce
/hadoop-mapreduce-client-jobclient-2.2.0-tests.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce
/hadoop-mapreduce-client-core-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/hadoopmapreduce-client-app-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/hadoopmapreduce-examples-2.2.0.jar:/opt/hadoop/hadoop/share/hadoop/mapreduce/hadoop-mapreduceclient-hs-2.2.0.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar

-r 1529768; compiled

STARTUP_MSG: java = 1.6.0_20


************************************************************/
14/02/03 03:38:23 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP,
INT]
14/02/03 03:38:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your
platform using builtin-java classes where applicable
14/02/03 03:38:25 WARN common.Util: Path /opt/hadoop/hadoop/dfs/name should be specified as
a URI in configuration files. Please update hdfs configuration.
14/02/03 03:38:25 WARN common.Util: Path /opt/hadoop/hadoop/dfs/name should be specified as
a URI in configuration files. Please update hdfs configuration.
Formatting using clusterid: CID-c4662030-0d86-4263-aed9-043fd34565cd
14/02/03 03:38:25 INFO namenode.HostFileManager: read includes:
HostSet(
)
14/02/03 03:38:25 INFO namenode.HostFileManager: read excludes:
HostSet(
)
14/02/03 03:38:25 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
14/02/03 03:38:25 INFO util.GSet: Computing capacity for map BlocksMap
14/02/03 03:38:25 INFO util.GSet: VM type = 64-bit
14/02/03 03:38:25 INFO util.GSet: 2.0% max memory = 966.7 MB
14/02/03 03:38:25 INFO util.GSet: capacity = 2^21 = 2097152 entries
14/02/03 03:38:25 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
14/02/03 03:38:25 INFO blockmanagement.BlockManager: defaultReplication = 2
14/02/03 03:38:25 INFO blockmanagement.BlockManager: maxReplication = 512
14/02/03 03:38:25 INFO blockmanagement.BlockManager: minReplication = 1
14/02/03 03:38:25 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
14/02/03 03:38:25 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks = false
14/02/03 03:38:25 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
14/02/03 03:38:25 INFO blockmanagement.BlockManager: encryptDataTransfer = false
14/02/03 03:38:25 INFO namenode.FSNamesystem: fsOwner = hadoop (auth:SIMPLE)
14/02/03 03:38:25 INFO namenode.FSNamesystem: supergroup = supergroup
14/02/03 03:38:25 INFO namenode.FSNamesystem: isPermissionEnabled = true
14/02/03 03:38:25 INFO namenode.FSNamesystem: HA Enabled: false
14/02/03 03:38:25 INFO namenode.FSNamesystem: Append Enabled: true
14/02/03 03:38:25 INFO util.GSet: Computing capacity for map INodeMap
14/02/03 03:38:25 INFO util.GSet: VM type = 64-bit
14/02/03 03:38:25 INFO util.GSet: 1.0% max memory = 966.7 MB
14/02/03 03:38:25 INFO util.GSet: capacity = 2^20 = 1048576 entries
14/02/03 03:38:25 INFO namenode.NameNode: Caching file names occuring more than 10 times
14/02/03 03:38:25 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct =
0.9990000128746033
14/02/03 03:38:25 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
14/02/03 03:38:25 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
14/02/03 03:38:25 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
14/02/03 03:38:25 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and
retry cache entry expiry time is 600000 millis
14/02/03 03:38:25 INFO util.GSet: Computing capacity for map Namenode Retry Cache
14/02/03 03:38:25 INFO util.GSet: VM type = 64-bit

Re-format filesystem in Storage Directory /opt/hadoop/hadoop/dfs/name ? (Y or N) y


14/02/03 03:38:29 WARN net.DNS: Unable to determine local hostname -falling back to localhost
java.net.UnknownHostException: RHEL: RHEL
at java.net.InetAddress.getLocalHost(InetAddress.java:1426)
at org.apache.hadoop.net.DNS.resolveLocalHostname(DNS.java:264)
at org.apache.hadoop.net.DNS.(DNS.java:57)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.newBlockPoolID(NNStorage.java:914)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.newNamespaceInfo(NNStorage.java:550)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:144)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:837)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1213)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1320)
14/02/03 03:38:29 WARN net.DNS: Unable to determine address of the host-falling back to
localhost address
java.net.UnknownHostException: RHEL: RHEL
at java.net.InetAddress.getLocalHost(InetAddress.java:1426)
at org.apache.hadoop.net.DNS.resolveLocalHostIPAddress(DNS.java:287)
at org.apache.hadoop.net.DNS.(DNS.java:58)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.newBlockPoolID(NNStorage.java:914)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.newNamespaceInfo(NNStorage.java:550)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:144)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:837)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1213)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1320)
14/02/03 03:38:29 INFO common.Storage: Storage directory /opt/hadoop/hadoop/dfs/name has
been successfully formatted.
14/02/03 03:38:29 INFO namenode.FSImage: Saving image file /opt/hadoop/hadoop/dfs/name
/current/fsimage.ckpt_0000000000000000000 using no compression
14/02/03 03:38:29 INFO namenode.FSImage: Image file /opt/hadoop/hadoop/dfs/name/current
/fsimage.ckpt_0000000000000000000 of size 198 bytes saved in 0 seconds.
14/02/03 03:38:29 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with
txid >= 0
14/02/03 03:38:30 INFO util.ExitUtil: Exiting with status 0
14/02/03 03:38:30 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at java.net.UnknownHostException: RHEL: RHEL
************************************************************/
[hadoop@RHEL hadoop]$
hosts file:
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
RHEL
please help
Reply

Rahul Kumar | February 3, 2014


Hi Javed,
Your system's hosts file entry looks incorrect. Please add an entry like the one below:
127.0.0.1 RHEL
Reply
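For context, the java.net.UnknownHostException: RHEL above happens because the hostname RHEL appears in /etc/hosts on a line of its own, with no IP address. A minimal working file might look like this (a sketch; use the machine's real address instead of the loopback if other nodes must reach it):

```
127.0.0.1   RHEL localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
```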

raj | February 27, 2014


Hi Rahul,
Your instructions are superb. The only issue I am facing is jps not getting recognized; not sure
where the issue could be. Otherwise, all set to go.

Rahul | February 27, 2014


Hi Raj,
Try the following command for jps:
$JAVA_HOME/bin/jps
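The background to this reply: jps ships with the JDK, not with Hadoop, so it lives under the JDK's bin directory. A minimal sketch of the required environment follows (the JDK path is an example, not from the article; substitute your actual install directory):

```shell
# jps ships with the JDK; make its directory reachable via JAVA_HOME.
# /opt/jdk1.7.0_79 is a hypothetical path -- use your real JDK location.
export JAVA_HOME=/opt/jdk1.7.0_79
export PATH="$PATH:$JAVA_HOME/bin"
# Confirm the variable is set before relying on jps
echo "JAVA_HOME=$JAVA_HOME"
```

With PATH updated this way, either `jps` or `$JAVA_HOME/bin/jps` should list the running Hadoop daemons (NameNode, DataNode, JobTracker, TaskTracker, SecondaryNameNode).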

Raj | February 28, 2014


Hi Rahul,
jps is not showing under $JAVA_HOME/bin.
It comes out with the error: no such file or directory.
Also, once I complete my tasks and log out of Linux, do I need to restart Hadoop?

Raj | February 28, 2014


Hi Rahul,
Do you know the best site to download Pig and Hive? I realized that I am unable to run Pig and Hive. I thought they came with the package, just like when setting up under Cloudera.

SS | March 12, 2014


How do we add data to the single-node cluster?


sandeep | April 9, 2014


Hi Rahul,
I have successfully installed the single-node cluster and am able to see all daemons running. But I am not able to run hadoop fs commands. Should I install anything else for this, such as jars?
Thanks in advance

Elavarasan | April 28, 2014


Hi Rahul,
jps or $JAVA_HOME/bin/jps gives the errors bash: jps: command not found and bash: /bin/jps: no such file or directory. Kindly address the error.

Rahul | April 28, 2014


Hi Elavarasan,
Please make sure you have set up the JAVA_HOME environment variable. Please provide the output of the following commands:
# env | grep JAVA_HOME
# echo $JAVA_HOME

Elavarasan | April 29, 2014


localhost: [fatal Error] core-site.xml:10:2: The markup in the document following the root element
must be well-formed.
The error appears while running bin/start-all.sh.

Elavarasan | April 29, 2014


Hi Rahul,
Please help me with the following case:
[FATAL ERROR] core-site.xml:10:2: The markup in the document following the root element must be well-formed.
The error appears while running bin/start-all.sh, for all *.xml files.

Rahul | April 29, 2014


Hi,
Please provide your core-site.xml file content.

Elavarasan | April 29, 2014


<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000/</value>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>

Rahul | April 29, 2014


Hi,
I could not read the file content properly. Can you email it to me at rahul.kumar1099@gmail.com?

Thang Nguyen | May 26, 2014


Hi Rahul,
Can you please post some directions for testing with input/output data after installation?
For example: how can we upload a file, run a simple wordcount, and get the output?
Regards,
Thang Nguyen
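As a rough sketch of what this question is asking for (Hadoop 1.x commands run from the Hadoop install directory against a running cluster; the file name sample.txt and the HDFS paths are illustrative, not from the article), a wordcount smoke test looks like this:

```shell
# Put a local file into HDFS (sample.txt is a hypothetical input file)
bin/hadoop fs -mkdir /user/hadoop/input
bin/hadoop fs -put sample.txt /user/hadoop/input/
# Run the wordcount example bundled with the 1.2.1 release
bin/hadoop jar hadoop-examples-1.2.1.jar wordcount /user/hadoop/input /user/hadoop/output
# Inspect the reducer output
bin/hadoop fs -cat /user/hadoop/output/part-r-00000
```

These commands assume the daemons started by start-all.sh are up; the output directory must not already exist when the job is submitted.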

Kullayappa | May 28, 2014


Hi,
Your article is simply super; I followed each and every step and installed Hadoop.
My suggestion: while checking status with jps, it was not showing information and I got a "no such command" error. I ran the export JAVA_HOME commands again, and after that jps worked.
I guess these export JAVA_HOME lines also need to be included somewhere else.
Thanks a lot for your article, buddy.
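Kullayappa's "somewhere else" can be addressed by persisting the exports in the hadoop user's ~/.bash_profile so they survive a new login. A sketch follows (the JDK path is an example, not from the article; point it at your real installation):

```shell
# Append the Java environment to the login profile so jps resolves in every
# session (/opt/jdk1.7.0_79 is a hypothetical JDK path -- adjust as needed)
cat >> ~/.bash_profile <<'EOF'
export JAVA_HOME=/opt/jdk1.7.0_79
export PATH=$PATH:$JAVA_HOME/bin
EOF
# Reload the profile in the current shell
. ~/.bash_profile
```

After a fresh login (or the reload above), jps no longer needs the exports to be retyped.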

Shreedhar | June 3, 2014


Thanks a lot.
I made a few modifications, but the instructions are on the money!
If you get:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
use the instructions at the link below:
http://stackoverflow.com/questions/20011252/hadoop-2-2-0-64-bit-installing-but-cannot-start

Vijay | June 10, 2014


Hi Rahul,
When I execute the command $ bin/start-all.sh, I am getting the FATAL ERROR as shown below
====
[hadoop@localhost hadoop]$ sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
[Fatal Error] core-site.xml:1:1: Content is not allowed in prolog.
14/06/10 01:16:14 FATAL conf.Configuration: error parsing conf core-site.xml
.
.
.
.
Starting namenodes on []
localhost: starting namenode, logging to /opt/hadoop/hadoop/logs/hadoop-hadoop-namenode-localhost.localdomain.out
localhost: [Fatal Error] core-site.xml:1:1: Content is not allowed in prolog.
localhost: starting datanode, logging to /opt/hadoop/hadoop/logs/hadoop-hadoop-datanode-

40 of 43

10/04/2014 11:20 AM

How to Setup Hadoop 1.2.1 on CentOS/RHEL 6/5 - TecAdmin.net

Share this:

http://tecadmin.net/steps-to-install-hadoop-on-centosrhel-6/

[Fatal Error] core-site.xml:1:1: Content is not allowed in prolog.


14/06/10 01:16:25 FATAL conf.Configuration: error parsing conf core-site.xml
org.xml.sax.SAXParseException; systemId: file:/opt/hadoop/hadoop/etc/hadoop/core-site.xml;
lineNumber: 1; columnNumber: 1; Content is not allowed in prolog.
.
.
.
starting yarn daemons
starting resourcemanager, logging to /opt/hadoop/hadoop/logs/yarn-hadoop-resourcemanager-localhost.localdomain.out
[Fatal Error] core-site.xml:1:1: Content is not allowed in prolog.
localhost: starting nodemanager, logging to /opt/hadoop/hadoop/logs/yarn-hadoop-nodemanager-localhost.localdomain.out
localhost: [Fatal Error] core-site.xml:1:1: Content is not allowed in prolog.
===
where core-site.xml is edited like this:
$ vim core-site.xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000/</value>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>
#########################################
please help me to overcome this error.

Rahul | July 25, 2014


Hi Rahul,
I am not able to start the Hadoop services; I am getting an error like:
14/07/24 23:11:27 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at localhost.localdomain/127.0.0.1
************************************************************/
hadoop@localhost hadoop]$ bin/start-all.sh
bash: bin/start-all.sh: No such file or directory

ak | August 21, 2014


Hi Rahul,
Can you please tell me how to configure multiple datanodes on a single machine? I am using Hadoop 2.5.

hoangthang | September 25, 2014


Hi Rahul,
Please help me! I installed following your guide. When I run jps, I get the result below:
18118 Jps
18068 TaskTracker
17948 JobTracker
17861 SecondaryNameNode
17746 DataNode
However, when I run the stop-all.sh command, I get:
no jobtracker to stop
localhost: no tasktracker to stop
no namenode to stop
localhost: no datanode to stop
localhost: no secondarynamenode to stop
Can you explain this for me? Thanks so much!

