Sunday, December 7, 2014

A Few Common Problems You May Face While Installing Hadoop

Here are a few common problems that people run into while installing Hadoop.

1. Problem with SSH configuration.
   error: connection refused to port 22

2. NameNode not reachable
   error: Retrying connect to server: 127.0.0.1

1. Problem with SSH configuration: In this case you may face many kinds of errors, but the most common one while installing Hadoop is "connection refused to port 22". First, check that the machine you are trying to log in to actually has an SSH server installed.
   
If you are using Ubuntu, you can install the SSH server with the following command.
   
   $sudo apt-get install openssh-server
   
On CentOS or RedHat you can install the SSH server using the yum package manager.
   
   $sudo yum install openssh-server
   
After you have installed the SSH server, make sure you have configured the keys properly and shared your public key with the machine you want to log in to, as in the sketch below.
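For example, on a single-node setup where you SSH into the same machine, a minimal key-setup sketch looks like this (for a remote machine, append the public key to that machine's $HOME/.ssh/authorized_keys instead, e.g. with ssh-copy-id):

   # generate an RSA key pair with an empty passphrase
   $ssh-keygen -t rsa -P ""

   # authorize this key for logins to the same machine
   $cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys

   # test: this should now log you in without asking for a password
   $ssh localhost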
   
If the problem persists, check the SSH configuration on your machine in the /etc/ssh/sshd_config file. Use the following command to open it:
   
   $sudo gedit /etc/ssh/sshd_config
   
In this file, RSA authentication should be set to yes, and public-key (passwordless) authentication should also be set to yes.
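The relevant lines should look roughly like this (PubkeyAuthentication is the directive that enables passwordless, key-based login; exact directives can vary between OpenSSH versions):
   
   RSAAuthentication yes
   PubkeyAuthentication yes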
   
After this, close the file and restart SSH with the following command.
   
   $sudo /etc/init.d/ssh restart
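On CentOS or RedHat the service is usually called sshd, so the equivalent command would be something like:
   
   $sudo service sshd restart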
   
Now your problem should be resolved. Apart from this error you may face one more issue: even though you have configured the keys correctly, SSH still prompts for a password. In that case, check whether your keys are being managed by the SSH agent. Your keys should be in the $HOME/.ssh folder. Run the following command to add them to the agent.
   
   $ssh-add
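If ssh-add complains that it cannot connect to the authentication agent, you may need to start one in your shell first, for example:

   # start an agent in the current shell, then load the default keys
   $eval $(ssh-agent)
   $ssh-add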


   
2. If your NameNode is not reachable, the first thing you should check is the daemons running on the NameNode machine. You can check that with the following command, which lists all the Java processes running on the machine.

   $jps
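On a healthy single-node Hadoop 1.x installation, the output typically looks something like this (the process IDs will differ, and the exact list of daemons depends on your setup):
   
   4825 NameNode
   5029 DataNode
   5236 SecondaryNameNode
   5542 JobTracker
   5754 TaskTracker
   5391 Jps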
   
If you don't see NameNode in the output list, do the following. Stop Hadoop with the following command.
   
   $HADOOP_HOME/bin/stop-all.sh
   
Format the NameNode using the following command. (Note: formatting erases any existing HDFS metadata, and with it your HDFS data, which is normally acceptable on a fresh install.)
   
   $HADOOP_HOME/bin/hadoop namenode -format
   
Start Hadoop with the following command.
   
   $HADOOP_HOME/bin/start-all.sh
   
This time the NameNode should run. If you are still not able to start the NameNode, check the core-site.xml file in the conf directory of Hadoop with the following command.
   
   $gedit $HADOOP_HOME/conf/core-site.xml
   
Check the value of the property hadoop.tmp.dir. It should be set to a path where the user who is trying to run Hadoop has write permissions. If you don't want to scratch your head over this, set it to a hadoop_tmp directory under your home directory (note that $HOME is not expanded inside the XML, so use an absolute path; see the sketch below). Now save and close the file, format the NameNode again, and try starting Hadoop again. Things should work this time.
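A minimal sketch of what core-site.xml might look like on a Hadoop 1.x single-node setup, assuming a user named hduser (replace /home/hduser with your own home directory, and create the directory first with something like mkdir -p $HOME/hadoop_tmp):

   <configuration>
     <!-- base directory for Hadoop's temporary/working files;
          the user running Hadoop must be able to write here -->
     <property>
       <name>hadoop.tmp.dir</name>
       <value>/home/hduser/hadoop_tmp</value>
     </property>
     <!-- NameNode address for a single-node setup; this is the
          address behind the "Retrying connect to 127.0.0.1" error -->
     <property>
       <name>fs.default.name</name>
       <value>hdfs://localhost:9000</value>
     </property>
   </configuration>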

That's all for this post. Please share the problems you are facing, and we will try to solve them together. Stay tuned for more stuff :)
   
