In this post, I will walk through the step-by-step process of installing Hadoop 2.6.0 on Ubuntu.
First, note which Ubuntu distribution/version you are running, since the steps below assume a recent Ubuntu release.
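One quick way to check the version (a hedged sketch: `lsb_release` ships with stock Ubuntu, and `/etc/os-release` or `uname` serve as fallbacks on other systems):

```shell
# Print the distribution/version; fall back gracefully if lsb_release
# is not installed on this system.
lsb_release -a 2>/dev/null || cat /etc/os-release 2>/dev/null || uname -a
```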
The Hadoop framework is written in Java, so a working Java installation is required.
k@laptop:~$ cd ~
# Update the source list
k@laptop:~$ sudo apt-get update
# The OpenJDK project is the default version of Java
# that is provided from a supported Ubuntu repository.
k@laptop:~$ sudo apt-get install default-jdk
k@laptop:~$ java -version
java version "1.7.0_65"
OpenJDK Runtime Environment (IcedTea 2.5.3) (7u71-2.5.3-0ubuntu0.14.04.1)
OpenJDK 64-Bit Server VM (build 24.65-b04, mixed mode)
Adding a dedicated Hadoop user
k@laptop:~$ sudo addgroup hadoop
Adding group `hadoop' (GID 1002) ...
k@laptop:~$ sudo adduser --ingroup hadoop hduser
Adding user `hduser' ...
Adding new user `hduser' (1001) with group `hadoop' ...
Creating home directory `/home/hduser' ...
Copying files from `/etc/skel' ...
Enter new UNIX password:
Retype new UNIX password:
passwd: password updated successfully
Changing the user information for hduser
Enter the new value, or press ENTER for the default
Full Name :
Room Number :
Work Phone :
Home Phone :
Is the information correct? [Y/n] Y
SSH has two main components:
- ssh: the client command we use to connect to remote machines.
- sshd: the daemon that runs on the server and accepts client connections.
The ssh client is typically pre-installed on Linux, but to run the sshd daemon we need to install the ssh package first. Use this command to do that:
k@laptop:~$ sudo apt-get install ssh
This installs ssh on our machine. If both of the following commands return a path, it is set up properly:
k@laptop:~$ which ssh
k@laptop:~$ which sshd
Create and Set Up SSH Certificates
Hadoop requires SSH access to manage its nodes, i.e. remote machines plus our local machine. For our single-node setup of Hadoop, we therefore need to configure SSH access to localhost.
So, we need SSH up and running on our machine, configured to allow SSH public-key authentication.
Hadoop uses SSH (to access its nodes) which would normally require the user to enter a password. However, this requirement can be eliminated by creating and setting up SSH certificates using the following commands. If asked for a filename just leave it blank and press the enter key to continue.
k@laptop:~$ su hduser
hduser@laptop:~$ ssh-keygen -t rsa -P ""
Generating public/private rsa key pair.
Enter file in which to save the key (/home/hduser/.ssh/id_rsa):
Created directory '/home/hduser/.ssh'.
Your identification has been saved in /home/hduser/.ssh/id_rsa.
Your public key has been saved in /home/hduser/.ssh/id_rsa.pub.
The key fingerprint is:
The key's randomart image is:
+--[ RSA 2048]----+
|       .oo.o     |
|      . .o=. o   |
|     . + .  o .  |
|      o =    E   |
|       S +       |
|        . +      |
|         O +     |
|          O o    |
|           o..   |
+-----------------+
hduser@laptop:/home/k$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
The second command adds the newly created key to the list of authorized keys so that Hadoop can use ssh without prompting for a password.
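The two steps above can be sketched as one short script. A scratch directory is used here purely so the sketch is safe to run anywhere; for the real setup, the files live under `$HOME/.ssh` as shown above:

```shell
# Sketch of the key-generation and authorization steps (assumes the
# openssh-client package is installed). Replace $KEYDIR with $HOME/.ssh
# for the actual setup.
KEYDIR=$(mktemp -d)
ssh-keygen -t rsa -P "" -f "$KEYDIR/id_rsa" -q          # empty passphrase
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"   # authorize our own key
chmod 600 "$KEYDIR/authorized_keys"                     # sshd rejects lax permissions
ls "$KEYDIR"
```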
We can check if ssh works:
hduser@laptop:/home/k$ ssh localhost
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is e1:8b:a0:a5:75:ef:f4:b4:5e:a9:ed:be:64:be:5c:2f.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
Welcome to Ubuntu 14.04.1 LTS (GNU/Linux 3.13.0-40-generic x86_64)
hduser@laptop:~$ tar xvzf hadoop-2.6.0.tar.gz
This extracts the archive into a folder named hadoop-2.6.0. Rename that folder to plain hadoop:
hduser@laptop:~$ mv hadoop-2.6.0 hadoop
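If you still need the release tarball, old Hadoop versions are kept on the Apache archive (the exact URL is an assumption; mirrors vary). The sketch below demonstrates the unpack-and-rename step using a scratch tarball in place of the real one, so it is safe to run anywhere:

```shell
# To fetch the real release (URL assumed; mirrors may differ):
#   wget https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0/hadoop-2.6.0.tar.gz
# A scratch tarball stands in for the real one below.
WORK=$(mktemp -d) && cd "$WORK"
mkdir hadoop-2.6.0 && echo stub > hadoop-2.6.0/NOTICE.txt
tar czf hadoop-2.6.0.tar.gz hadoop-2.6.0 && rm -r hadoop-2.6.0
tar xzf hadoop-2.6.0.tar.gz    # recreates the hadoop-2.6.0 folder
mv hadoop-2.6.0 hadoop         # rename it to plain "hadoop"
ls hadoop
```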
We want to move the Hadoop installation to the /usr/local/hadoop directory using the following command:
hduser@laptop:~$ sudo mv hadoop /usr/local/
[sudo] password for hduser:
hduser is not in the sudoers file. This incident will be reported.
Oops! We got:
"hduser is not in the sudoers file. This incident will be reported."
This error can be resolved by logging in as a user that already has sudo rights (the earlier user k) and adding hduser to the sudo group:
hduser@laptop:~$ su k
k@laptop:/home/hduser$ sudo adduser hduser sudo
[sudo] password for k:
Adding user `hduser' to group `sudo' ...
Adding user hduser to group sudo
Now that hduser has sudo privileges, we can move the Hadoop installation to the /usr/local/hadoop directory without any problem:
k@laptop:/home/hduser$ sudo su hduser
hduser@laptop:~/$ sudo mv hadoop /usr/local/
hduser@laptop:~/$ sudo chown -R hduser:hadoop /usr/local/hadoop
Set Up Configuration Files
The following files have to be modified to complete the Hadoop setup. In this part we edit ~/.bashrc and /usr/local/hadoop/etc/hadoop/hadoop-env.sh; the remaining configuration files are covered in Part II.
Before editing the .bashrc file in our home directory, we need to find the path where Java has been installed to set the JAVA_HOME environment variable using the following command:
hduser@laptop:~$ update-alternatives --config java
There is only one alternative in link group java (providing /usr/bin/java): /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java
Nothing to configure.
Now we can append the following to the end of ~/.bashrc:
hduser@laptop:~$ nano ~/.bashrc
#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
#HADOOP VARIABLES END
hduser@laptop:~$ source ~/.bashrc
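A minimal sketch to confirm that sourcing the variable block takes effect. A scratch file stands in for ~/.bashrc so this is safe to run anywhere; the paths assume the install locations used in this guide:

```shell
# Write the core variables to a scratch file, source it, and verify.
ENVFILE=$(mktemp)
cat > "$ENVFILE" <<'EOF'
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
EOF
. "$ENVFILE"
echo "$HADOOP_INSTALL"
```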
Note that JAVA_HOME should be set to the portion of the path just before '/bin/':
hduser@ubuntu-VirtualBox:~$ javac -version
hduser@ubuntu-VirtualBox:~$ which javac
hduser@ubuntu-VirtualBox:~$ readlink -f /usr/bin/javac
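The rule "JAVA_HOME is the path just before /bin/" can be expressed directly in the shell. The literal path below assumes the OpenJDK location shown earlier by update-alternatives; on your machine, feed the readlink output instead:

```shell
# Strip the trailing /bin/javac from the resolved path to obtain JAVA_HOME.
# On a live system: RESOLVED=$(readlink -f /usr/bin/javac)
RESOLVED=/usr/lib/jvm/java-7-openjdk-amd64/bin/javac
JAVA_HOME=${RESOLVED%/bin/javac}
echo "$JAVA_HOME"
```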
We need to set JAVA_HOME by modifying the hadoop-env.sh file:
hduser@laptop:~$ nano /usr/local/hadoop/etc/hadoop/hadoop-env.sh
In that file, set the JAVA_HOME line to the path found above:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
Adding this statement to hadoop-env.sh ensures that the value of JAVA_HOME is available to Hadoop whenever it starts up.
Please check Part II here.