You can't run the hadoop command on the other Pis until you've copied over those hadoop directories. If you have done that, you also need to make sure that directory is on the $PATH of the other Pis by adding the following lines to each of their .bashrc files (sorry, I don't think I included this step in the instructions):
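The exact lines aren't shown here; as a sketch, assuming Hadoop was unpacked to /opt/hadoop (as in the copy step below), something like this should do it:

```shell
# Assumption: Hadoop lives in /opt/hadoop; adjust if yours differs.
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
```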
Hi. Thanks for the reply. Yes, I did the steps you mentioned. Since Java wasn't pre-installed, I installed it manually on each Pi and checked each one individually to verify it was working. As you can see below, the env variables are configured as you suggested.
Did you follow these steps?
Create the Directories
Create the required directories on all other Pis using:
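The original command isn't reproduced here; as a sketch, assuming the other Pis are reachable as pi2 through pi4 (placeholder hostnames) under the default pi user, a plain ssh loop works:

```shell
# Hostnames pi2..pi4 and the pi user are assumptions; substitute your own.
for host in pi2 pi3 pi4; do
  ssh "$host" 'sudo mkdir -p /opt/hadoop && sudo chown pi:pi /opt/hadoop'
done
```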
Copy the Configuration
Copy the files in /opt/hadoop to each other Pi using:
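Again assuming placeholder hostnames pi2 through pi4, one way to do the copy is with rsync (scp -r works too, but rsync preserves permissions and can resume an interrupted transfer):

```shell
# Hostnames are placeholders; substitute your own.
for host in pi2 pi3 pi4; do
  rsync -a /opt/hadoop/ "$host":/opt/hadoop/
done
```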
This will take quite a long time, so go grab lunch.
When you're back, verify that the files copied correctly by querying the Hadoop version on each node with the following command:
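A sketch of the check, assuming the same placeholder hostnames: run hadoop version over ssh on each node and print the first line of its output.

```shell
# Hostnames are placeholders; substitute your own.
for host in pi1 pi2 pi3 pi4; do
  echo -n "$host: "
  ssh "$host" 'hadoop version | head -n 1'
done
```

If this prints the version on Pi #1 but fails on the others, the hadoop directories or the PATH setup haven't made it to those nodes yet.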
You could also simply clusterscp the .bashrc file from Pi #1 to each of the other Pis.
Thanks. I resolved the issue by putting the PATH exports above the following part in .bashrc:
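For anyone hitting the same problem: the part in question is presumably the standard non-interactive guard near the top of the default Raspbian/Debian .bashrc. Non-interactive shells (like the ones ssh spawns to run a single command) return at this point, so any exports placed below it never take effect over ssh:

```shell
# Default Debian ~/.bashrc guard: non-interactive shells stop here,
# so PATH exports must appear above it to work for `ssh host command`.
case $- in
    *i*) ;;
      *) return;;
esac
```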
I also put the export PATH commands in /etc/profile on each Pi. Thanks.