DEV Community

Discussion on: Installing and Running Hadoop and Spark on Windows

David Camilo Serrano

Sure,
Here it is:

Volume in drive C has no label.
Volume Serial Number is 8276-D962

Directory of C:\BigData\hadoop-2.9.1\sbin

11/09/2019  09:55 a.m.    &lt;DIR&gt;          .
11/09/2019  09:55 a.m.    &lt;DIR&gt;          ..
16/04/2018  06:52 a.m.             2.752 distribute-exclude.sh
11/09/2019  09:55 a.m.    &lt;DIR&gt;          FederationStateStore
16/04/2018  06:52 a.m.             6.475 hadoop-daemon.sh
16/04/2018  06:52 a.m.             1.360 hadoop-daemons.sh
16/04/2018  06:52 a.m.             1.640 hdfs-config.cmd
16/04/2018  06:52 a.m.             1.427 hdfs-config.sh
16/04/2018  06:52 a.m.             3.148 httpfs.sh
16/04/2018  06:52 a.m.             3.677 kms.sh
16/04/2018  06:52 a.m.             4.134 mr-jobhistory-daemon.sh
16/04/2018  06:52 a.m.             1.648 refresh-namenodes.sh
16/04/2018  06:52 a.m.             2.145 slaves.sh
16/04/2018  06:52 a.m.             1.779 start-all.cmd
16/04/2018  06:52 a.m.             1.471 start-all.sh
16/04/2018  06:52 a.m.             1.128 start-balancer.sh
16/04/2018  06:52 a.m.             1.401 start-dfs.cmd
16/04/2018  06:52 a.m.             3.734 start-dfs.sh
16/04/2018  06:52 a.m.             1.357 start-secure-dns.sh
16/04/2018  06:52 a.m.             1.571 start-yarn.cmd
16/04/2018  06:52 a.m.             1.347 start-yarn.sh
16/04/2018  06:52 a.m.             1.770 stop-all.cmd
16/04/2018  06:52 a.m.             1.462 stop-all.sh
16/04/2018  06:52 a.m.             1.179 stop-balancer.sh
16/04/2018  06:52 a.m.             1.455 stop-dfs.cmd
16/04/2018  06:52 a.m.             3.206 stop-dfs.sh
16/04/2018  06:52 a.m.             1.340 stop-secure-dns.sh
16/04/2018  06:52 a.m.             1.642 stop-yarn.cmd
16/04/2018  06:52 a.m.             1.340 stop-yarn.sh
16/04/2018  06:52 a.m.             4.295 yarn-daemon.sh
16/04/2018  06:52 a.m.             1.353 yarn-daemons.sh
              28 File(s)          61.236 bytes
               3 Dir(s)  101.757.034.496 bytes free
Andrew (he/him)

So start-dfs.cmd works, but start-yarn.cmd doesn't? Weird. They're both in the same directory. That doesn't make much sense.

I'm not sure how I can help further without being at your terminal. I'd say maybe try starting from scratch? Sometimes, it's easy to miss a small step or two.

David Camilo Serrano

Hmm, well, I tried the same process on another machine and it happened again: the same error. The YARN daemons are not running.

I have checked different options, but I haven't been able to find any solution yet.

I don't know if YARN needs some additional installation or something like that, or if there is another environment variable that I am not setting.

I am really lost here.
What commands would you run in my console?

Andrew (he/him)

I would start from scratch, and make sure the correct version (8) of Java is installed, and re-install Hadoop. Then, I would double-check all of the environment variables.

Can you try adding the environment variables as system environment variables, rather than user environment variables? You may need to be an Administrator to do this.

If all of that checks out, and the %PATH% is correct, and all of the .cmd files are on the path, I'm not sure what else I would do. There's no reason why those commands shouldn't work if they're on the %PATH%.
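In case it helps, here is a small diagnostic sketch (in Python, purely as an illustration; it assumes the `JAVA_HOME` and `HADOOP_HOME` variable names from the tutorial) that checks whether those variables are set, whether the `sbin` folder is on the `PATH`, and whether the start scripts actually exist on disk:

```python
import os
from pathlib import Path

def check_hadoop_env(env):
    """Return a list of problems found in a dict of environment variables."""
    problems = []
    # Both of these must be set for the Hadoop .cmd scripts to work
    for var in ("JAVA_HOME", "HADOOP_HOME"):
        if not env.get(var):
            problems.append(f"{var} is not set")
    hadoop_home = env.get("HADOOP_HOME")
    if hadoop_home:
        sbin = os.path.join(hadoop_home, "sbin")
        # The sbin folder must be on PATH for start-yarn.cmd to be found
        if sbin not in env.get("PATH", "").split(os.pathsep):
            problems.append(f"{sbin} is not on PATH")
        # The scripts themselves must exist on disk
        for script in ("start-dfs.cmd", "start-yarn.cmd"):
            if not Path(sbin, script).is_file():
                problems.append(f"{script} not found in {sbin}")
    return problems

if __name__ == "__main__":
    # Report problems for the environment of the current console
    for problem in check_hadoop_env(os.environ):
        print("PROBLEM:", problem)
```

Run it from the same console where you run `start-yarn.cmd`, so it sees the same environment that the script sees.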

David Camilo Serrano

I appreciate your help.

I have already added the variables to the system but the problem is still there.

I would really appreciate it if you could share any other ideas for solving this issue.

I agree it's weird, but it seems to be something related to YARN. I will keep looking for more info and more tricks, and if I solve it I will post the solution here.

Thanks so much.