DEV Community

Discussion on: Installing and Running Hadoop and Spark on Windows

 
Andrew (he/him)

Right, so Hadoop is working fine. You don't normally run yarn by hand; YARN (Yet Another Resource Negotiator) is the resource manager that Hadoop uses behind the scenes, alongside HDFS (the Hadoop Distributed File System), to manage everything.

If you successfully ran start-yarn.cmd and start-dfs.cmd, you're good to go! Try uploading a file to HDFS with:

C:\> hadoop fs -put <file name here> /

...and checking that it's been uploaded with

C:\> hadoop fs -ls /
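If you don't have a file handy, a quick end-to-end check might look something like this (assuming hadoop is on your PATH and the NameNode is up; test.txt is just a throwaway file name):

C:\> echo hello > test.txt
C:\> hadoop fs -put test.txt /
C:\> hadoop fs -ls /
C:\> hadoop fs -cat /test.txt

If that last command prints "hello" back, HDFS is accepting and serving files.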

David Camilo Serrano

Hi,
Thanks for your answer.
But the problem is exactly that. When I run the command start-yarn.cmd I get:

This file does not have an app associated with it for performing this action. Please install an app or, if one is already installed, create an association in the default apps settings page.

So I looked at the contents of start-yarn.cmd and saw that it calls the yarn command. I then tried running yarn in a separate console and got the same error. That's why I think the problem is the yarn command itself.

Andrew (he/him)

Okay, I think we're getting close. Can you echo %PATH% and share the result?

start-yarn.cmd should be within the Hadoop /sbin directory. If you haven't added it to your path correctly, maybe that's why you can't access it.
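One quick sanity check, assuming you're in a plain cmd prompt, is to ask Windows where it resolves those commands from:

C:\> where start-yarn.cmd
C:\> where start-dfs.cmd
C:\> where yarn

If where can't find start-yarn.cmd, it's a PATH problem; if it finds it but running it still fails, the problem is with the file itself or with how .cmd files are being launched on your machine.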

David Camilo Serrano

Thanks for the answer.

Here it is: echo %path%

Result:
C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\ProgramData\Oracle\Java\javapath;E:\app\dserranoa\product\11.2.0\client_1;E:\app\dserranoa\product\11.2.0\client_1\bin;C:\oraclexe\app\oracle\product\11.2.0\server\bin;;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\Program Files\TortoiseGit\bin;C:\Program Files\PuTTY\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files\Microsoft\Web Platform Installer\;C:\Program Files (x86)\Microsoft SDKs\Azure\CLI\wbin;C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Binn\;C:\Program Files\Microsoft SQL Server\110\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\;C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\;C:\Program Files\nodejs\;C:\Program Files\Microsoft SQL Server\110\DTS\Binn\;C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\;C:\Program Files (x86)\Bitvise SSH Client;C:\Program Files\dotnet\;C:\Program Files\Microsoft Service Fabric\bin\Fabric\Fabric.Code;C:\Program Files\Microsoft SDKs\Service Fabric\Tools\ServiceFabricLocalClusterManager;C:\Program Files (x86)\Brackets\command;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\130\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\;C:\Program Files\Microsoft SQL Server\140\Tools\Binn\;C:\Program Files\Microsoft SQL Server\140\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\;C:\Program Files (x86)\Microsoft SQL Server\150\DTS\Binn\;C:\Program Files\Java\jdk1.8.0_121\bin;C:\Program Files\MySQL\MySQL Shell 8.0\bin;C:\Users\dserranoa\AppData\Local\Microsoft\WindowsApps;C:\Progra~1\Java\jdk1.8.0_121;C:\BigData\hadoop-2.9.1;C:\BigData\hadoop-2.9.1\bin;C:\BigData\hadoop-2.9.1\sbin

I have attached the image of my environment variables.

Andrew (he/him)

Huh. Can you run:

C:\> dir C:\BigData\hadoop-2.9.1\sbin

...and give the result?

David Camilo Serrano

Sure,
Here it is:

 Volume in drive C has no label.
 Volume Serial Number is 8276-D962

 Directory of C:\BigData\hadoop-2.9.1\sbin

11/09/2019  09:55 a.m.    <DIR>          .
11/09/2019  09:55 a.m.    <DIR>          ..
16/04/2018  06:52 a.m.             2.752 distribute-exclude.sh
11/09/2019  09:55 a.m.    <DIR>          FederationStateStore
16/04/2018  06:52 a.m.             6.475 hadoop-daemon.sh
16/04/2018  06:52 a.m.             1.360 hadoop-daemons.sh
16/04/2018  06:52 a.m.             1.640 hdfs-config.cmd
16/04/2018  06:52 a.m.             1.427 hdfs-config.sh
16/04/2018  06:52 a.m.             3.148 httpfs.sh
16/04/2018  06:52 a.m.             3.677 kms.sh
16/04/2018  06:52 a.m.             4.134 mr-jobhistory-daemon.sh
16/04/2018  06:52 a.m.             1.648 refresh-namenodes.sh
16/04/2018  06:52 a.m.             2.145 slaves.sh
16/04/2018  06:52 a.m.             1.779 start-all.cmd
16/04/2018  06:52 a.m.             1.471 start-all.sh
16/04/2018  06:52 a.m.             1.128 start-balancer.sh
16/04/2018  06:52 a.m.             1.401 start-dfs.cmd
16/04/2018  06:52 a.m.             3.734 start-dfs.sh
16/04/2018  06:52 a.m.             1.357 start-secure-dns.sh
16/04/2018  06:52 a.m.             1.571 start-yarn.cmd
16/04/2018  06:52 a.m.             1.347 start-yarn.sh
16/04/2018  06:52 a.m.             1.770 stop-all.cmd
16/04/2018  06:52 a.m.             1.462 stop-all.sh
16/04/2018  06:52 a.m.             1.179 stop-balancer.sh
16/04/2018  06:52 a.m.             1.455 stop-dfs.cmd
16/04/2018  06:52 a.m.             3.206 stop-dfs.sh
16/04/2018  06:52 a.m.             1.340 stop-secure-dns.sh
16/04/2018  06:52 a.m.             1.642 stop-yarn.cmd
16/04/2018  06:52 a.m.             1.340 stop-yarn.sh
16/04/2018  06:52 a.m.             4.295 yarn-daemon.sh
16/04/2018  06:52 a.m.             1.353 yarn-daemons.sh
              28 File(s)          61.236 bytes
               3 Dir(s)  101.757.034.496 bytes free

Andrew (he/him)

So start-dfs.cmd works, but start-yarn.cmd doesn't? Weird. They're both in the same directory. That doesn't make much sense.

I'm not sure how I can help further without being at your terminal. I'd say maybe try starting from scratch? Sometimes, it's easy to miss a small step or two.
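Before you do, it might be worth comparing what the two scripts actually call. Something like this, with the paths taken from the %PATH% you posted above:

C:\> type C:\BigData\hadoop-2.9.1\sbin\start-yarn.cmd
C:\> type C:\BigData\hadoop-2.9.1\sbin\start-dfs.cmd
C:\> dir C:\BigData\hadoop-2.9.1\bin\yarn.cmd

You said start-yarn.cmd just calls the yarn command, so if yarn.cmd is missing or empty in the bin directory, that could explain why start-dfs.cmd works but start-yarn.cmd doesn't.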

David Camilo Serrano

Mmm, well, I tried the same process on another machine and it happened again: the same error. The yarn daemons are not running.

I have checked different options but I haven't been able to find a solution yet.

I don't know if yarn needs some additional installation or something like that, or if there is another environment variable that I'm not setting up.

I am really lost here.
What kind of commands would you run in my console?

Andrew (he/him)

I would start from scratch: make sure the correct version of Java (version 8) is installed and re-install Hadoop. Then I would double-check all of the environment variables.

Can you try adding the environment variables as system environment variables, rather than user environment variables? You may need to be an Administrator to do this.

If all of that checks out, the %PATH% is correct, and all of the .cmd files are on it, I'm not sure what else I would try; there's no reason those commands shouldn't work if they're on the %PATH%.
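For what it's worth, the checks I'd run are roughly these (the expected values are based on your setup: JDK 8 and Hadoop under C:\BigData\hadoop-2.9.1):

C:\> echo %JAVA_HOME%
C:\> echo %HADOOP_HOME%
C:\> java -version
C:\> hadoop version
C:\> assoc .cmd

JAVA_HOME and HADOOP_HOME should point at the JDK 8 and hadoop-2.9.1 directories, and assoc .cmd should come back as .cmd=cmdfile. The "no app associated with it" error you quoted is the kind of thing you sometimes see when that .cmd association has been changed, so that last check might be worth a look too.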

David Camilo Serrano

I appreciate your help.

I have already added the variables to the system but the problem is still there.

I would really appreciate any other ideas you have for solving this issue.

I also think it's weird, but it does seem to be something related to yarn. I will keep looking for more info and more tricks, and if I solve it I will post here.

Thanks so much.