Spark: Trying to run spark-shell, but get "'cmd' is not recognized as an internal or external command"
My colleague solved the problem. Although Java itself seemed to work fine (see the screenshot), the Java path Spark was trying to read was incorrect, with an extra \bin at the end. Once that was removed, Spark started working! @gonbe, thank you so much for your efforts to help!
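For anyone hitting the same thing, here is a minimal sketch of how to check and correct this from a Command Prompt. The JDK path below is hypothetical; substitute your actual install directory:

    REM Show the current value; it should point at the JDK root, not at ...\bin
    echo %JAVA_HOME%

    REM Incorrect (extra \bin):  C:\Program Files\Java\jdk1.8.0_181\bin
    REM Correct:                 C:\Program Files\Java\jdk1.8.0_181

    REM Persist the corrected value for new Command Prompt sessions
    REM (or edit it via Control Panel > System > Advanced system settings > Environment Variables)
    setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_181"

Note that setx only affects new sessions, so open a fresh Command Prompt before retrying spark-shell.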
I had a similar error. I fixed it after making the following changes:
- There were multiple Java\bin entries in the system Path. I reduced them to a single Java\bin entry that is in sync with JAVA_HOME.
- Added C:\Windows\System32 to the system Path variable.
- My JAVA_HOME and java.exe were pointing to different places. I fixed that.
Now it works.
Thanks guys.
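To verify the points in the list above, a quick diagnostic sketch from a Command Prompt (no assumptions about your install locations):

    REM Which java.exe actually resolves first on the Path
    where java

    REM The JDK root that JAVA_HOME points at; %JAVA_HOME%\bin\java.exe
    REM should be the same binary that "where java" reports
    echo %JAVA_HOME%

    REM Inspect the Path for duplicate Java\bin entries and confirm
    REM that C:\Windows\System32 is present
    echo %PATH%

Incidentally, where.exe itself lives in C:\Windows\System32, so if that directory is missing from the Path the first command fails with the same "not recognized" error, which is itself a useful hint.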
(I'm not a Windows Spark user.) The spark-shell.cmd source for Windows expects the "cmd" command to be available on PATH.
https://github.com/apache/spark/blob/master/bin/spark-shell.cmd
Would you try adding the directory that contains "cmd.exe" to the PATH environment variable? The directory location is shown in the title bar of your screenshot, and the environment variable can be set via Control Panel.
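A sketch of a quick test, assuming the directory is the usual C:\Windows\System32 (confirm against your title bar) and that you run this from the Spark installation root:

    REM See whether the System32 directory is already on the Path
    echo %PATH%

    REM Add it for the current session only; "set" is a cmd built-in,
    REM so it works even when external commands cannot be resolved
    set PATH=%PATH%;C:\Windows\System32

    REM Then retry the shell
    bin\spark-shell.cmd

If that fixes it, make the change permanent by editing PATH through Control Panel as described above, rather than relying on the per-session "set".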