PySpark will not start - 'python': No such file or directory
You should set
export PYSPARK_PYTHON=python3
instead of
export PYSPARK_PYTHON=python3.6.5
in your .profile, and then run source .profile, of course.
That worked for me.
As for the other option, installing Python with sudo apt install python is not appropriate, since that package is Python 2.x.
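For a quick sanity check that the value you set actually resolves to a real interpreter (a sketch; it assumes a typical Linux setup with Python 3 on your PATH):

which python3
echo $PYSPARK_PYTHON

which python3 should print a path such as /usr/bin/python3, and echo $PYSPARK_PYTHON should print python3 once the updated .profile has been sourced; if either comes back empty, PySpark will keep trying to launch a python binary that may not exist, which is exactly the 'python': No such file or directory error.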
For those who may come across this, I figured it out!
I specifically chose to use an older version of Spark, 2.1.0, in order to follow along with a tutorial I was watching. I did not know that recent Python releases (3.6.5 in my case) are incompatible with Spark 2.1.0; Python 3.6 support was only added in Spark 2.1.1. Thus PySpark would not launch.
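To see whether you are hitting the same mismatch, you can check both versions before launching (a quick sketch; it assumes spark-submit is on your PATH):

python3 --version
spark-submit --version

If that shows Python 3.6.x against Spark 2.1.0, you need either a Python version that Spark 2.1.0 supports (which is what I did below) or Spark 2.1.1 or later.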
I solved this by using Python 2.7 and setting the path accordingly in .bashrc
export PYTHONPATH=$PYTHONPATH:/usr/lib/python2.7
export PYSPARK_PYTHON=python2.7
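After adding those two lines, reload the shell configuration and try again (a sketch; it assumes pyspark is on your PATH and Python 2.7 is installed):

source ~/.bashrc
python2.7 --version
pyspark

The pyspark shell should now start and report Python 2.7 as its interpreter instead of failing with 'python': No such file or directory.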