Jupyter pyspark : no module named pyspark
Use the findspark library to bypass the environment setup process entirely. See the project page for more information: https://github.com/minrk/findspark
Use it like this:
import findspark
# Point findspark at your Spark installation directory
findspark.init('/path_to_spark/spark-x.x.x-bin-hadoopx.x')
# pyspark is now importable
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
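For context, `findspark.init()` works by prepending Spark's bundled Python packages (the `python` directory and the py4j zip shipped inside the distribution) to `sys.path`, so `import pyspark` can resolve. Below is a rough, simplified sketch of that idea; the helper name `add_pyspark_to_path` is my own, not part of findspark's API:

```python
import glob
import os
import sys

def add_pyspark_to_path(spark_home):
    """Prepend Spark's bundled Python packages to sys.path.

    A simplified approximation of what findspark.init() does,
    assuming spark_home is an unpacked Spark distribution.
    """
    python_dir = os.path.join(spark_home, "python")
    # pyspark depends on the py4j zip inside the distribution
    py4j_zips = glob.glob(os.path.join(python_dir, "lib", "py4j-*.zip"))
    paths = [python_dir] + py4j_zips[:1]
    sys.path[:0] = paths  # prepend so these shadow any other copies
    return paths
```

This is also why setting the `SPARK_HOME` environment variable matters: findspark uses it to locate the distribution when you call `findspark.init()` with no arguments.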