Can I run a pyspark jupyter notebook in cluster deploy mode?
On a standalone cluster you cannot use cluster deploy mode with a Python application at all. The Spark documentation states:

Currently, standalone mode does not support cluster mode for Python applications.
And even if you could, cluster deploy mode is not applicable in an interactive environment anyway; spark-submit rejects it explicitly (from SparkSubmit.scala):
case (_, CLUSTER) if isShell(args.primaryResource) =>
error("Cluster deploy mode is not applicable to Spark shells.")
case (_, CLUSTER) if isSqlShell(args.mainClass) =>
error("Cluster deploy mode is not applicable to Spark SQL shell.")