No module named 'resource' installing Apache Spark on Windows

I struggled all morning with the same problem. Your best bet is to downgrade to Spark 2.3.2 (for example, `pip install pyspark==2.3.2` if you installed PySpark via pip).

The fix can be found at https://github.com/apache/spark/pull/23055.

The `resource` module is only available on Unix/Linux systems and does not exist in a Windows environment. The fix is not yet included in the latest release at the time of writing, but you can modify worker.py in your installation as shown in the pull request. The changes to that file can be found at https://github.com/apache/spark/pull/23055/files.
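For reference, the change amounts to guarding the `resource` import so it is skipped on platforms that lack it. Here is a minimal sketch of that pattern; it mirrors the approach in the pull request but is not the exact diff (see the link above for the real change):

```python
import sys

# 'resource' only exists on Unix-like systems, so guard the import
# instead of assuming it is available (this is the pattern applied
# to worker.py in the pull request).
has_resource_module = True
try:
    import resource
except ImportError:
    has_resource_module = False

def set_memory_limit(memory_limit_mb):
    # Only attempt to apply an rlimit when the platform supports it;
    # on Windows this silently becomes a no-op.
    if memory_limit_mb > 0 and has_resource_module:
        soft, hard = resource.getrlimit(resource.RLIMIT_AS)
        new_limit = memory_limit_mb * 1024 * 1024
        if soft == resource.RLIM_INFINITY or new_limit < soft:
            resource.setrlimit(resource.RLIMIT_AS, (new_limit, hard))
```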

You will then have to re-zip the pyspark directory and move the archive to the lib folder in your Spark installation directory (where you extracted the pre-built Spark according to the tutorial you mentioned); a scripted version of this step is sketched below.
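If you'd rather script the re-zip step, something like the following works. The paths are assumptions based on a typical pre-built Spark layout (python/pyspark sources, python/lib/pyspark.zip); adjust SPARK_HOME to your own installation:

```python
import os
import shutil

# Assumed location of the extracted pre-built Spark; adjust as needed.
spark_home = os.environ.get("SPARK_HOME", r"C:\spark")
python_dir = os.path.join(spark_home, "python")
lib_dir = os.path.join(python_dir, "lib")

# Rebuild pyspark.zip from the patched python/pyspark directory and
# drop it into python/lib, replacing the shipped archive.
archive = shutil.make_archive(
    os.path.join(lib_dir, "pyspark"),  # produces python/lib/pyspark.zip
    "zip",
    root_dir=python_dir,               # zip entries start at 'pyspark/...'
    base_dir="pyspark",
)
print("Wrote", archive)
```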