How to make my Python module available system wide on Linux?

Put it in one of the directories listed when you print sys.path at your Python prompt. You can also add the directory that contains your file by modifying the PYTHONPATH environment variable:

# ~/.bashrc file
export PYTHONPATH+=:/some/dir
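To check that the new directory is actually picked up, print sys.path from a fresh shell (the exact entries will differ on your system):

python -c "import sys; print(sys.path)"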

If you're using Ubuntu, copy the files to /usr/local/lib/python2.7/dist-packages. The following command will show you where to copy them:

python -c "from distutils.sysconfig import *; print(get_python_lib())"

If you are the only one using the module, copy the files to ~/.local/lib/python2.7/site-packages instead.
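If you are unsure where your per-user site-packages directory is, Python can tell you (the printed path will vary with your Python version):

python -m site --user-site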


The answer is: it's all about permissions.

It's not enough to place the file in the correct location (for instance /usr/local/lib/python2.7/dist-packages); you also need to make sure that the file can be read by the process you're running, in this case Python.

Make sure that "other" users have read access to the file. Open a bash console and run:

sudo chmod o+r "yourmodule.py"
[enter your password when prompted]
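You can confirm the permission bits with ls -l; look at the last permission triplet in the listing:

ls -l yourmodule.py
# a mode such as -rw-r--r-- ends in "r--",
# which means "other" users can read the file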

After this, go back to Python and try the import:

import "yourmodule"

As long as the directory where the .py file is located is present on sys.path (or in PYTHONPATH) and the file is readable, you should be able to import it.
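A quick way to check both conditions from Python itself (the path below is a placeholder; adjust it to where your module actually lives):

import os, sys

path = "/usr/local/lib/python2.7/dist-packages/yourmodule.py"  # adjust to your file
print(os.path.dirname(path) in sys.path)  # is the directory on the search path?
print(os.access(path, os.R_OK))           # can the current user read the file?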


There are standard ways to install Python modules system-wide. You may want to take a look at distutils. A good tutorial for distutils2 (its successor) can be found here.

You basically have to write a setup.py file that tells distutils what to do. Then you can simply run

python setup.py install

with root permissions to install your module system-wide. There are good and easy examples available, and it's the cleanest way I can think of.
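For reference, a minimal setup.py might look like this (the module name and version are placeholders for your own values):

# setup.py -- minimal sketch for installing a single-file module
from distutils.core import setup

setup(
    name="yourmodule",          # placeholder package name
    version="0.1",
    py_modules=["yourmodule"],  # installs yourmodule.py into site-packages
)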