Website downloader (cache?) to view sites offline

I use HTTrack.

It allows you to download a World Wide Web site from the Internet to a local directory, recursively rebuilding all directories and fetching the HTML, images, and other files from the server onto your computer.

I use wget with these options to mirror a site for offline use

wget -cmkE -np <url>


-m turns on options suited to mirroring a site locally (recursion, timestamping, infinite recursion depth)

-c continues a previous download, in case some pages have already been fetched

-k converts the links in downloaded documents to point to the local copies, so the site is browsable offline

-E ensures downloaded pages get an .html extension where needed (e.g. pages served as .php or .asp).

-np (no-parent) restricts the download to files below the starting directory (e.g. /a/section/i/), so it does not mirror the whole site.

For example, I wanted to download the Django South documentation but not the South tickets, etc., so I pointed the same command at the docs subtree:

wget -cmkE -np <south docs URL>
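Spelled out with long option names (and a placeholder URL, since the real target is site-specific), the same invocation looks like this:

```shell
# Placeholder URL; substitute the docs subtree you actually want to mirror.
url="https://example.com/a/section/i/"

# Long-form equivalents of -c -m -k -E -np:
cmd="wget --continue --mirror --convert-links --adjust-extension --no-parent $url"
echo "$cmd"

# Uncomment to actually run the mirror:
# $cmd
```

The long names make the intent obvious in scripts and shell history, and --no-parent is the flag that keeps the crawl from escaping into the rest of the site.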

I use Windows and run wget under Cygwin, but there is also a native Windows port of wget.

In your case, though, you can simply download the offline Python docs from the Python documentation download page.
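The official Python site publishes the docs as zipped HTML archives, so no crawling is needed. A minimal sketch, assuming a particular archive name (check https://docs.python.org/3/download.html for the version you actually want):

```shell
# Hypothetical archive name; verify it on the download page before running.
archive="python-3.12.1-docs-html.zip"

cmd="wget https://docs.python.org/3/archives/$archive && unzip $archive"
echo "$cmd"

# Uncomment to actually fetch and unpack the docs:
# wget "https://docs.python.org/3/archives/$archive" && unzip "$archive"
```

Unzipping the archive gives you the same HTML tree you would get from a mirror, already self-contained for offline browsing.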