How can I download an entire website?
HTTrack works like a champ for copying the contents of an entire site. It can even grab the pieces needed to make a website with active code content work offline. I am amazed at the stuff it can replicate.
This program will do all you require of it.
Happy hunting!
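If you prefer the command line over HTTrack's GUI, the httrack command does the same job. A rough sketch (the URL, output folder, and filter pattern are placeholders; adjust them for your site):

httrack "https://example.com/" -O "./example-mirror" "+*.example.com/*"

Here -O sets the output directory and the "+*.example.com/*" filter keeps the crawl on that domain instead of wandering off to external links.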
Wget is a classic command-line tool for this kind of task. It comes with most Unix/Linux systems, and you can get it for Windows too. On a Mac, Homebrew is the easiest way to install it (brew install wget).
You'd do something like:
wget -r --no-parent http://site.com/songs/
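If the site is large or the server is touchy about crawlers, you can make the recursive download a bit more polite. A sketch (the URL and the limits are just example values):

wget -r --no-parent --wait=1 --limit-rate=200k http://site.com/songs/

--wait=1 pauses one second between requests and --limit-rate=200k caps the download speed, which makes it less likely you'll get blocked mid-mirror.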
For more details, see the Wget manual and its examples, or have a look at these:
wget: Download entire websites easy
Wget examples and scripts
Use wget:
wget -m -p -E -k www.example.com
The options explained:
-m, --mirror Turns on recursion and time-stamping, sets infinite
recursion depth, and keeps FTP directory listings.
-p, --page-requisites Get all images, etc. needed to display HTML page.
-E, --adjust-extension Save HTML/CSS files with .html/.css extensions.
-k, --convert-links Make links in downloaded HTML point to local files.
-np, --no-parent Don't ascend to the parent directory when retrieving
recursively. This guarantees that only the files below
a certain hierarchy will be downloaded. Requires a slash
at the end of the directory, e.g. example.com/foo/.
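Putting it together, a sketch that combines the mirroring flags above with -np so the crawl stays inside one directory (example.com/foo/ is just a placeholder):

wget -m -p -E -k -np http://www.example.com/foo/

Without -np, wget will happily follow links up into the parent directories of the URL you gave it, so add it whenever you only want a subtree of the site.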