Getting a directory listing over HTTP
I just figured out a way to do it:
$ wget --spider -r --no-parent http://some.served.dir.ca/
It's quite verbose, so you need to pipe through grep a couple of times depending on what you're after, but the information is all there. It looks like it prints to stderr, so append 2>&1 to let grep at it. I grepped for "\.tar\.gz" to find all of the tarballs the site had to offer.
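The filtering step above can be sketched like this. The real pipeline would be `wget --spider -r --no-parent http://some.served.dir.ca/ 2>&1 | grep ...`, but since that needs a live server, the log lines below are simulated (the URL and file names are placeholders); the grep/sort part is the actual technique:

```shell
# Simulated wget --spider log lines, to show what the grep matches.
# In practice, replace the printf with:
#   wget --spider -r --no-parent http://some.served.dir.ca/ 2>&1
printf '%s\n' \
  '--2024-01-01 12:00:00--  http://some.served.dir.ca/pkg/foo-1.0.tar.gz' \
  '--2024-01-01 12:00:01--  http://some.served.dir.ca/index.html' \
  '--2024-01-01 12:00:02--  http://some.served.dir.ca/pkg/bar-2.1.tar.gz' |
  grep -o 'http://[^ ]*\.tar\.gz' | sort -u
```

`grep -o` keeps only the matching URL rather than the whole log line, and `sort -u` collapses the duplicates that a recursive crawl inevitably produces.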
Note that wget writes temporary files in the working directory, and doesn't clean up its temporary directories. If this is a problem, you can change to a temporary directory:
$ (cd /tmp && wget --spider -r --no-parent http://some.served.dir.ca/)
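If you'd rather not leave wget's leftovers in a shared /tmp, a variant using a throwaway directory that gets removed afterwards might look like this (the wget call is commented out because it needs a live server; the URL is a placeholder):

```shell
# Create a private scratch directory, spider from inside it, then
# remove the whole thing so wget's temp files go away with it.
tmpdir=$(mktemp -d)
(
  cd "$tmpdir" || exit 1
  # wget --spider -r --no-parent http://some.served.dir.ca/ 2>&1 | grep '\.tar\.gz'
  echo "spidering from $PWD"
)
rm -rf "$tmpdir"
```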
What you are asking for is best served using FTP, not HTTP.
HTTP has no concept of directory listings; FTP does.
Most HTTP servers do not allow access to directory listings, and those that do are doing so as a feature of the server, not of the HTTP protocol. Those servers choose to generate and send an HTML page intended for human consumption, not machine consumption. You have no control over that, and would have no choice but to parse the HTML.
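To make the parsing point concrete, here is a crude href extraction from a server-generated index page. The HTML below imitates an Apache-style autoindex and the file names are made up; real pages vary from server to server, which is exactly why scraping them is fragile:

```shell
# Fake autoindex page standing in for what a directory-listing-enabled
# HTTP server might send back.
cat > index.html <<'EOF'
<html><body><h1>Index of /pkg</h1>
<a href="../">Parent Directory</a>
<a href="foo-1.0.tar.gz">foo-1.0.tar.gz</a>
<a href="bar-2.1.tar.gz">bar-2.1.tar.gz</a>
</body></html>
EOF
# Pull out just the tarball links from the markup.
grep -o 'href="[^"]*\.tar\.gz"' index.html | sed 's/^href="//; s/"$//'
rm -f index.html
```

This works on this one page layout and breaks on the next; there is no standard format to rely on.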
FTP is designed for machine consumption, more so with the introduction of the MLST and MLSD commands, which replace the ambiguous LIST command.
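For contrast with the HTML scraping above, MLSD replies (RFC 3659) are deliberately machine-parseable: each line is a run of "fact=value;" pairs, then a space, then the file name. The sample lines below are made up, but the parsing is trivial compared to HTML:

```shell
# Sample MLSD-style reply lines (facts, then a space, then the name);
# extract the names of entries whose type fact says "file".
printf '%s\n' \
  'type=file;size=1024;modify=20240101120000; foo-1.0.tar.gz' \
  'type=dir;modify=20240101120000; pkg' \
  'type=file;size=2048;modify=20240102090000; bar-2.1.tar.gz' |
  awk -F'; ' '/type=file/ { print $2 }'
```

Because the format is fixed by the spec, a one-line awk split is enough; no per-server guessing is needed.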