Site to download all USGS Topo Maps in bulk
I don't know if it still works, but a couple of years ago I hacked together a method for downloading the US Seamless National Elevation Dataset in bulk that might work for topo raster maps too.
To bypass the map viewer, put the coordinates in the URL like so:
http://extract.cr.usgs.gov/Website/distreq/RequestSummary.jsp?AL=71.0,56.0,-140.0,-150.0&PL=NAK01HZ
where this parameter gives your region of interest as north, south, east, west in decimal degrees:
AL=71.0,56.0,-140.0,-150.0
and this parameter selects the data set, in this case “National Elevation Dataset Alaska (NED) 2 Arc Second”:
PL=NAK01HZ
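For scripting several regions, the request URL can be assembled programmatically. A minimal sketch, assuming the service still accepts these two parameters (the bounding box and data set code below are just the example values from above):

```python
# Build the RequestSummary.jsp URL for a given bounding box and data set.
north, south = 71.0, 56.0     # region of interest, decimal degrees
east, west = -140.0, -150.0
dataset = 'NAK01HZ'           # NED Alaska 2 Arc Second

base = 'http://extract.cr.usgs.gov/Website/distreq/RequestSummary.jsp'
url = '%s?AL=%s,%s,%s,%s&PL=%s' % (base, north, south, east, west, dataset)
print(url)
# http://extract.cr.usgs.gov/Website/distreq/RequestSummary.jsp?AL=71.0,56.0,-140.0,-150.0&PL=NAK01HZ
```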
If you use this technique, please be gentle. It’s not in our interests to force them to take protective measures and close this avenue.
Also, don't overlook snail mail; it's high latency but has almost limitless bandwidth:
With regard to ordering the regional CDs:
“You can order the entire U.S. in 30 meter resolution in either ArcGrid or GridFloat format. The data will be provided on a 250 GB external drive at a total price of $1005.00. This includes shipping and handling. This will cover the Conterminous U.S., Alaska (at 60 meter res), Hawaii, and the territorial islands. We no longer provide this on CD or DVD media.”
An incredibly good price in my opinion.
The contact address is [email protected]
Customer Service/Webmapping
Data and Applications Support Department
Science Applications International Corporation (SAIC) at
U.S. Geological Survey - EROS (Earth Resources Observation and Science)
47914 252nd Street, Sioux Falls, SD 57198-0001
Phone: 1-800-252-4547
Fax: (605)-594-6589
What about the FREE 24K GeoTIFF DRGs available through the Libre Map Project? All 24K DRGs for all 50 states are available there. They are collared, but I had access to GlobalMapper, which has a nice function that removes collars easily (and surely there are other ways to deal with those). They are filed nicely on the server at the Internet Archive, and the Python script below, in conjunction with wget, fetches tons of 'em quickly and easily:
import os, shutil

# This script fetches raster images (DRGs) from the web using an os.system call to wget
# The directory of states is at: http://libremap.org/data/
#
# Process:
# For each 24K quad in the list, it fetches the tif, tfw, and fgd (metadata)
# files, then moves them to their new home, and moves on to the next quad.
# Data is actually stored at the Internet Archive, with a URI like so:
# http://www.archive.org/download/usgs_drg_ar_35094_a2/o35094a2.tif

wgetDir = 'C:/Program Files/wget/o'
quads = [['32094f5', 'HARLETON', 'TX'],
         ['32094f6', 'ASHLAND', 'TX'],
         ['32094f7', 'GLENWOOD', 'TX'],
         ['32094f8', 'GILMER', 'TX']]
exts = ['tif', 'tfw', 'fgd']
url = 'http://www.archive.org/download/'
home = '//share/Imagery/EastTexas/TOPOs/24k/'

if __name__ == '__main__':
    for quad in quads:
        for ext in exts:
            # Piece together our image/world file URI, so it looks like:
            # http://www.archive.org/download/usgs_drg_ar_35094_a2/o35094a2.tif
            fullurl = (url + 'usgs_drg_' + quad[2].lower() + '_' +
                       quad[0][:5] + '_' + quad[0][5:] + '/o' + quad[0] + '.' + ext)
            # Call wget through os.system
            os.system('wget %s' % fullurl)
            # Move the file to where we want it to live, with a descriptive
            # filename, as in: AR_PRAIRIE_GROVE_o35094h3.tif
            shutil.move(wgetDir + quad[0] + '.' + ext,
                        home + quad[2].upper() + '_' +
                        quad[1].replace(' ', '_') + '_o' + quad[0] + '.' + ext)
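If you don't have wget handy, the same fetch-and-rename can be done with Python's standard library alone. A sketch under the same assumptions as the script above (the quad used in the usage example, PRAIRIE GROVE, is just the one from the filename comment):

```python
import os
import urllib.request  # Python 3 standard library

url = 'http://www.archive.org/download/'
home = '//share/Imagery/EastTexas/TOPOs/24k/'

def drg_paths(quad, ext):
    """Return (source URL, destination path) for one quad/extension pair."""
    quad_id, name, state = quad
    src = (url + 'usgs_drg_' + state.lower() + '_' +
           quad_id[:5] + '_' + quad_id[5:] + '/o' + quad_id + '.' + ext)
    dest = os.path.join(home, state.upper() + '_' +
                        name.replace(' ', '_') + '_o' + quad_id + '.' + ext)
    return src, dest

# Usage (downloads one quad's tif straight to its final name,
# so no separate move step is needed):
#   src, dest = drg_paths(['35094h3', 'PRAIRIE GROVE', 'AR'], 'tif')
#   urllib.request.urlretrieve(src, dest)
```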
I usually build the list of input quads by selecting the quads I want from a 24K topo vector footprint in ArcMap, exporting the records to a dbf (or better yet, directly to Excel with XTools), and then in Excel building the list by concatenating the fields of interest, something like:
="['" & quad_id & "','" & quad_name & "','" & state & "'],"
I then copy the list of lists into my script or an external module and call it from there. Maybe not the most elegant method, but it works nicely. HTH.
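If you'd rather skip the Excel concatenation step, the same list of lists can be built straight from a CSV export of the footprint attributes with the standard csv module. A sketch, assuming a hypothetical export with quad_id, quad_name, and state columns (the inline text below just stands in for the file):

```python
import csv
import io

# Hypothetical CSV export of the 24K footprint attribute table
csv_text = """quad_id,quad_name,state
32094f5,HARLETON,TX
35094h3,PRAIRIE GROVE,AR
"""

quads = [[row['quad_id'], row['quad_name'], row['state']]
         for row in csv.DictReader(io.StringIO(csv_text))]
# With a real file: csv.DictReader(open('quads.csv')) instead of the StringIO
print(quads)
# [['32094f5', 'HARLETON', 'TX'], ['35094h3', 'PRAIRIE GROVE', 'AR']]
```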