How to retry a urllib2 request when it fails?

There are a few libraries out there that specialize in this.

One is backoff, which is designed with a particularly functional sensibility: decorators are passed arbitrary callables that return generators yielding successive delay values. A simple exponential backoff with the delay between retries capped at 32 seconds could be defined as:

import backoff
import urllib2

# Retry on URLError, backing off exponentially with the delay capped at 32 seconds.
@backoff.on_exception(backoff.expo,
                      urllib2.URLError,
                      max_value=32)
def url_open(url):
    return urllib2.urlopen(url)
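
If you also want to give up after a fixed number of attempts, on_exception takes a max_tries argument, and the wait generator is swappable (backoff ships constant and fibo generators alongside expo). A sketch under those assumptions; the function name and values here are just illustrative:

# Give up after 4 attempts, waiting a constant 3 seconds between them.
@backoff.on_exception(backoff.constant,
                      urllib2.URLError,
                      interval=3,
                      max_tries=4)
def url_open_bounded(url):
    return urllib2.urlopen(url)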

Another is retrying, which has very similar functionality but an API where retry parameters are specified via predefined keyword arguments.
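
For example, roughly equivalent behaviour with retrying might look like the sketch below; the keyword names follow that library's conventions and the specific values and function names are illustrative:

from retrying import retry
import urllib2

def is_url_error(exception):
    # Only retry when urllib2 raised a URLError.
    return isinstance(exception, urllib2.URLError)

@retry(retry_on_exception=is_url_error,
       wait_exponential_multiplier=1000,  # delays grow exponentially, in milliseconds
       wait_exponential_max=32000,        # cap each delay at 32 seconds
       stop_max_attempt_number=4)         # give up after 4 attempts
def url_open_retrying(url):
    return urllib2.urlopen(url)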


I would use a retry decorator. There are others out there, but this one works pretty well. Here's how you can use it:

import urllib2

# `retry` is the decorator recipe referred to above.
@retry(urllib2.URLError, tries=4, delay=3, backoff=2)
def urlopen_with_retry():
    return urllib2.urlopen("http://example.com")

This will retry the function whenever URLError is raised. With the parameters above it will try a maximum of 4 times, doubling the backoff delay after each failed attempt: 3 seconds, then 6 seconds, then 12 seconds. The decorator itself is sketched below so you can see what each parameter does.
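
For reference, here is a minimal sketch of what such a decorator can look like. It matches the signature used above, but treat it as a simplified illustration rather than the exact recipe:

import time
from functools import wraps

def retry(ExceptionToCheck, tries=4, delay=3, backoff=2):
    """Retry the decorated callable when ExceptionToCheck is raised,
    sleeping `delay` seconds and multiplying the delay by `backoff`
    after each failure, for up to `tries` total attempts."""
    def deco_retry(f):
        @wraps(f)
        def f_retry(*args, **kwargs):
            mtries, mdelay = tries, delay
            while mtries > 1:
                try:
                    return f(*args, **kwargs)
                except ExceptionToCheck:
                    time.sleep(mdelay)
                    mtries -= 1
                    mdelay *= backoff
            # Final attempt: let any exception propagate to the caller.
            return f(*args, **kwargs)
        return f_retry
    return deco_retry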