python 3.3 urllib.request

Steeve C steevechailloux at
Fri Dec 7 13:52:52 CET 2012


I have a Python 3 script using urllib.request which shows a strange behavior.
Here is the script:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import urllib.request
import urllib.error
import sys, time

url = ''

def make_some_stuff(page, url):
    sys.stderr.write(time.strftime("%d/%m/%Y %H:%M:%S -> page from \"") + url + "\"\n")
    sys.stderr.write(str(page) + "\"\n")
    return True

def get_page(url):
    while True:
        try:
            page = urllib.request.urlopen(url)
            yield page
        except urllib.error.URLError as e:
            sys.stderr.write(time.strftime("%d/%m/%Y %H:%M:%S -> impossible to access \"") + url + "\"\n")

def main():
    print('in main')
    for page in get_page(url):
        make_some_stuff(page, url)

if __name__ == '__main__':
    main()
If the computer is connected to the internet (with an ethernet connection, for
example) and I run this script, it works like a charm:
- urllib.request.urlopen returns the page
- make_some_stuff writes to stderr
- when the ethernet cable is unplugged, the except block handles the error
while the cable is unplugged, and when the cable is plugged back in,
urllib.request.urlopen returns the page and make_some_stuff writes to stderr

This is the normal behavior (for me, imho).

But if the computer is not connected to the internet (ethernet cable unplugged)
when I run this script, the except block handles the error (normal), but when
I plug the cable back in, the script keeps looping and urllib.request.urlopen
never returns the page (it always goes to the except block).

What can I do to handle that?
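One thing worth trying is a sketch along these lines — it passes an explicit per-request timeout to urlopen and sleeps between failed attempts, so the loop is not spinning at full speed while offline. The timeout and retry-delay values here are illustrative assumptions, not from the original script:

```python
import sys
import time
import urllib.request
import urllib.error

def get_page(url, retry_delay=5, timeout=10):
    """Yield responses for url forever, sleeping between failures.

    retry_delay and timeout are illustrative defaults (assumptions),
    tune them for your use case.
    """
    while True:
        try:
            # A per-request timeout keeps a single hung connection
            # from blocking the generator indefinitely.
            yield urllib.request.urlopen(url, timeout=timeout)
        except urllib.error.URLError:
            sys.stderr.write(time.strftime(
                "%d/%m/%Y %H:%M:%S -> impossible to access \"") + url + "\"\n")
            time.sleep(retry_delay)  # avoid busy-looping while offline
```

This does not by itself explain why urlopen never recovers when the script is started offline, but the timeout plus delay at least rules out a tight retry loop as the culprit.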



More information about the Python-list mailing list