urllib bug?

Steve Clift sgclift at earthlink.net
Mon Sep 2 23:23:35 CEST 2002

I'm running Python 2.2.1 on a venerable W98 box. Attempts to use
urllib.urlopen() to open trivial URLs fail with an exception 'no host given'.

From tracing through urllib:

urllib.urlopen() creates an instance of FancyURLopener, a subclass of
URLopener.

To figure out whether I'm using a proxy, urllib.URLopener.__init__() calls
getproxies(), which is an alias for getproxies_registry() under Windows.

getproxies_registry() ferrets around in the registry, using the _winreg
module to extract values for ProxyEnable and ProxyServer.

On this box, regedit lists the value for ProxyEnable as 00 00 00 00, and
the value for ProxyServer as "". _winreg.QueryValueEx() returns
('\x00\x00\x00\x00', 3) for ProxyEnable and (u'', 1) for ProxyServer.
Clearly, I'm not using a proxy. 
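To check that reading, here's a minimal sketch (with the registry values hard-coded as the `(value, type)` pairs quoted above, since I'm not touching the real registry here; in modern Python the REG_BINARY value arrives as bytes) that decodes ProxyEnable as the little-endian DWORD it actually is:

```python
import struct

# Simulated return values of _winreg.QueryValueEx on this box:
# REG_BINARY is registry type 3, REG_SZ is type 1.
proxy_enable = (b'\x00\x00\x00\x00', 3)
proxy_server = (u'', 1)

# Interpret the four REG_BINARY bytes as a little-endian unsigned DWORD.
enable_flag = struct.unpack('<L', proxy_enable[0])[0]

print(enable_flag)     # 0 -- the proxy is disabled
print(proxy_server[0]) # '' -- and no proxy server is configured anyway
```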

urllib tests the truth of '\x00\x00\x00\x00' and, because it's a
non-zero-length string, incorrectly decides I *am* using a proxy and
constructs bogus http and ftp proxy entries that include no host name.
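The faulty test can be demonstrated without Windows at all. The sketch below (my reconstruction of the logic, not urllib's actual source) contrasts the truth-of-the-raw-value check with what I'd expect the check to be, namely decoding the DWORD first:

```python
import struct

proxy_enable_raw = b'\x00\x00\x00\x00'  # as read from the registry

# Buggy check: any non-empty string/bytes object is truthy, so this
# wrongly concludes a proxy is enabled even when the DWORD is zero.
buggy_says_proxy = bool(proxy_enable_raw)

# Correct check: unpack the bytes as a little-endian DWORD, then test.
proxy_enable = struct.unpack('<L', proxy_enable_raw)[0]
correct_says_proxy = bool(proxy_enable)

print(buggy_says_proxy)    # True  -- the bug
print(correct_says_proxy)  # False -- no proxy
```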

The absence of a proxy host name causes urllib to grumble.


a) It's a bug.
b) How come no-one else is complaining about it? What's special about this
box?
