Help Tracing urllib2 Error, Please?

Rob Wolfe rw at
Sun Jul 20 07:30:09 EDT 2008

Larry Hale <larzluv at> writes:

> Since it seems I have a "unique" problem, I wonder if anyone could
> point me in the general/right direction for tracking down the issue
> and resolving it myself.
> See my prior post @
> for more info.  (Python 2.5.2 on Win XP 64 ==>> Squid Proxy requiring
> Authentication ==>> Internet not working.)
> I've looked the urllib2 source over, but am having trouble following
> it.  As previously mentioned, urllib2 initiates the request, Squid
> replies "407 error" that auth's required, and then urllib2 just stops,
> throwing error 407.
> Any thought(s) on what to check out?
> It's frustrating (to say the least) that it seems so many are
> successfully accomplishing this task, and all's working perfectly for
> them, but I'm failing miserably.
> Would any quotes viewed in the HTTP traffic help?  (Wireshark shows
> all!  :)  I don't even know what other info could help.
> Any info to get about Squid's configuration that might make it "non
> standard" in a way that could cause my problem?  Any question(s) I
> should ask my Net Admin to relay info to you all?

Maybe Squid is configured not to accept authentication sent
directly in the URI, or maybe only the digest scheme is allowed.
Try this:

import urllib2

def getopener(proxy=None, digest=False):
    opener = urllib2.build_opener(urllib2.HTTPHandler)
    if proxy:
        # WithDefaultRealm matches whatever realm the proxy sends back
        passwd_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
        passwd_mgr.add_password(None, proxy, 'user', 'password')
        if digest:
            auth_handler = urllib2.ProxyDigestAuthHandler(passwd_mgr)
        else:
            auth_handler = urllib2.ProxyBasicAuthHandler(passwd_mgr)
        proxy_handler = urllib2.ProxyHandler({'http': proxy})
        opener = urllib2.build_opener(proxy_handler, auth_handler,
                                      urllib2.HTTPHandler)
    return opener

def fetchurl(url, opener):
    f = opener.open(url)
    data = f.read()
    f.close()
    return data

# target URL and proxy address below are placeholders; substitute your own
print fetchurl('http://www.python.org', getopener('http://localhost:3128'))
print fetchurl('http://www.python.org', getopener('http://localhost:3128', digest=True))
</print fetchurl>


More information about the Python-list mailing list