[issue6500] urllib2 maximum recursion depth exceeded

simon report at bugs.python.org
Fri Jul 17 10:26:14 CEST 2009


New submission from simon <nkucyd at gmail.com>:

def __getattr__(self, attr):
    # XXX this is a fallback mechanism to guard against these
    # methods getting called in a non-standard order.  this may be
    # too complicated and/or unnecessary.
    # XXX should the __r_XXX attributes be public?
    if attr[:12] == '_Request__r_':
        name = attr[12:]
        if hasattr(Request, 'get_' + name):
            getattr(self, 'get_' + name)()
            return getattr(self, attr)
    raise AttributeError, attr

This can cause "maximum recursion depth exceeded": if the get_<name> method does not actually set the _Request__r_<name> attribute, the final getattr(self, attr) re-enters __getattr__ with the same name, and the cycle never terminates.

>>> import urllib2
>>> req = urllib2.Request('http://www.nbc.com')
>>> req._Request__r_method
RuntimeError: maximum recursion depth exceeded

"return getattr(self, attr)"? should it be removed?

----------
components: Library (Lib)
messages: 90612
nosy: nako521
severity: normal
status: open
title: urllib2 maximum recursion depth exceeded
type: behavior
versions: Python 2.4, Python 2.5, Python 2.6

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue6500>
_______________________________________