[Python-3000] Betas today - I hope
Guido van Rossum
guido at python.org
Thu Jun 12 19:24:38 CEST 2008
>> Whether they'll care about this issue of course depends on whether
>> they need overloaded operators and other special delegations to be
>> delegated transparently. We'll have to see how important this is.
>> New-style classes have been around and recommended for a long time --
>> why haven't people pushed for a proxy class before?
On Thu, Jun 12, 2008 at 7:50 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> There was an easier way to do it in the form of classic classes - the 2.x
> interpreter is riddled with special cases that ensure that __getattr__ is
> always consulted when looking for special methods on a classic class. The
> tracker issue regarding the fact that things aren't so simple with new style
> classes was actually raised way back in 2002 when someone tried to port such
> a class to new-style and discovered that overriding __getattribute__ was no
> longer enough.
And all that time people have been using classic classes as a crutch
rather than developing a future-proof approach? I can't believe that.
I think we should look for some existing proxies so we can identify
best practice and perhaps improve upon it. (Sorry if that's what you did,
I haven't checked.)
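[The lookup change being discussed can be shown in a few lines. This is a
minimal sketch with hypothetical class names: on new-style classes, implicit
special method invocation goes straight to the type and never consults
__getattr__ on the instance, so explicit delegation works while the
operator/builtin form does not.]

```python
class Target:
    def __len__(self):
        return 42

class Proxy:
    def __init__(self, target):
        self._target = target

    def __getattr__(self, name):
        # Called only when normal lookup fails; delegates to the target.
        return getattr(self._target, name)

p = Proxy(Target())
assert p.__len__() == 42   # explicit attribute access is delegated
try:
    len(p)                 # implicit lookup checks type(p) only
except TypeError:
    print("len() bypassed __getattr__")
```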
>>> I've pushed as hard as I'm personally willing to for this without
>>> anyone else agreeing that it's worth doing,
>> What does *it* refer to? Changing the behavior, or adding a proxy
>> class to the stdlib? I'm -1000 on the former, but only -0 on the
>> latter -- as I wrote in the tracker, I just don't want to see an
>> unproven proxy class (and I don't like the module name).
> "It" referred to adding the proxy class - I'm personally ambivalent on
> adding it at this point, because the complexity of it reduces my confidence
> that I got it right,
Also Michael Foord just wrote "I would prefer it if the proxy class
wrapped the return values of inplace operations" -- which sounds like
one reason why it's hard to have a one-size-fits-all proxy class,
since when to wrap and when not to is probably very application-specific.
> but it also makes it seem unfair to users of this
> feature of classic classes to take it away in 3.0 without giving them some
> kind of functional replacement.
> As far as the module name goes, I don't particularly like it either -
> dropping something in the types module instead would be an alternative.
No! Not the doomed types module! :-)
>>> 2. Method lookup MAY bypass __getattribute__, shadowing the attribute in
>>> the instance dictionary MAY have ill effects. (slots such as __enter__ and
>>> __exit__ that are looked up via normal attribute lookup in CPython will
>>> fit into this category)
>> Why even have a MAY category? Are you expecting these to become tp_
>> slots at some point?
> Either tp_* slots, or just having the invocation bypass the instance
> attributes and only look at the object's type.
> I think it would actually be desirable for this category to be empty from a
> purity point of view (with all the special methods in category 1), but given
> that CPython itself currently doesn't conform to such a language spec, this
> seems to be the next best option (i.e. allowing other implementations or
> later versions of CPython to put these special methods in category 1 along
> with the rest of the special methods).
OK, fair point.
>>> 3. Technically a subcategory of group 1, these are special methods which
>>> can affect the interpreter's behaviour by their mere definition on a type.
>>> (The __get__, __set__ and __delete__ descriptor protocol methods fall into
>>> this category)
>> I don't follow why this is relevant. This is a different, AFAIK
>> orthogonal issue, used in many places: *if* an object used in a
>> certain context has a specific attribute, *then* that attribute is
>> used, *otherwise* a default action is taken. Applies to __repr__ just
>> as much. These belong in category 1 if and only if the lookup bypasses
>> the instance dict.
> Actual hasattr() checks aren't a problem - those hit __getattribute__ and a
> delegating class can correctly check them against the target object.
I never said hasattr(). AFAIK what happens sometimes is that
_PyType_Lookup() is called and if it returns NULL the default action is taken.
> Methods like __str__ or __repr__ also aren't a major issue - those are easy to
> delegate in a way that reproduces the same behaviour as if the delegating
> class wasn't there (just reinvoke the appropriate builtin on your target
> object).
> This category is specifically for method checks in category 1 which bypass
> __getattribute__ *and* have significant effects on the way an object gets
> handled that can't be readily dealt with by a value-based delegation class -
> the most significant methods I've actually found in that category so far are
> the descriptor protocol methods (that's why my ProxyMixin class skipped
> delegating them).
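[A sketch of the hazard Nick describes, with hypothetical names: merely
defining __get__ on the proxy type makes every proxy instance a descriptor,
so storing one as a class attribute triggers the descriptor protocol even
when the wrapped value is a plain object with no __get__ of its own.]

```python
class DescProxy:
    # Defining __get__ here makes DescProxy a descriptor type,
    # regardless of what any given instance wraps.
    def __init__(self, target):
        self._target = target

    def __get__(self, obj, objtype=None):
        return self._target.__get__(obj, objtype)

class Holder:
    attr = DescProxy(42)   # 42 is not a descriptor

try:
    Holder().attr          # descriptor protocol fires anyway
except AttributeError:
    print("the proxy was treated as a descriptor")
```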
Ah, now I understand. These are things that you shouldn't define in
the proxy class if the target class doesn't define it. I think I've
seen solutions that dynamically define a proxy class depending on
whether the target class implements a certain special method or not.
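[The dynamic approach Guido mentions - and the reason weakref.proxy is a
factory function - can be sketched roughly like this. A hypothetical
make_proxy() builds the proxy type at runtime and includes __call__ in its
namespace only when the target is itself callable:]

```python
def make_proxy(target):
    # Namespace for the runtime-built proxy type.
    ns = {
        '__init__': lambda self, t=target: object.__setattr__(self, '_target', t),
        '__getattr__': lambda self, name: getattr(self._target, name),
    }
    if callable(target):
        # Only callable targets get a __call__ slot, so callable()
        # gives the right answer for the proxy too.
        ns['__call__'] = lambda self, *args, **kwargs: self._target(*args, **kwargs)
    return type('Proxy', (object,), ns)()

f = make_proxy(lambda: "hi")
assert callable(f) and f() == "hi"
lst = make_proxy([1, 2, 2])
assert not callable(lst) and lst.count(2) == 2
```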
> As long as the callable() builtin is around, __call__ actually lands in this
> category as well (since defining it will affect the answer returned by the
> callable() builtin). Being able to return different proxy classes with and
> without __call__ defined is actually the reason weakref.proxy is a
> factory function rather than a type in its own right.
What I said. :-)
--Guido van Rossum (home page: http://www.python.org/~guido/)