[Python-Dev] Re: Of slots and metaclasses...

Guido van Rossum guido@python.org
Thu, 28 Feb 2002 13:58:48 -0500

[Kevin Jacobs wrote me in private to ask my position on __slots__.
I'm posting my reply here, quoting his full message -- I see no reason
to carry this on as a private conversation.  Sorry, Kevin, if this
wasn't your intention.]

> Hi Guido;
> Now that you are back from your travels, I'll start bugging you, as
> gently as possible, for some insight into your intent wrt slots and
> metaclasses.  As you can read from the python-dev archives, I've
> instigated a fair amount of discussion on the topic, though the
> conversation is almost meaningless without your input.

Hi Kevin, you got me to finally browse the thread "Meta-reflections".
My first response was: "you've got it all wrong."  My second response
was a bit more nuanced: "that's not how I intended it to be at all!"
OK, let me elaborate. :-)

You want to be able to find out which instance attributes are defined
by __slots__, so that (by combining this with the instance's __dict__)
you can obtain the full set of attribute values.  But this defeats the
purpose of unifying built-in types and user-defined classes.

A new-style class, with or without __slots__, should be considered no
different from a new-style built-in type, except that all of the
methods happen to be defined in Python (except maybe for inherited methods).

In order to find all attributes, you should *never* look at __slots__.
You should search the __dict__ of the class and its base classes, in
MRO order, looking for descriptors, and *then* add the keys of the
__dict__ as a special case.  This is how PEP 252 wants it to be.
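
In code, the search looks something like this (a rough sketch in
modern syntax; the helper name is mine, not part of any API):

```python
def all_attribute_names(obj):
    """Collect attribute names by scanning the class's MRO for
    descriptors, then adding the instance __dict__ keys last."""
    names = set()
    for klass in type(obj).__mro__:
        for name, value in vars(klass).items():
            # Anything with a __get__ is a descriptor (PEP 252).
            if hasattr(value, '__get__'):
                names.add(name)
    # The instance __dict__ (if any) is the special case added last.
    names.update(getattr(obj, '__dict__', {}))
    return names

class Point(object):
    __slots__ = ('x', 'y')

p = Point()
p.x = 1
# The slot names show up because slots are ordinary descriptors:
assert {'x', 'y'} <= all_attribute_names(p)
```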

If the descriptors don't tell you everything you need, too bad -- some
types just are like that.  For example, if you're deriving from a list
or tuple, there's no attribute that leads to the items: you have to
use __len__ and __getitem__ to find out about these, and you have to
"know" that that's how you get at them (although the presence of
__getitem__ should be a clue).
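
For instance (a sketch; the Stack name is just for illustration):

```python
class Stack(list):
    """A list subclass; its items live in the list's internal
    storage, not in any attribute."""

s = Stack([1, 2, 3])
assert vars(s) == {}   # nothing in the instance __dict__...
assert len(s) == 3     # ...yet the sequence protocol sees the items
assert s[0] == 1
```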

Why do I reject your suggestion of making __slots__ (more) usable for
introspection?  Because it would create another split between built-in
types and user-defined classes: built-in types don't have __slots__,
so any strategy based on __slots__ will only work for user-defined
types.  And that's exactly what I'm trying to avoid!

You may complain that there are so many things to be found in a
class's __dict__, it's hard to tell which things are descriptors.
Actually, it's easy: if it has a __get__ (method) attribute, it's a
descriptor; if it also has a __set__ attribute, it's a data attribute,
otherwise it's a method.  (Note that read-only data attributes have a
descriptor that has a __set__ method that always raises TypeError or
AttributeError.)

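A sketch of that test (modern syntax; the classify helper is my own
name, not an existing function):

```python
def classify(klass, name):
    """Apply the rule above to one entry of a class __dict__."""
    value = vars(klass)[name]
    if not hasattr(value, '__get__'):
        return 'not a descriptor'
    if hasattr(value, '__set__'):
        return 'data attribute'
    return 'method'

class C(object):
    __slots__ = ('x',)   # creates a data descriptor for 'x'
    def f(self):
        pass             # a plain function: __get__ but no __set__

assert classify(C, 'x') == 'data attribute'
assert classify(C, 'f') == 'method'
```
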
Given this viewpoint, you won't be surprised that I have little desire
to implement your other proposals, in particular, I reject all these:

- Proxy the instance __dict__ with something that makes the slots
  visible

- Flatten slot lists and make them immutable

- Alter vars(obj) to return a dict of all attrs

- Flatten slot inheritance (see below)

- Change descriptors to fall back on class variables for unfilled
  slots

I'll be the first to admit that some details are broken in 2.2.

In particular, the fact that instances of classes with __slots__
appear picklable but lose all their slot values is a bug -- these
should either not be picklable unless you add a __reduce__ method, or
they should be pickled properly.  This is a bug of the same kind as
the problem with pickling time.localtime() (SF bug #496873), so I'm
glad this problem has now been entered in the SF database (as
#520644).  I haven't made up my mind on how to fix this -- it would be
nice if __slots__ would automatically be pickled, but it's tricky
(although I think it's doable -- without ever referencing the
__slots__ variable :-).
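
One workaround you can use in the meantime is to supply __getstate__
and __setstate__ by hand (a sketch, not the automatic fix I'd like to
see; the slot names are spelled out explicitly here):

```python
import pickle

class Point(object):
    __slots__ = ('x', 'y')
    def __getstate__(self):
        # Gather the slot values by hand; there is no __dict__.
        return dict((s, getattr(self, s))
                    for s in ('x', 'y') if hasattr(self, s))
    def __setstate__(self, state):
        for name, value in state.items():
            setattr(self, name, value)

p = Point()
p.x, p.y = 1, 2
q = pickle.loads(pickle.dumps(p))
assert (q.x, q.y) == (1, 2)  # slot values survive the round trip
```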

I'm not so sure that the fact that you can "override" or "hide" slots
defined in a base class should be classified as a bug.  I see it more
as a "don't do that" issue: If you're deriving a class that overrides
a base class slot, you haven't done your homework.  PyChecker could
warn about this though.
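
A sketch of the "don't do that" case (modern syntax):

```python
class Base(object):
    __slots__ = ('x',)

class Derived(Base):
    __slots__ = ('x',)   # hides Base's slot -- don't do that

d = Derived()
d.x = 1
# Two distinct slot descriptors now exist; Base's storage is wasted
# and stays unset even after the assignment above:
assert Derived.x is not Base.x
try:
    Base.__dict__['x'].__get__(d)
    base_slot_set = True
except AttributeError:
    base_slot_set = False
assert not base_slot_set
```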

I think you're mostly right with your proposal "Update standard
library to use new reflection API".  Insofar as there are standard
support classes that use introspection to provide generic services for
classic classes, it would be nice if these could work correctly for
new-style classes even if they use slots or are derived from
non-trivial built-in types like dict or list.  This is a big job, and
I'd love some help.  Adding the right things to the inspect module
(without breaking pydoc :-) would probably be a first priority.

Now let me get to the rest of your letter.

> So I've been sitting on my hands and waiting for you to dive in and
> set us all straight.  Actually, that is not entirely true; I picked
> up a copy of 'Putting Metaclasses to Work' and read it cover to
> cover.

Wow.  That's more than I've ever managed (due to what I hope can still
be called a mild case of ADD :-).  But I think I studied all the
important parts.  (I should ask the authors for a percentage -- I
think they've made quite some sales because of my frequent quoting of
their book. :-)

> Many things you've done in Python 2.2 are much clearer now,
> though new questions have emerged.  I would greatly appreciate it if
> you would answer a few of them at a time.  In return, I will
> synthesize your ideas with my own and compile a document that
> clearly defines and justifies the new Python object model and
> metaclass protocol.

Maybe you can formulate it as a set of tentative clarifying patches to
PEPs 252, 253, and 254?

> To start, there are some fairly broad and overlapping questions to get
> started:
>   1) How much of IBM's SOMobject MetaClass Protocol (SOMMCP) do you
>      want to adapt to Python?  For now (Python 2.2/2.3/2.4 time
>      frame)?  And in the future (Python 3.0/3000)?

Not much more than what I've done so far.  A lot of what they describe
is awfully C++ specific anyway; a lot of the things they struggle with
(such as the redispatch hacks and requestFirstCooperativeMethodCall)
can be done so much simpler in a dynamic language like Python that I
doubt we should follow their examples literally.

>   2) In Python 2.2, what intentional deviations have you chosen from the
>      SOMMCP and what differences are incidental or accidental?

Hard to say, unless you specifically list all the things that you
consider part of the SOMMCP.  Here are some things I know:

- In descrintro.html, I describe a slightly different algorithm for
  calculating the MRO than they use.  But my implementation is theirs
  -- I didn't realize the two were different until it was too late,
  and it only matters in uninteresting corner cases.

- I currently don't complain when there are serious order
  disagreements.  I haven't decided yet whether to make these an error
  (then I'd have to implement an overridable way of defining
  "serious") or whether it's more Pythonic to leave this up to the

- I don't enforce any of their rules about cooperative methods.  This
  is Pythonic: you can be cooperative but you don't have to be.  It
  would also be too incompatible with current practice (I expect few
  people will adopt super().)
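
  For the record, cooperative dispatch looks like this when you do
  opt in (shown with the modern zero-argument spelling; in 2.2 you'd
  write super(Left, self)):

```python
class Base(object):
    def greet(self):
        return ['Base']

class Left(Base):
    def greet(self):
        # Cooperative: delegate to the *next* class in the MRO.
        return ['Left'] + super().greet()

class Right(Base):
    def greet(self):
        return ['Right'] + super().greet()

class Both(Left, Right):
    def greet(self):
        return ['Both'] + super().greet()

# Each method runs exactly once, in MRO order:
assert Both().greet() == ['Both', 'Left', 'Right', 'Base']
```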

- I don't automatically derive a new metaclass if multiple base
  classes have different metaclasses.  Instead, I see if any of the
  metaclasses of the bases is usable (i.e. I don't need to derive one
  anyway), and then use that; instead of deriving a new metaclass, I
  raise an exception.  To fix this, the user can derive a metaclass
  and provide it in the __metaclass__ variable in the class statement.
  I'm not sure whether I should automatically derive metaclasses; I
  haven't got enough experience with this stuff to get a good feel for
  when it's needed.  Since I expect that non-trivial metaclasses are
  often implemented in C, I'm not so comfortable with automatically
  merging multiple metaclasses -- I can't prove to myself that it's
  always safe.
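
  A sketch of both the exception and the manual fix (modern class
  syntax; in 2.2 you'd set the __metaclass__ variable instead):

```python
class MetaA(type): pass
class MetaB(type): pass

class A(metaclass=MetaA): pass
class B(metaclass=MetaB): pass

# Combining incompatible metaclasses raises TypeError
# ("metaclass conflict") rather than auto-deriving a merged one:
try:
    class C(A, B): pass
    raised = False
except TypeError:
    raised = True
assert raised

# The user-supplied fix: derive a combined metaclass explicitly.
class MetaAB(MetaA, MetaB): pass
class D(A, B, metaclass=MetaAB): pass
assert type(D) is MetaAB
```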

- I don't check that a base class doesn't override instance
  variables.  As I stated above, I don't think I should, but I'm not
  100% sure.

>   3) Do you intend to enforce monotonicity for all methods and slots?
>      (Clearly, this is not desirable for instance __dict__ attributes.)

If I understand the concept of monotonicity, no.  Python traditionally
allows you to override methods in ways that are incompatible with the
contract of the base class method, and I don't intend to forbid this.
It would be good if PyChecker checked for accidental mistakes in this
area, and maybe there should be a way to declare that you do want this
enforced; I don't know how though.

There's also the issue that (again, if I remember the concepts right)
there are some semantic requirements that would be really hard to
check at compile time for Python.

>   4) Should descriptors work cooperatively?  i.e., allowing a
>      'super' call within __get__ and __set__.

I don't think so, but I haven't thought through all the consequences
(I'm not sure why you're asking this, and whether it's still a
relevant question after my responses above).  You can do this for
properties though.
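
A sketch of what I mean for properties (using the decorator syntax,
which didn't exist in 2.2):

```python
class Base(object):
    @property
    def value(self):
        return 1

class Derived(Base):
    @property
    def value(self):
        # The derived getter cooperates with the base getter.
        return super().value + 1

assert Derived().value == 2
```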

Thanks for the dialogue!

--Guido van Rossum (home page: http://www.python.org/~guido/)