[Python-Dev] What is the design purpose of metaclasses vs code generating decorators? (was Re: PEP 557: Data Classes)
ncoghlan at gmail.com
Fri Oct 13 02:30:50 EDT 2017
On 13 October 2017 at 04:21, Martin Teichmann <lkb.teichmann at gmail.com> wrote:
> For me, the dataclasses were a typical example for inheritance, to be
> more precise, for metaclasses. I was astonished to see them
> implemented using decorators, and I was not the only one, citing
> > I think it would be useful to write 1-2 sentences about the problem with
> > inheritance -- in that case you pretty much have to use a metaclass, and
> > use of a metaclass makes life harder for people who want to use their own
> > metaclass (since metaclasses don't combine without some manual
> > intervention).
> Python is at a weird point here. At about every new release of Python,
> a new idea shows up that could be easily solved using metaclasses, yet
> every time we hesitate to use them, because of said necessary manual
> intervention for metaclass combination.
Metaclasses currently tend to serve two distinct purposes:
1. Actually altering the runtime behaviour of a class and its children in
non-standard ways (e.g. enums, ABCs, ORMs)
2. Boilerplate reduction in class definitions, reducing the amount of code
you need to write as the author of that class
Nobody has a problem with using metaclasses for the first purpose - that's
what they're for.
It's the second use case where they're problematic, as the fact that
they're preserved on the class becomes a leaky implementation detail, and
the lack of a JIT in CPython means they can also end up being expensive
from a runtime performance perspective.
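That leakage is easy to demonstrate. In the sketch below, `AutoReprMeta` is a made-up metaclass used purely for boilerplate reduction (auto-generating `__repr__`), yet it remains the type of every subclass, whether or not the subclass author wanted it:

```python
# Hypothetical metaclass used only to reduce boilerplate: it injects a
# default __repr__ at class creation time.
class AutoReprMeta(type):
    def __new__(mcls, name, bases, ns):
        cls = super().__new__(mcls, name, bases, ns)
        cls.__repr__ = lambda self: f"<{type(self).__name__}>"
        return cls

class Base(metaclass=AutoReprMeta):
    pass

class Child(Base):  # never asked for the metaclass...
    pass

# ...but the implementation detail is preserved on the class object:
print(type(Child) is AutoReprMeta)  # True
```

Even though `Child` only needed the generated method, it is now permanently an instance of `AutoReprMeta`, which is exactly the combination problem described above.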
Mixin classes have the same problem: something that the author may want to
handle as an internal implementation detail leaks through to the runtime
state of the class object.
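A minimal illustration of the same leak with a mixin (the class names here are invented for the example):

```python
# A mixin intended purely as an implementation convenience...
class ComparableMixin:
    def __le__(self, other):
        return self < other or self == other

class Point(ComparableMixin):
    def __init__(self, x):
        self.x = x
    def __lt__(self, other):
        return self.x < other.x
    def __eq__(self, other):
        return self.x == other.x

# ...still shows up in the runtime state of the class:
print(ComparableMixin in Point.__mro__)  # True
```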
Code generating decorators like functools.total_ordering and
dataclasses.dataclass (aka attr.s) instead aim at the boilerplate reduction
problem directly: they let you declare in the class body the parts that you
need to specify as the class designer, and then fill in at class definition
time the parts that can be inferred from that base.
If all you have access to is the runtime class, it behaves almost exactly
as if you had written out all the autogenerated methods by hand (there may
be subtle differences in the method metadata, such as the values of
`__qualname__` and `__globals__`).
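`functools.total_ordering` shows the pattern concretely: the decorator fills in the missing comparison methods at class definition time, and the resulting class carries no trace of the machinery in its metaclass:

```python
from functools import total_ordering

# The class author supplies only __eq__ and __lt__; the decorator
# generates __le__, __gt__ and __ge__ at definition time.
@total_ordering
class Version:
    def __init__(self, major):
        self.major = major
    def __eq__(self, other):
        return self.major == other.major
    def __lt__(self, other):
        return self.major < other.major

print(type(Version) is type)     # True - the metaclass is plain old type
print(Version(1) <= Version(2))  # True - __le__ was generated for us
```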
Such decorators also do more work at class definition time in order to
reduce the amount of runtime overhead introduced by reliance on chained
method calls in a non-JITted Python runtime.
As such, the code generating decorators have a clear domain of
applicability: boilerplate reduction for class definitions without
impacting the way instances behave (other than attribute and method
injection), and without implicitly impacting subclass definitions (other
than through regular inheritance behaviour).
As far as the dataclass interaction with `__slots__` goes, that's a problem
largely specific to slots (and `__metaclass__` before it), in that they're
the only characteristics of a class definition that affect how CPython
allocates memory for the class object itself (the descriptors for the slots
are stored as a pointer array after the class struct, rather than only in
the class dict).
Given PEP 526 variable annotations, __slots__ could potentially benefit
from a __metaclass__ style makeover, allowing an "infer_slots=True" keyword
argument to type.__new__ to request that the list of slots be inferred from
__annotations__. (Slot inference would conflict with setting class level
default values, but that's a real conflict, as you'd be trying to use the
same name on the class object for both the slot descriptor and the default
value.)
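The "infer_slots=True" keyword is hypothetical, but the idea can be sketched today as a decorator. Note that it has to create a *new* class, since __slots__ only takes effect at class creation time (which is exactly why slots are awkward for decorators like dataclass):

```python
# Sketch only: infer __slots__ from PEP 526 annotations. The decorator
# name and approach are illustrative, not an existing API.
def infer_slots(cls):
    ns = dict(vars(cls))
    # Drop the descriptors that exist only because the original class
    # had no __slots__; they'd conflict with the new layout.
    ns.pop("__dict__", None)
    ns.pop("__weakref__", None)
    ns["__slots__"] = tuple(getattr(cls, "__annotations__", {}))
    # Rebuild the class so the slot descriptors are allocated properly.
    return type(cls)(cls.__name__, cls.__bases__, ns)

@infer_slots
class Point:
    x: int
    y: int

p = Point()
p.x = 1      # fine: x is a slot
# p.z = 3 would raise AttributeError - only x and y were inferred
```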
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia