On 21.10.2015 05:36, Guido van Rossum wrote:
Right. Chris's thinking recalls the reason why inheritance at some point became popular (I guess it was in the '90s). Steven explains (in the part that I've cut) why many experts have soured on it quite a bit.
Yep. It's because experience shows that the usefulness is limited when it comes to orthogonal aspects.
Furthermore, I can remember changing an algorithm's design **because** of the lack of such a built-in data structure. That's always a bad sign.
Personally, I happen to think that inheritance is often useful when a number of classes are designed together (typically all at once and belonging to the same library) and also occasionally when a base class is explicitly and carefully designed to be inherited (there are beautiful things you can do with the template pattern, for example). But inheriting from an implementation that wasn't designed with your use case in mind is often asking for trouble -- if not now, then in the future when a refactored implementation is released.
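The template pattern mentioned above can be sketched briefly. This is a hypothetical example (the class names `Report` and `SalesReport` are made up for illustration): the base class fixes the overall algorithm and explicitly marks the steps that subclasses are meant to fill in.

```python
from abc import ABC, abstractmethod

class Report(ABC):
    """A base class deliberately designed to be inherited from."""

    def render(self):
        # The "template method": the skeleton of the algorithm is
        # fixed here and is not meant to be overridden.
        return f"{self.header()}\n{self.body()}"

    def header(self):
        # A hook with a sensible default; subclasses may override it.
        return "REPORT"

    @abstractmethod
    def body(self):
        # Subclasses *must* provide this step.
        ...

class SalesReport(Report):
    def body(self):
        return "sales: 42"

print(SalesReport().render())
```

The point is that the inheritance contract is explicit: `render()` documents which hooks it calls, so subclasses know exactly what they may override.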
That's a pretty good observation.
You might interject that that's the fault of the refactored implementation -- they didn't properly think about interface compatibility. But while it's usually easy enough to keep an interface compatible when it's just the user calling methods on the implementation, the "interface" presented by subclassing is much more complex -- you would have to specify exactly which method's implementation calls which other method, and you'd also have to ensure that the object is in a sane state when it calls that other method, because it *could* be the case that the latter is overridden by a subclass. It's terribly fragile, and better avoided.
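This fragility can be shown with a small made-up example (the names `Container` and `CountingContainer` are hypothetical): the subclass works only because of an undocumented detail of the base class, namely that `add_many()` happens to route through `add()`.

```python
class Container:
    def __init__(self):
        self._items = []

    def add(self, item):
        self._items.append(item)

    def add_many(self, items):
        # Implementation detail: this happens to call self.add()
        # for each item. Nothing in the interface promises that.
        for item in items:
            self.add(item)

class CountingContainer(Container):
    """Counts additions by overriding add() -- silently relying on
    add_many() calling self.add()."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def add(self, item):
        self.count += 1
        super().add(item)

c = CountingContainer()
c.add_many([1, 2, 3])
print(c.count)  # 3 -- but only because of an implementation detail
```

If a later refactoring makes `add_many()` append directly to `self._items` for speed, the base class's documented behavior is unchanged, yet `CountingContainer` silently stops counting. That is the hidden subclassing "interface" at work.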
Maybe that's one reason why people hesitate to write their own OrderedDefaultDict or DefaultOrderedDict. It's just not their responsibility.
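For what it's worth, such a class is short to write -- the usual recipe subclasses OrderedDict and adds defaultdict's `__missing__` hook. This is a sketch, not a complete implementation (copy and pickle support are omitted), and it inherits exactly the risks discussed above: it depends on `dict.__getitem__` continuing to call `__missing__`.

```python
from collections import OrderedDict

class DefaultOrderedDict(OrderedDict):
    """An OrderedDict that fills in missing keys via a factory,
    like collections.defaultdict (minimal sketch)."""

    def __init__(self, default_factory=None, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.default_factory = default_factory

    def __missing__(self, key):
        # Called by dict.__getitem__ when the key is absent.
        if self.default_factory is None:
            raise KeyError(key)
        self[key] = value = self.default_factory()
        return value

d = DefaultOrderedDict(list)
d["b"].append(1)
d["a"].append(2)
print(list(d))  # keys in insertion order: ['b', 'a']
```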