
If we don't have the separate interface concept, the language just isn't as expressive. We would have to establish a convention that sacrifices one of:

(a) being able to inherit from a class just for implementation purposes, or
(b) being able to reason about interfaces using isinstance().

(a) is error-prone, because the language wouldn't prevent anyone from making the mistake. (b) is unfortunate, because we'd have interfaces but no formal way to reason about them.
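To make the conflation concrete, here is a minimal sketch (the Stack/Queue names are hypothetical, chosen just for illustration): a class inherits from a concrete implementation purely to reuse its code, yet isinstance() still reports it as satisfying the interface, even though it breaks the interface's invariant.

```python
class Stack:
    """Intended as an interface: push/pop with LIFO behavior."""
    def push(self, item): raise NotImplementedError
    def pop(self): raise NotImplementedError

class ListStack(Stack):
    """A concrete implementation of the Stack interface."""
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        return self._items.pop()

class Queue(ListStack):
    """Inherits from ListStack only to reuse its storage code,
    but deliberately breaks the LIFO invariant (it is FIFO)."""
    def pop(self):
        return self._items.pop(0)

q = Queue()
q.push(1)
q.push(2)
print(isinstance(q, Stack))  # True -- yet q violates Stack's LIFO invariant
print(q.pop())               # 1 (FIFO order), where a Stack would give 2
```

Under convention (a) the Queue author made a mistake the language can't catch; under convention (b) there would be no isinstance() test to get wrong in the first place.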
So the point is that it's possible to have a class D that picks up interface I somewhere in its inheritance chain, by inheriting from a class C that implements I, where D doesn't actually satisfy the invariants of I (or of C, probably). I can see that that is a useful feature. But it shouldn't have to preclude us from using inheritance for interfaces, if there were a way to "shut off" inheritance as far as isinstance() (or issubclass()) testing is concerned. C++ does this using private inheritance. Maybe we can add a similar convention to Python for denying inheritance from a given class or interface.

Why do I keep arguing for inheritance? (a) The need to deny inheritance from an interface, while essential, is relatively rare IMO, and in *most* cases the inheritance rules work just fine; (b) having two separate but similar mechanisms makes the language larger.

For example, if we ever are going to add argument type declarations to Python, it will probably look like this:

    def foo(a: classA, b: classB):
        ...body...

It would be convenient if this could be *defined* as

    assert isinstance(a, classA) and isinstance(b, classB)

so that programs that have a simple class hierarchy can use their classes directly as argument types, without having to go through the trouble of declaring a parallel set of interfaces.

I also think that it should be possible to come up with a set of standard "abstract" classes representing concepts like number, sequence, etc., in which the standard built-in types are nicely embedded.

--Guido van Rossum (home page: http://www.python.org/~guido/)
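[Editorial postscript: the argument-type idea sketched above can already be emulated today. The @typechecked decorator below is a hypothetical illustration, not part of the proposal; it reads a function's annotations and enforces exactly the suggested definition, an isinstance() assert per declared argument.]

```python
import functools
import inspect

def typechecked(func):
    """Hypothetical decorator: treat each argument annotation as
    shorthand for `assert isinstance(arg, annotation)`."""
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            ann = sig.parameters[name].annotation
            if ann is not inspect.Parameter.empty:
                # The proposed *definition* of the declaration:
                assert isinstance(value, ann), (
                    f"{name}={value!r} is not an instance of {ann.__name__}")
        return func(*args, **kwargs)

    return wrapper

@typechecked
def foo(a: int, b: str):
    return f"{a}:{b}"

print(foo(3, "x"))   # "3:x"
# foo("bad", "x")    # would raise AssertionError
```

Note how a plain class hierarchy suffices here: int and str are used directly as argument types, with no parallel set of interface declarations.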