
Dec. 2, 2003
8:14 p.m.
I'm not sure about that yet. I'd *like* to find a hack that lets the int type change representations, but the fact is that it's much easier to use different types to indicate different representations.
Aren't integers immutable? If so, it doesn't make sense for them to change representation, since they never change value. Anyway, if you want to use type to encode representation, I would think the various integer types should be related by inheritance. Since a long can always substitute for an int, at least in theory, long should be derived from int; then isinstance(42L, int) would yield True. If the integer types are related this way, the Liskov substitution principle (LSP) implies that converting a long to a string should not put an L at the end.
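Roughly what I have in mind, as a toy sketch in plain Python (the real int and long types live in C, so the Long class below is just a hypothetical stand-in, with __repr__ and __str__ overridden only to mimic the Python 2 behavior under discussion):

    # Hypothetical: a long-like type derived from int.
    class Long(int):
        """Toy arbitrary-precision integer that subclasses int."""

        def __repr__(self):
            # Python 2's repr() of a long ended in 'L'; kept for illustration.
            return "%dL" % int(self)

        def __str__(self):
            # The LSP point: plain string conversion matches int's, no 'L'.
            return "%d" % int(self)

    n = Long(42)

    # Substitutability: a Long satisfies every isinstance() check an int would,
    # so code written against int keeps working when handed a Long.
    assert isinstance(n, int)

    # str() does not append an 'L'; only repr() does.
    assert str(n) == "42"
    assert repr(n) == "42L"

    # Ordinary int-consuming code accepts a Long without special cases.
    assert n * 2 == 84

The point of the sketch is just the shape of the hierarchy: anything that accepts an int keeps working when handed a long, which is what makes the trailing L in str() look like a Liskov violation.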