
While working on a bug in the issue tracker, I came across something that I thought was a little odd around the behaviour of IntEnum. Should the behaviour of an instance of an IntEnum not be symmetric to an int where possible? For example:
>>> class MyEnum(IntEnum):
...     FOO = 1
...
>>> MyEnum.FOO == 1
True
>>> MyEnum.FOO * 3 == 3
True
>>> str(MyEnum.FOO) == str(1)
False
In my mind, the string representation here should be “1” and not the label. Was this simply an oversight of the specialized IntEnum implementation, or was there a concrete reason for this that I’m not seeing?

On Fri Feb 20 2015 at 11:39:11 AM Demian Brecht <demianbrecht@gmail.com> wrote:
> While working on a bug in the issue tracker, I came across something that I thought was a little odd around the behaviour of IntEnum. Should the behaviour of an instance of an IntEnum not be symmetric to an int where possible? For example:
>
> >>> class MyEnum(IntEnum):
> ...     FOO = 1
> ...
> >>> MyEnum.FOO == 1
> True
> >>> MyEnum.FOO * 3 == 3
> True
> >>> str(MyEnum.FOO) == str(1)
> False
>
> In my mind, the string representation here should be “1” and not the label. Was this simply an oversight of the specialized IntEnum implementation, or was there a concrete reason for this that I’m not seeing?
Concrete reason. The string is 'MyEnum.FOO', which is much more readable and makes it obvious where the value came from. The fact that it can be treated as an int is for the same reason True and False are subclasses of int: it made practical sense for compatibility with what they typically replaced, but where it made more sense to diverge and introduce new behaviour, we did so.
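[Editor's note: for anyone who does want the numeric string, here is a minimal sketch using only the stdlib enum module. The str(member.value) calls and the NumericEnum/MyNumericEnum names are illustrative additions, not something prescribed in this thread.]

from enum import IntEnum

class MyEnum(IntEnum):
    FOO = 1

# The numeric string is always available explicitly:
print(str(MyEnum.FOO.value))   # "1"
print(str(int(MyEnum.FOO)))    # "1"

# Or, if a subclass should always stringify numerically, override
# __str__ on a member-less base (hypothetical names, for illustration):
class NumericEnum(IntEnum):
    def __str__(self):
        return str(self.value)

class MyNumericEnum(NumericEnum):
    BAR = 2

print(str(MyNumericEnum.BAR))  # "2"
print(MyNumericEnum.BAR == 2)  # True -- int behaviour is unchanged

The override only changes the string form; comparisons, arithmetic and repr() keep working as before, so the readability trade-off described above is the only thing being swapped.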
participants (2):
- Brett Cannon
- Demian Brecht