On Wed, Jun 1, 2016 at 9:37 AM Sven R. Kunze wrote:
>> more flexible than a monolithic switch-case
>
> That wasn't my initial intention, but I guess that's a side-effect of needing to break it down into pieces viable in current syntax. Thanks a lot, Michael.
You're welcome. I hope the effort informs the discussion of a special matching syntax. Some things really stood out as tough to implement, or not possible at all:

1. Avoiding NameErrors when specifying identifiers to bind in a schema. I see two options to implement this in current syntax. I chose to require a binding object, so that all identifiers must be attributes of that binding object. Another option would be to postpone the lookup of the identifiers by requiring the schema to be defined in a function, then doing some magic.

2. Matching an object and unpacking some of its attributes. If the schema is a fully-specified object, we can rely on its __eq__ to do the match. If the schema specifies only the type and does not unpack attributes, that's even easier. The tough part is a schema with a half-specified object, where its __eq__ cannot be relied on. If all Unbounds compared equal to everything, a half-specified schema would compare well, but that breaks all sorts of other code, such as ``list.index``. I chose to compare all public, non-Unbound attributes, but that may not agree with the object's own __eq__. Alternatively, one could trace __eq__ execution and revise Unbound comparisons, but that's beyond my wizarding abilities.

3. Unpacking the attributes of an object that aggressively type-checks its input. Given my solution of a Binding object and unpacking into its attributes, one cannot, for example, unpack the first argument of a ``range``, as it raises a TypeError: ``bind = Binding(); schema = range(bind.x)``. I'm considering removing object-unpacking from the module. To match and unpack an object, the schema could instead be specified as a mapping or sequence, and the user would transform the object into a mapping (like ``obj.__dict__``) before doing the match.
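To make points 1 and 2 concrete, here is a minimal sketch of the Binding/Unbound idea and of matching a half-specified object by comparing its public, non-Unbound attributes. The names (``Binding``, ``Unbound``, ``match_object``) are illustrative, not the module's actual API:

```python
class Unbound:
    """Placeholder created by a Binding; filled in on a successful match."""
    def __init__(self, name):
        self.name = name
        self.value = None

class Binding:
    """Any attribute access yields a (cached) Unbound, so no NameError."""
    def __getattr__(self, name):
        slot = Unbound(name)
        setattr(self, name, slot)  # cache it so bind.x is always the same slot
        return slot

def match_object(schema, obj):
    """Compare all public, non-Unbound attributes of schema against obj,
    binding the Unbound ones. This sidesteps schema.__eq__, which cannot
    be trusted for a half-specified object."""
    for name, expected in vars(schema).items():
        if name.startswith("_"):
            continue
        if not hasattr(obj, name):
            return False
        actual = getattr(obj, name)
        if isinstance(expected, Unbound):
            expected.value = actual  # unpack into the binding
        elif expected != actual:
            return False
    return True

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

bind = Binding()
schema = Point(bind.x, 0)          # half-specified: y must be 0, x is unpacked
ok = match_object(schema, Point(42, 0))
# ok is True and bind.x.value == 42
```

As the mail notes, this attribute-by-attribute comparison is only an approximation: a class whose __eq__ encodes different semantics would not be matched the way its author intended.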
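Point 3 can be demonstrated in isolation: ``range`` type-checks its arguments, so a placeholder object cannot stand in for them, and the suggested workaround is to match against a mapping built from the object instead. The ``Slot`` class and the dict-based matching here are illustrative stand-ins, not the module's API (note that ``range`` has no ``__dict__``, so the mapping is built explicitly):

```python
class Slot:
    """Minimal stand-in for an unbound identifier placeholder."""

# range() rejects the placeholder outright, so it cannot appear in a schema.
try:
    range(Slot())
    raised = False
except TypeError:
    raised = True

# Workaround: convert the object to a mapping first, then match the mapping.
r = range(3, 10)
observed = {"start": r.start, "stop": r.stop, "step": r.step}
schema = {"stop": 10}  # fully-specified parts compare with ==
matched = all(observed[k] == v for k, v in schema.items())
bound = {k: v for k, v in observed.items() if k not in schema}  # the rest unpacks
# raised is True, matched is True, bound == {"start": 3, "step": 1}
```

Dropping object-unpacking in favor of mapping/sequence schemas keeps all the type-checking on the user's side of the transformation, which is exactly what the proposal above aims for.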