[Python-ideas] Symbolic expressions (or: partials and closures from the inside out)

Nathan Rice nathan.alexander.rice at gmail.com
Fri Jan 13 07:24:35 CET 2012


>> I have been writing a lot of code lately that involves creating
>> symbolic expressions of one form or another, which are then fully
>> evaluated at a later time.  Examples of this include Elementwise,
>
>
> Python is designed for concrete, rather than symbolic computation. But the
> latter has been done.

Being able to create abstract expressions that are later realized is
super useful and neat.  You can do this decently right now, but life
would be better if you didn't have to jump through so many hoops.
Having symbolic variables override *anything* would also make lambdas
obsolete and greatly increase the potential for lazy evaluation.

> 0: rewrite the text -- a bit awkward in Python.
>
> action = compile("read({this})".format(this=<book>), 'xxx', 'eval')

Yeah, that is something I'd expect to see in Perl code :)

> 1. default args -- has to be done when the function is defined.
>
> def action(this = <book>): read(this)
>
> 2. closures (nested functions) -- also requires a planned-ahead definition.
>
> def make_read(x):
>     return lambda: read(x)
> action = make_read(<book>)

I use this extensively in Elementwise.

> 3. bound methods -- only works for classes with methods.
>
> class Book:
>    def read(self): pass
> action = Book(<book>).read
>
> 4. partial binding of function params -- generalizes bound methods; works
> for any function and argument.
>
> from functools import partial
> action = partial(read, <book>)
>
>> if I want to include isinstance(X, someclass) in a symbolic expression,
>
>
> Consider using partial, which can totally bind all needed args *now* for
> *later* action.

The issue isn't so much that I *can't* do things as they are more
trouble than they should be, and it makes the end user interface for
the stuff I write less elegant.

For example, if I want to work with a symbolic object but include a
function that is not well behaved, or if I am creating a constraint
on the result of a function applied to a symbolic object, I have to
know ahead of time everything I want to do, so I can wrap the whole
expression in a lambda.  Once I wrap it, the nice generative chaining
property disappears and I'm stuck with a callable.
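A small sketch of what "losing the chaining" means (hypothetical names, not the real Elementwise interface): each method on the generative wrapper returns a new wrapper, but the moment an ad-hoc step forces a lambda, the result is a plain function and the chain is over.

```python
class Chain:
    """Hypothetical generative wrapper: each method returns a new Chain."""
    def __init__(self, ops=()):
        self._ops = tuple(ops)

    def strip(self):
        return Chain(self._ops + (str.strip,))

    def upper(self):
        return Chain(self._ops + (str.upper,))

    def __call__(self, value):
        for op in self._ops:
            value = op(value)
        return value

expr = Chain().strip().upper()      # chainable at every step
print(expr("  hi  "))               # → HI

wrapped = lambda s: expr(s)[::-1]   # an ill-behaved step forces a lambda...
# wrapped.upper()                   # ...and chaining is gone: plain
#                                   # functions have no .upper()
```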

>>>> from functools import partial
>>>> t = partial(isinstance, 1, int)
>>>> t()
> True
>>>> f = partial(isinstance, 1, float)
>>>> f()
> False
>
>
>> I have to wrap it in a lambda
>
>
> Are you sure that partial will not work for you? Since partial is written in
> Python, you can grab and adjust the code to your needs. It amounts to adding
> default args after the fact by using a generic closure.

In some cases it would, in some cases it wouldn't.  Since I basically
never do */** expansion on wrappers, lambdas tend to be my go-to more
often.

>> (or use .apply(), in the case of
>>
>> Elementwise).  This is not a huge deal for me, but it forces me to
>> create wrappers for lots of functions (e.g. isinstance_(X, someclass))
>> and/or have users wrap every such function they want to use in a
>> symbolic expression.   Having to do this also bloats the code a lot;
>> the github version of Elementwise is over 3,000 LoC at this point
>> (including prodigious documentation, but still...).
>>
>> 2.  Python expects that certain functions, such as int(), str(), etc,
>> will have a specific return type.  While in general I agree with this,
>
>
> People expect that class constructors produce an instance of the class. It
> is surprising when one does otherwise ;-). Builtin classes like int, bool,
> and str are classes just like ones you write.

Treating type/str/int/etc. as types is only semi-coherent, since the
language really treats them more like functions.  They are described
that way all over the docs as well.

From the data model page:

"object.__str__(self)
Called by the str() built-in function and by the..."

"object.__nonzero__(self)
Called to implement truth value testing and the built-in operation bool()"

"object.__complex__(self)
object.__int__(self)
object.__long__(self)
object.__float__(self)
Called to implement the built-in functions complex(), int(), long(),
and float(). Should return a value of the appropriate type."

So clearly this is an area that needs some polish :)

>> X = SymbolicObject()
>>
>> const = Constraints(X * 2 + 1>= 5, X % 2 != 0)
>> const2 = Constraints(X[-1] == "h")
>> const3 = Constraints(X[-1].upper() == "H")
>
>
> Using a special class is a standard way to delay concrete execution.

Standard, and currently a pain in the butt, starting from the fact
that operators don't hook into __getattribute__ and getting
progressively worse from there.

>>>>> print isinstance(3, const)
>>
>> True
>
>
> A Constraints instance defines a set. 'const' is the set 'odd_ge_3'.
> It would look better if you used standard syntax and did the inclusion
> check in a __contains__ method.

Used standard syntax?  Can you elaborate please?

Also, a set is one of many things a Constraints instance could
logically be represented with, as well as a discontinuous interval, a
class in the colloquial sense, etc.  The nice thing about
__instancecheck__ is that every possible constraint reduces to a type
check.  Of course, you could rephrase all type checking in terms of
set membership easily, but I don't think it is quite as intuitive to
most folks.  Your preference has been noted though.
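For readers following along, here is a minimal sketch of the `__instancecheck__` approach under discussion (my own hypothetical reconstruction, not Elementwise's actual code): a factory builds a class whose metaclass evaluates the stored predicates, so every constraint reduces to an `isinstance` check.

```python
def Constraints(*preds):
    """Hypothetical sketch: build a class whose metaclass runs predicates
    in __instancecheck__, so isinstance() performs the constraint check."""
    class Meta(type):
        def __instancecheck__(cls, obj):
            try:
                return all(p(obj) for p in preds)
            except TypeError:
                # Objects the predicates can't handle simply fail the check.
                return False
    return Meta("Constraint", (), {})

odd_ge_3 = Constraints(lambda x: x * 2 + 1 >= 5, lambda x: x % 2 != 0)
print(isinstance(3, odd_ge_3))   # → True
print(isinstance(4, odd_ge_3))   # → False
```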

>> As I mentioned in the first paragraph, Constraints is a metaclass,
>
>
> This is your first mention, actually.

Feeling feisty?  I'm actually curious to see what sort of rationale you
come up with to back up that statement; should be interesting :)

>
>> so your validations are checked using __instancecheck__.
>
> But it is a fake check in that 3 is not really an instance of the class,
> which has no instances. It is hard for me to believe that you cannot put the
> same constraint data in an instance of Constraint as a class and the
> inclusion logic in __contains__. If you customize __instancecheck__ for each
> instance of Constraints, then write

It is a fake check, like any abstract base class instancecheck is a fake check.

>     def __contains__(self, ob): return self._check(ob)
>
> where _check does the same as your current __instancecheck__
>
> Even if you *have* to make Constraints a metaclass, for other reasons, I
> believe you could still give it the same __contains__ method. A metaclass
> *is* a class, and if its class instances represent collections, inclusion in
> the collection should be tested in the standard way.

It could be any sort of callable.  __instancecheck__ is the only
reason it is a metaclass.  Otherwise, I probably wouldn't bother with
classes at all; returning a check inner function with constraints in
the closure would be easy.
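The closure-based alternative mentioned above would look roughly like this (a sketch with hypothetical names): the predicates live in the closure and the check is just a plain function call, with no classes involved.

```python
def make_checker(*preds):
    """Hypothetical sketch: constraints captured in a closure,
    returned as a plain check function."""
    def check(obj):
        try:
            return all(p(obj) for p in preds)
        except TypeError:
            return False
    return check

is_odd_ge_3 = make_checker(lambda x: x * 2 + 1 >= 5, lambda x: x % 2 != 0)
print(is_odd_ge_3(3))   # → True
print(is_odd_ge_3(2))   # → False
```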

Cheers,

Nathan
