Delayed evaluation and setdefault()

Peter Otten __peter__ at web.de
Mon Jan 19 21:03:51 CET 2004


Leo Breebaart wrote:

> Hi all,
> 
> I have a question about Python and delayed evaluation.
> 
> Short-circuiting of Boolean expressions implies that in:
> 
>    >>> if a() and b():
> 
> any possible side-effects the call to b() might have will not
> happen if a() returns false, because then b() will never be
> executed.
> 
> However, if I go looking for dictionary keys like this:
> 
>    >>> d = {}
>    >>> d.setdefault('foo', b())
> 
> then b() *does* get executed regardless of whether the value it
> returns is actually used ('foo' was not found) or discarded
> ('foo' was found).
> 
> If b() has side-effects, this may not be what you want at all. It
> would seem to me not too exotic to expect Python to have some
> sort of construct or syntax for dealing with this, just as it
> supports Boolean short-circuiting.
> 
> So I guess my question is twofold: one, do people think
> differently, and if so why?; and two: is there any elegant (or
> other) way in which I can achieve the delayed evaluation I desire
> for setdefault, given a side-effect-having b()?
> 

Lazy evaluation is dangerous, as the state of mutable objects may have
changed in the meantime. In your case

if "foo" not in d:
    d["foo"] = b()

should do.
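To make the contrast concrete, here is a minimal sketch (b() and the
calls list are illustrative stand-ins): the guarded assignment runs b()
only when the key is missing, unlike setdefault(), which evaluates its
argument unconditionally.

```python
calls = []

def b():
    # stand-in for an expensive computation with side effects
    calls.append("called")
    return 42

d = {}
if "foo" not in d:   # key absent: b() runs
    d["foo"] = b()
if "foo" not in d:   # key now present: b() is skipped
    d["foo"] = b()
```

After both statements, b() has been invoked exactly once.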

However, if you absolutely want it, here's a slightly more general approach:

<defer.py>
class Defer:
    def __init__(self, fun, *args):
        self.fun = fun
        self.args = args
    def __call__(self):
        """ Calculate a deferred function the first time it is
            invoked, return the stored result for subsequent calls
        """
        try:
            return self.value
        except AttributeError:
            self.value = self.fun(*self.args)
            return self.value

def defer(fun, *args):
    """ Use, e. g., defer(average, 1, 2) to delay the calculation
        of the average until the first time undefer() is called
        with the Defer instance.
    """
    return Defer(fun, *args)

def undefer(value):
    """ Check if value is deferred, if so calculate it, otherwise
        return it unchanged
    """
    if isinstance(value, Defer):
        return value()
    return value

#example usage:

class Dict(dict):
    def setdefault(self, key, value):
        try:
            return self[key]
        except KeyError:
            # it might make sense to further delay the
            # calculation until __getitem__()
            self[key] = value = undefer(value)
            return value

def b(value):
    print("side effect for", value)
    return value
dv = defer(b, "used three times")

d = Dict()
d.setdefault("x", defer(b, "a1"))
d.setdefault("x", defer(b, "a2"))
d.setdefault("y", "b")
d.setdefault("a", dv)
d.setdefault("b", dv)
d.setdefault("c", dv)
print(",\n".join(repr(d).split(",")))
assert d["a"] is d["b"]

</defer.py>
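As the comment in setdefault() above hints, the computation can be
pushed all the way into __getitem__(), so that a deferred value is only
resolved when it is actually looked up. A sketch along the same lines
(LazyDict is my own illustrative name, not a standard-library class):

```python
class Defer:
    """Memoizing wrapper: calls fun(*args) once, on first use."""
    def __init__(self, fun, *args):
        self.fun = fun
        self.args = args
    def __call__(self):
        try:
            return self.value
        except AttributeError:
            self.value = self.fun(*self.args)
            return self.value

class LazyDict(dict):
    """Resolves Defer values only when they are actually looked up."""
    def __getitem__(self, key):
        value = dict.__getitem__(self, key)  # raises KeyError if absent
        if isinstance(value, Defer):
            value = value()                      # compute now, once
            dict.__setitem__(self, key, value)   # cache the result
        return value

calls = []
def b(value):
    calls.append(value)   # record the side effect
    return value

d = LazyDict()
d["x"] = Defer(b, "a1")   # nothing computed yet
d["y"] = "plain"
```

Note that repr() and iteration still see the raw Defer objects; only
indexing triggers the calculation.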

You will still have to bother with undeferring in *many* places in your
program, so the above would go into the "or other" category. For lazy
evaluation to take off, I think it has to be built into the language. I
doubt, though, that much effort would be saved, as a human is normally
lazy enough not to calculate things until absolutely needed.

Peter
