[Python-ideas] PEP draft: context variables

Nick Coghlan ncoghlan at gmail.com
Sun Oct 15 01:13:05 EDT 2017

On 15 October 2017 at 14:53, M.-A. Lemburg <mal at egenix.com> wrote:

> On 15.10.2017 06:39, Nick Coghlan wrote:
> > On 15 October 2017 at 05:47, Paul Moore <p.f.moore at gmail.com> wrote:
> >
> >     On 14 October 2017 at 17:50, Nick Coghlan <ncoghlan at gmail.com> wrote:
> >     > If you capture the context eagerly, then there are fewer
> >     > opportunities to get materially different values from
> >     > "data = list(iterable)" and "data = iter(context_capturing_iterable)".
> >     >
> >     > While that's a valid intent for folks to want to be able to express,
> >     > I personally think it would be more clearly requested via an
> >     > expression like "data = iter_in_context(iterable)" rather than having
> >     > it be implicit in the way generators work (especially since having
> >     > eager context capture be generator-only behaviour would create an odd
> >     > discrepancy between generators and other iterators like those in
> >     > itertools).
> >
> >     OK. I understand the point here - but I'm not sure I see the practical
> >     use case for iter_in_context. When would something like that be used?
> >
> >
> > Suppose you have some existing code that looks like this:
> >
> >     results = [calculate_result(a, b) for a, b in data]
> >
> > If calculate_result is context dependent in some way (e.g. a & b might
> > be decimal values), then eager evaluation of "calculate_result(a, b)"
> > will use the context that's in effect on this line for every result.
> >
> > Now, suppose you want to change the code to use lazy evaluation, so that
> > you don't need to bother calculating any results you don't actually use:
> >
> >     results = (calculate_result(a, b) for a, b in data)
> >
> > In a PEP 550 world, this refactoring now has a side-effect that goes
> > beyond simply delaying the calculation: since "calculate_result(a, b)"
> > is no longer executed immediately, it will default to using whatever
> > execution context is in effect when it actually does get executed, *not*
> > the one that's in effect on this line.
> >
> > A context capturing helper for iterators would let you decide whether or
> > not that's what you actually wanted by instead writing:
> >
> >     results = iter_in_context(calculate_result(a, b) for a, b in data)
> >
> > Here, "iter_in_context" would indicate explicitly to the reader that
> > whenever another item is taken from this iterator, the execution context
> > is going to be temporarily reset back to the way it was on this line.
> > And since it would be a protocol based iterator-in-iterator-out
> > function, you could wrap it around *any* iterator, not just
> > generator-iterator objects.
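To make the idea concrete, here is a minimal sketch of what such an
iter_in_context helper could look like. It uses the contextvars machinery
that was later standardised as PEP 567 rather than the PEP 550 draft API,
and the helper itself is hypothetical, not part of the stdlib:

```python
import contextvars

def iter_in_context(iterable):
    """Hypothetical helper: capture the current execution context
    eagerly, and run every subsequent __next__ call in that context."""
    ctx = contextvars.copy_context()  # captured when the helper is called
    itr = iter(iterable)
    def _wrapper():
        while True:
            try:
                # Switch back to the captured context for each item;
                # any context changes the iterator makes persist in ctx.
                yield ctx.run(next, itr)
            except StopIteration:
                return
    return _wrapper()
```

Because it is a protocol-based iterator-in-iterator-out function, it works
on any iterator: later changes to a context variable in the calling code
are not seen by the wrapped iterator, since each item is produced in the
context that was active when iter_in_context was called.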
> I have a hard time seeing the advantage of having a default
> where the context at the time of execution is dependent on
> where it happens rather than where it's defined.

The underlying rationale is that the generator form should continue to be
as close as we can reasonably make it to being pure syntactic sugar for the
iterator form:

    class ResultsIterator:
        def __init__(self, data):
            self._itr = iter(data)
        def __iter__(self):
            return self
        def __next__(self):
            a, b = next(self._itr)
            return calculate_result(a, b)

    results = ResultsIterator(data)

The logical context adjustments in PEP 550 then serve to make using a with
statement around a yield expression in a generator closer in meaning to
using one around a return statement in a __next__ method implementation.
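A concrete sketch of that parallel, using the decimal context as the
context-dependent state (the names and the prec=4 setting are purely
illustrative):

```python
import decimal

# Generator form: a with statement around the yield expression
def results_gen(data):
    with decimal.localcontext() as ctx:
        ctx.prec = 4  # illustrative context-dependent setting
        for a, b in data:
            yield a / b

# Explicit iterator form: the same with statement around the return
class ResultsIter:
    def __init__(self, data):
        self._itr = iter(data)
    def __iter__(self):
        return self
    def __next__(self):
        a, b = next(self._itr)
        with decimal.localcontext() as ctx:
            ctx.prec = 4
            return a / b
```

The asymmetry PEP 550 targets is that in the generator form the with
statement stays open across suspensions at yield, so without logical
context isolation the modified decimal context can be visible to the
caller while the generator is suspended, whereas the __next__ form enters
and exits the context afresh on every call.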

> IMO, the default should be to use the context where the line
> was defined in the code, since that matches the intuitive
> way of writing and defining code.

This would introduce a major behavioural discrepancy between generators and
other iterators.

> The behavior of also deferring the context to time of
> execution should be the non-standard form to not break
> this intuition, otherwise debugging will be a pain and
> writing fully working code would be really hard in the
> face of changing contexts (e.g. say decimal rounding
> changes in different parts of the code).

No, it really wouldn't, since "the execution context is the context that's
active when the code is executed" is relatively easy to understand based
entirely on the way functions, methods, and other forms of delayed
execution work (including iterators).

"The execution context is the context that's active when the code is
executed, *unless* the code is in a generator, in which case, it's the
context that was active when the generator-iterator was instantiated" is
harder to follow.
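The eager-versus-lazy distinction is straightforward to demonstrate with
the decimal example from earlier in the thread (a sketch; calculate_result
stands in for any context-dependent operation):

```python
import decimal

def calculate_result(a, b):
    # The result depends on the decimal context active when this *runs*
    return a / b

data = [(decimal.Decimal(1), decimal.Decimal(3))]

with decimal.localcontext() as ctx:
    ctx.prec = 4
    eager = [calculate_result(a, b) for a, b in data]  # evaluated here, prec=4
    lazy = (calculate_result(a, b) for a, b in data)   # nothing evaluated yet

# Consumed outside the with block, the lazy form uses whatever context
# is active now (the default 28-digit precision), not prec=4:
assert str(eager[0]) == "0.3333"
assert str(next(lazy)) != "0.3333"
```

No special generator rule is needed to explain this: the comprehension
body ran inside the with block, while the generator expression body ran
at consumption time, exactly as any other deferred call would.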


Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia