[Python-ideas] PEP draft: context variables

Guido van Rossum guido at python.org
Sun Oct 15 16:54:18 EDT 2017


All this arguing based on "equivalence" between different code fragments is
nuts. The equivalences were never meant to be exact, and people don't
typically reason about generator-based code in terms of these equivalences.

The key problem we're trying to address is creating a "context" abstraction
that can be controlled by async task frameworks without making the *access*
API specific to the framework. Because coroutines and generators are
similar under the covers, Yury demonstrated the issue with generators
instead of coroutines (which are unfamiliar to many people). And then
somehow we got hung up on fixing the problem in that example.
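
(For anyone who hasn't seen that demonstration, here is a minimal sketch of
the kind of leak involved, using the decimal module's thread-local context;
the helper name `rounded` is made up for illustration:)

    import decimal

    def rounded(values, prec):
        # Adjust the decimal context only while producing each value.
        with decimal.localcontext() as ctx:
            ctx.prec = prec
            for value in values:
                yield +value   # unary plus rounds to the active precision

    gen = rounded([decimal.Decimal("1.23456789")], 3)
    print(next(gen))                  # Decimal('1.23')
    # The generator is suspended inside its `with` block, so the adjusted
    # precision is still in effect in the caller's frame:
    print(decimal.getcontext().prec)  # 3, not the thread's previous value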

I want to back out of this entirely, because the problem (while it can be
demonstrated) is entirely theoretical, and the proposed solution is made
too complicated by attempting to solve the problem for generators as well
as for tasks.

--Guido

On Sat, Oct 14, 2017 at 10:43 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:

> On 15 October 2017 at 15:05, Guido van Rossum <guido at python.org> wrote:
>
>> I would like to reboot this discussion (again). It feels to me we're
>> getting farther and farther from solving any of the problems we might solve.
>>
>> I think we need to give up on doing anything about generators; the use
>> cases point in too many conflicting directions. So we should keep the
>> semantics there, and if you don't want your numeric or decimal context to
>> leak out of a generator, don't put `yield` inside `with`. (Yury and Stefan
>> have both remarked that this is not a problem in practice, given that there
>> are no bug reports or StackOverflow questions about this topic.)
>>
>
> Let me have another go at building up the PEP 550 generator argument from
> first principles.
>
> The behaviour that PEP 550 says *shouldn't* change is the semantic
> equivalence of the following code:
>
>     # Iterator form
>     class ResultsIterator:
>         def __init__(self, data):
>             self._itr = iter(data)
>         def __next__(self):
>             return calculate_result(next(self._itr))
>
>     results = ResultsIterator(data)
>
>     # Generator form
>     def _results_gen(data):
>         for item in data:
>             yield calculate_result(item)
>
>     results = _results_gen(data)
>
> This *had* been non-controversial until recently, and I still don't
> understand why folks suddenly decided we should bring it into question by
> proposing that generators should start implicitly capturing state at
> creation time just because it's technically possible for them to do so (yes
> we can implicitly change the way all generators work, but no, we can't
> implicitly change the way all *iterators* work).
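>
> (To make "implicitly capturing state at creation time" concrete, here is a
> sketch using the decimal context; the generator name is made up purely for
> illustration:)
>
>     import decimal
>
>     def _scaled_gen(data):
>         for item in data:
>             yield item * 2   # context-sensitive arithmetic
>
>     with decimal.localcontext() as ctx:
>         ctx.prec = 3
>         results = _scaled_gen([decimal.Decimal("1.23456789")])
>
>     # Today the multiplication runs under whatever context is active when
>     # next() is called (the caller's context here), exactly as it would in
>     # a hand-written __next__ method. Capture-at-creation would instead run
>     # it under the 3-digit context that happened to be active when the
>     # generator object was constructed.
>     print(next(results))   # Decimal('2.46913578'), not Decimal('2.47')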
>
> The behaviour that PEP 550 thinks *should* change is for the following
> code to become roughly semantically equivalent, given the constraint that
> the context manager involved either doesn't manipulate any shared state at
> all (already supported), or else only manipulates context variables (the
> new part that PEP 550 adds):
>
>     # Iterator form
>     class ResultsIterator:
>         def __init__(self, data):
>             self._itr = iter(data)
>         def __next__(self):
>             with adjusted_context():
>                 return calculate_result(next(self._itr))
>
>     results = ResultsIterator(data)
>
>     # Generator form
>     def _results_gen(data):
>         for item in data:
>             with adjusted_context():
>                 yield calculate_result(item)
>
>     results = _results_gen(data)
>
> Today, while these two forms look like they *should* be comparable,
> they're not especially close to being semantically equivalent, as there's
> no mechanism that allows for implicit context reversion at the yield point
> in the generator form.
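>
> (To make the discrepancy concrete, here is the same pair of forms with
> decimal.localcontext() standing in for adjusted_context():)
>
>     import decimal
>
>     class ResultsIterator:
>         def __init__(self, data):
>             self._itr = iter(data)
>         def __next__(self):
>             with decimal.localcontext() as ctx:
>                 ctx.prec = 3
>                 return +next(self._itr)
>
>     def _results_gen(data):
>         for item in data:
>             with decimal.localcontext() as ctx:
>                 ctx.prec = 3
>                 yield +item
>
>     data = [decimal.Decimal("1.23456789")]
>
>     next(ResultsIterator(data))
>     print(decimal.getcontext().prec)  # caller's own precision: __exit__ ran
>                                       # before __next__ returned
>
>     results = _results_gen(data)
>     next(results)
>     print(decimal.getcontext().prec)  # 3: the with block is still suspended
>                                       # at the yield, so the adjustment leaks
>
> (Under PEP 550, if decimal's context were stored in a context variable, the
> second print would also report the caller's own precision.)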
>
> While I think PEP 550 would still be usable without fixing this
> discrepancy, I'd be thoroughly disappointed if the only reason we decided
> not to do it was because we couldn't clearly articulate the difference in
> reasoning between:
>
> * "Generators currently have no way to reasonably express the equivalent
> of having a context-dependent return statement inside a with statement in a
> __next__ method implementation, so let's define one" (aka "context variable
> changes shouldn't leak out of generators, as that will make them *more*
> like explicit iterator __next__ methods"); and
> * "Generator functions should otherwise continue to be unsurprising
> syntactic sugar for objects that implement the regular iterator protocol"
> (aka "generators shouldn't implicitly capture their creation context, as
> that would make them *less* like explicit iterator __init__ methods").
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
>



-- 
--Guido van Rossum (python.org/~guido)
