[Python-Dev] PEP 550 v4

Yury Selivanov yselivanov.ml at gmail.com
Tue Sep 5 20:31:13 EDT 2017

On Tue, Sep 5, 2017 at 4:59 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Yury Selivanov wrote:
>> Question: how to write a context manager with contextvar.new?
>>
>>     var = new_context_var()
>>
>>     class CM:
>>         def __enter__(self):
>>             var.new(42)
>>
>>     with CM():
>>         print(var.get() or 'None')
>>
>> My understanding is that the above code will print "None", because
>> "var.new()" makes 42 visible only to callees of __enter__.
> If you tie the introduction of a new scope for context vars
> to generators, as PEP 550 currently does, then this isn't a
> problem.
> But I'm trying to avoid doing that. The basic issue is that,
> ever since yield-from, "generator" and "task" are not
> synonymous.
> When you use a generator to implement an iterator, you
> probably want it to behave as a distinct task with its
> own local context. But a generator used with yield-from
> isn't a task of its own, it's just part of another task,
> and there is nothing built into Python that lets you
> tell the difference automatically.
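For comparison, with the contextvars API that ultimately shipped in Python 3.7 (PEP 567), ContextVar.set() is visible to the caller of __enter__ as well as to its callees, so the equivalent context manager does make 42 visible inside the with block. A minimal sketch (PEP 567 API, not the var.new() proposed here):

```python
import contextvars

var = contextvars.ContextVar('var')

class CM:
    def __enter__(self):
        # Unlike the proposed var.new(), ContextVar.set() is visible
        # to the caller of __enter__, not only to its callees.
        self._token = var.set(42)
        return self

    def __exit__(self, *exc):
        var.reset(self._token)

with CM():
    print(var.get(None))  # prints 42
```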

Greg, have you seen this new section:

It has a couple of examples that illustrate some issues with the "But
a generator used with yield-from isn't a task of its own, it's just
part of another task," reasoning.

In principle, we can modify PEP 550 to make 'yield from' transparent
to context changes.  The interpreter can just reset
g.__logical_context__ to None whenever 'g' is being 'yield-frommed'.
The key issue is that there are a couple of edge cases where these
semantics are problematic.  The bottom line is that it's easier to
reason about context when it's guaranteed that context changes are
always isolated in generators, no matter what.  I think these
semantics actually make refactoring easier.  Please take a look at
the linked section.
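To see why the isolation guarantee matters, here is a runnable illustration of the status quo (thread-local decimal context, no PEP 550): a change made inside a suspended generator leaks out to the caller until the generator is finalized.

```python
import decimal

def gen():
    with decimal.localcontext() as ctx:
        ctx.prec = 5
        yield

g = gen()
next(g)
# The generator is suspended inside its 'with' block, so the
# thread-local decimal context change is visible to the caller:
print(decimal.getcontext().prec)  # 5 -- leaked while suspended

g.close()  # unwinds the generator; localcontext() restores the old context
print(decimal.getcontext().prec)  # 28 (the default precision)
```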

> So I'm now thinking that the introduction of a new local
> context should also be explicit.
> Suppose we have these primitives:
>    push_local_context()
>    pop_local_context()
> Now introducing a temporary decimal context looks like:
>    push_local_context()
>    ctx = decimal.getcontext().copy()
>    ctx.prec = 5
>    decimal.localcontextvar.new(ctx)
>    do_some_calculations()
>    pop_local_context()
> Since calls (either normal or generator) no longer automatically
> result in a new local context, we can easily factor this out into
> a context manager:
>    class LocalDecimalContext():
>       def __enter__(self):
>          push_local_context()
>          ctx = decimal.getcontext().copy()
>          decimal.localcontextvar.new(ctx)
>          return ctx
>       def __exit__(self, *exc):
>          pop_local_context()
> Usage:
>    with LocalDecimalContext() as ctx:
>       ctx.prec = 5
>       do_some_calculations()
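As a rough illustration of what these primitives would do (the names and semantics below are assumptions sketched from the snippet above, not PEP 550 API), a stack of mappings gives the shadowing behavior Greg describes:

```python
# Toy model of push_local_context()/pop_local_context(): a stack of
# dicts, where inner scopes shadow outer ones.  Purely illustrative.
_stack = [{}]

def push_local_context():
    _stack.append({})

def pop_local_context():
    _stack.pop()

def ctx_set(name, value):
    _stack[-1][name] = value

def ctx_get(name, default=None):
    # Look up the name from the innermost scope outwards.
    for scope in reversed(_stack):
        if name in scope:
            return scope[name]
    return default

ctx_set('prec', 28)
push_local_context()
ctx_set('prec', 5)      # shadows the outer value
print(ctx_get('prec'))  # 5
pop_local_context()
print(ctx_get('prec'))  # 28 again
```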

This will have some performance implications and make the API way more
complex. But I'm not convinced yet that real-life code needs the
semantics you want.

This will work with the current PEP 550 design:

     def g():
         with DecimalContext() as ctx:
             ctx.prec = 5
             yield from do_some_calculations()  # will run with the correct ctx
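A present-day analogue of this pattern, using decimal's own context manager, behaves the same way (do_some_calculations here is a stand-in generator, not code from the PEP):

```python
import decimal

def do_some_calculations():
    # Runs under whatever decimal context the caller has established.
    yield +(decimal.Decimal(1) / decimal.Decimal(3))

def g():
    with decimal.localcontext() as ctx:
        ctx.prec = 5
        yield from do_some_calculations()  # runs with prec=5

gen = g()
print(next(gen))  # 0.33333 -- five significant digits
gen.close()       # unwinds the 'with', restoring the outer context
```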

the only thing that won't work is this:

     def do_some_calculations():
         ctx = DecimalContext()
         ctx.prec = 10
         decimal.setcontext(ctx)  # set without a 'with' block
         yield

     def g():
         yield from do_some_calculations()
         # Context changes in do_some_calculations() will not leak to g()

In the above example, do_some_calculations() deliberately tries to
leak context changes (by not using a context manager). And I consider
it a feature that PEP 550 does not allow generators to leak state.

If you write code that uses 'with' statements consistently, you will
never even know that context changes are isolated in generators.
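The flavor of this isolation guarantee can be previewed with the Context machinery that eventually shipped as contextvars (PEP 567): code run inside a copied context keeps its changes from leaking out. A minimal sketch:

```python
import contextvars

var = contextvars.ContextVar('var', default='outer')

def mutate():
    var.set('inner')  # change made inside the copied context
    return var.get()

ctx = contextvars.copy_context()
print(ctx.run(mutate))  # inner
print(var.get())        # outer -- the change did not leak out
```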
