[Python-ideas] PEP draft: context variables
M.-A. Lemburg
mal at egenix.com
Sun Oct 15 00:53:58 EDT 2017
On 15.10.2017 06:39, Nick Coghlan wrote:
> On 15 October 2017 at 05:47, Paul Moore <p.f.moore at gmail.com
> <mailto:p.f.moore at gmail.com>> wrote:
>
> On 14 October 2017 at 17:50, Nick Coghlan <ncoghlan at gmail.com
> <mailto:ncoghlan at gmail.com>> wrote:
> > If you capture the context eagerly, then there are fewer opportunities to
> > get materially different values from "data = list(iterable)" and "data =
> > iter(context_capturing_iterable)".
> >
> > While that's a valid intent for folks to want to be able to express, I
> > personally think it would be more clearly requested via an expression like
> > "data = iter_in_context(iterable)" rather than having it be implicit in the
> > way generators work (especially since having eager context capture be
> > generator-only behaviour would create an odd discrepancy between generators
> > and other iterators like those in itertools).
>
> OK. I understand the point here - but I'm not sure I see the practical
> use case for iter_in_context. When would something like that be used?
>
>
> Suppose you have some existing code that looks like this:
>
> results = [calculate_result(a, b) for a, b in data]
>
> If calculate_result is context dependent in some way (e.g. a & b might
> be decimal values), then eager evaluation of "calculate_result(a, b)"
> will use the context that's in effect on this line for every result.
>
> Now, suppose you want to change the code to use lazy evaluation, so that
> you don't need to bother calculating any results you don't actually use:
>
> results = (calculate_result(a, b) for a, b in data)
>
> In a PEP 550 world, this refactoring now has a side-effect that goes
> beyond simply delaying the calculation: since "calculate_result(a, b)"
> is no longer executed immediately, it will default to using whatever
> execution context is in effect when it actually does get executed, *not*
> the one that's in effect on this line.
>
> A context capturing helper for iterators would let you decide whether or
> not that's what you actually wanted by instead writing:
>
> results = iter_in_context(calculate_result(a, b) for a, b in data)
>
> Here, "iter_in_context" would indicate explicitly to the reader that
> whenever another item is taken from this iterator, the execution context
> is going to be temporarily reset back to the way it was on this line.
> And since it would be a protocol based iterator-in-iterator-out
> function, you could wrap it around *any* iterator, not just
> generator-iterator objects.
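[For illustration only: a rough sketch of what such an "iter_in_context" helper
could look like. No such function exists in the stdlib; this sketch is written
against the contextvars API that later shipped in Python 3.7 (PEP 567), not the
PEP 550 API under discussion, but the capture-at-call-site semantics are the same.]

```python
import contextvars

def iter_in_context(iterable):
    """Hypothetical helper: iterate over *iterable*, but advance it inside
    a snapshot of the execution context taken when this function is called."""
    # Capture the context active on the caller's line, eagerly.
    ctx = contextvars.copy_context()

    def wrapper():
        it = iter(iterable)
        while True:
            try:
                # Each step of the underlying iterator runs in the snapshot,
                # so context-dependent code sees the captured values.
                yield ctx.run(next, it)
            except StopIteration:
                return

    return wrapper()


# Demo: the wrapped generator sees the context from its definition site,
# while a plain generator sees whatever is current when it is consumed.
var = contextvars.ContextVar('v', default=0)
var.set(10)
wrapped = iter_in_context(var.get() + i for i in range(3))
plain = (var.get() + i for i in range(3))
var.set(100)
print(list(wrapped))  # [10, 11, 12] -- captured context, var == 10
print(list(plain))    # [100, 101, 102] -- current context, var == 100
```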
I have a hard time seeing the advantage of a default where
the context at the time of execution depends on where the
code happens to run rather than where it's defined.
IMO, the default should be to use the context where the line
was defined in the code, since that matches the intuitive
way of writing and defining code.
The behavior of deferring the context lookup to execution
time should be the non-standard, explicitly requested form,
so as not to break this intuition. Otherwise debugging will
be a pain, and writing fully working code will be really
hard in the face of changing contexts (e.g. when decimal
rounding changes in different parts of the code).
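To make that decimal hazard concrete: the same list-to-generator refactoring
already bites today with the decimal module's thread-local context. This is a
minimal illustration of the surprise, not PEP 550 code:

```python
from decimal import Decimal, localcontext

def divide_all(pairs):
    # Lazy: each division runs when the item is *consumed*,
    # under whatever decimal context is active at that moment.
    return (a / b for a, b in pairs)

pairs = [(Decimal(1), Decimal(3))]

with localcontext() as ctx:
    ctx.prec = 4
    eager = [a / b for a, b in pairs]  # computed here, with prec=4
    lazy = divide_all(pairs)           # nothing computed yet

# Consumed outside the with-block: the default precision (28) applies.
lazy_results = list(lazy)
print(eager)         # [Decimal('0.3333')]
print(lazy_results)  # [Decimal('0.3333333333333333333333333333')]
```

The refactoring silently changed the result, which is exactly the
kind of context-dependent surprise described above.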
--
Marc-Andre Lemburg
eGenix.com
Professional Python Services directly from the Experts (#1, Oct 15 2017)
>>> Python Projects, Coaching and Consulting ... http://www.egenix.com/
>>> Python Database Interfaces ... http://products.egenix.com/
>>> Plone/Zope Database Interfaces ... http://zope.egenix.com/
________________________________________________________________________
::: We implement business ideas - efficiently in both time and costs :::
eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48
D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
Registered at Amtsgericht Duesseldorf: HRB 46611
http://www.egenix.com/company/contact/
http://www.malemburg.com/