[Python-ideas] lazy tuple unpacking
Ian Cordasco
graffatcolmingov at gmail.com
Sat Jul 5 03:19:01 CEST 2014
On Fri, Jul 4, 2014 at 7:59 PM, Paul Tagliamonte <paultag at gmail.com> wrote:
> Given:
>
> >>> def g_range(n):
> ...     for y in range(n):
> ...         yield y
> ...
>
> I notice that:
>
> >>> a, b, c, *others = g_range(100)
>
> Works great. Super useful stuff there. Looks good.
>
> I also notice that this causes *others to consume the generator
> in a greedy way.
>
> >>> type(others)
> <class 'list'>
>
> And this makes me sad.
>
> >>> a, b, c, *others = g_range(10000000000)
> # will also make your machine very sad. Eventually resulting
> # (ok, unless you've got a really fancy bit of kit) in:
> Killed
>
> Really, the behavior (I think) should be more similar to:
>
> >>> _x = g_range(1000000000000)
> >>> a = next(_x)
> >>> b = next(_x)
> >>> c = next(_x)
> >>> others = _x
> >>>
>
>
> Of course, this leads to all sorts of fun errors, like the fact that you
> couldn't iterate over it twice. This might not be expected. However, it
> might be nice to have this behavior when you're unpacking a generator.
>
> Thoughts?
I agree that the behaviour is suboptimal, but, as Chris already pointed
out, it would introduce a significant inconsistency into how unpacking
behaves. I'm struggling to see a *good* way of doing this. My first
instinct was that we could make something like this do what you
expect:
>>> a, b, c, others = g_range(some_really_big_number)
>>> others
<generator ...>
But this doesn't work today: Python raises a ValueError because there
are too many values to unpack. I'm also against introducing new syntax
to add this behaviour.
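For what it's worth, something close to the lazy behaviour is already
expressible as an ordinary helper, no new syntax needed. The lazy_unpack
function below is only a sketch of one way to do it (the name and the error
message are made up here, not anything proposed in the thread), reusing the
g_range generator defined above:

from itertools import islice

def lazy_unpack(iterable, n):
    # Sketch of a helper: take the first n items eagerly and hand back
    # the still-unconsumed iterator for everything else.
    it = iter(iterable)
    head = list(islice(it, n))
    if len(head) < n:
        raise ValueError("not enough values to unpack "
                         "(expected at least %d, got %d)" % (n, len(head)))
    return tuple(head) + (it,)

>>> a, b, c, others = lazy_unpack(g_range(10000000000), 3)
>>> a, b, c
(0, 1, 2)
>>> next(others)   # the rest is never materialised into a list
3

The obvious trade-off is the one Paul already notes: others is a one-shot
iterator, not a list, so you can only walk it once.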