return value of yield expressions

Hi,

I should start off by saying that my knowledge of PEP342 is less than a week old, and I cannot claim any use-case for what I am about to propose.

The PEP itself says: "In effect, a yield-expression is like an inverted function call; the argument to yield is in fact returned (yielded) from the currently executing function, and the "return value" of yield is the argument passed in via send()."

One could, I guess, wrap a function to be 'called' via this mechanism:

    def genwrap(func):
        def gen():
            ret = None
            while True:
                args, kwds = (yield ret)
                ret = func(*args, **kwds)
                del args, kwds
        g = gen()
        g.next()
        return g

    f = genwrap(func)
    f.send( (args, kwds) )

However, 'send' takes only one argument (and hence the poor syntax in the last statement).

Has the option of treating the return value of the yield expression very much like function arguments been discussed?

    (a1, a2, a3 = default, *args, **kwds) = (yield ret)

'send' could support positional and keyword arguments. I guess this particular form would break backward compatibility, but there might be other alternatives.

Regards,
Krishnan
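For concreteness, a minimal, self-contained sketch (not part of the original message) of driving a plain function through a generator's send() the way genwrap does; the add function is an illustrative assumption, and it is spelled for Python 3, so next(g) replaces g.next():

    def genwrap(func):
        def gen():
            ret = None
            while True:
                args, kwds = (yield ret)   # send() delivers a single (args, kwds) tuple
                ret = func(*args, **kwds)
        g = gen()
        next(g)                            # prime the generator up to the first yield
        return g

    def add(x, y, scale=1):
        return (x + y) * scale

    f = genwrap(add)
    print(f.send(((1, 2), {})))              # -> 3
    print(f.send(((1, 2), {'scale': 10})))   # -> 30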

Hi

On 2011-09-13 05:24, H. Krishnan wrote:
That's OK. Welcome to the wonderful world of PEP342 generators :)
There are really two independent proposals here. I'll call them "function argument unpacking" and "allow arbitrary arguments to send".

To start with the latter, I don't think that is going to fly. All arguments to send must be returned as a single value by yield, or most of the examples in the PEP380 discussion would fail. Allowing arbitrary arguments to send would have to change that value in a backward-incompatible way when sending a single argument.

However, you don't really need to change send itself to improve the situation. A small helper function like:

    def send(*args, **kwds):
        if not args:
            raise TypeError(
                'send() takes at least 1 positional argument (0 given)'
                )
        return args[0].send((args[1:], kwds))

would turn your "f.send( (args, kwds) )" into "send(f, *args, **kwds)", which is already much nicer syntax. I would be +0 on adding such a helper as a builtin, similar to the "next" builtin.

Your other proposal is really independent of generators, I think. I too would like to see a way to do "function argument unpacking".

    args, kwds = (yield ret)  # any expression really
    (a1, a2, a3, *args), kwds = (lambda a1, a2, a3=default, *args, **kwds:
                                 ((a1, a2, a3) + args, kwds))(*args, **kwds)

is about the shortest way I can come up with that works today, and that is way too much repetition for my taste. If locals() were writable (a change that is unlikely to happen, btw), this could be reduced to:

    args, kwds = (yield ret)  # any expression really
    locals().update(
        (lambda a1, a2, a3=default, *args, **kwds: locals())(*args, **kwds)
        )

which would avoid some of the duplication but is not really that much more readable.

Your version seems like a nice extension of the regular tuple unpacking for the special case of (args, kwds) tuples. The main problem is how to distinguish the normal tuple unpacking from the new form. E.g. does

    a, b = (), {'a':1, 'b':2}

result in a==() and b=={'a':1, 'b':2}, or in a==1 and b==2? (It would have to be the first, for obvious backward-compatibility reasons.) A possible solution would be:

    *(a1, a2, a3 = default, *args, **kwds) = (yield ret)

I.e., let '*(argument list) = args, kwds' be the syntax for "function argument unpacking". I am not sure if this is parseable within the constraints that we have for the python parser, but I think it would be. It is currently invalid syntax, so it should be backwards compatible.

To summarize:
-1 to "allow arbitrary arguments to generator.send".
+0 to adding a builtin "send" helper function as described above.
+1 for "function argument unpacking".

Best regards
- Jacob
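A short sketch (not from the thread) of how such a helper would pair with a generator whose yield expression unpacks the (args, kwds) tuple; the averager generator below is an illustrative assumption:

    def send(*args, **kwds):
        if not args:
            raise TypeError('send() takes at least 1 positional argument (0 given)')
        return args[0].send((args[1:], kwds))

    def averager():
        total = count = 0
        ret = None
        while True:
            args, kwds = (yield ret)          # the single value delivered by send()
            total += sum(args) * kwds.get('weight', 1)
            count += len(args)
            ret = total / count

    g = averager()
    next(g)
    print(send(g, 10, 20))          # -> 15.0
    print(send(g, 30, weight=2))    # -> 30.0, i.e. (10 + 20 + 60) / 3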

Jacob Holm, 13.09.2011 13:02:
Note that recent Python versions support extended argument unpacking, so this works:

    a1, *other, a2 = return_some_sequence()

If you use the last value of the returned sequence (such as a tuple) to pass a dict, or if you return a tuple with two arguments (posargs, kwargdict), you basically get what you wanted above.

Time machine keys are back where they were. Nothing to see here, keep passing.

Stefan
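A small illustration (not Stefan's code) of the two-tuple (posargs, kwargdict) convention he suggests, combined with extended unpacking:

    def produce():
        return (1, 2, 3), {'scale': 10}

    posargs, kwargdict = produce()
    first, *rest = posargs
    print(first, rest, kwargdict.get('scale', 1))   # -> 1 [2, 3] 10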

Hi Stefan

On 2011-09-13 14:21, Stefan Behnel wrote:
If you look closer you'll see I am actually using that feature already in the code snippet you quoted.
Basically, no. The suggested "function argument unpacking" includes support for default values, and for passing positional arguments by name -- everything that happens when you call a function using (*args, **kwds), really.
Time machine keys are back where they were. Nothing to see here, keep passing.
I disagree. You overlooked most of the requested feature... - Jacob

In the particular situation of send() --> yield, one could do something like (forgetting the backward compatibility with respect to the return value):

    gen.send(2, a2=3, e=5)

and in the gen:

    (a1, a2, a3 = 3, *args, **kwds) = (yield <expr>)

without having to do whatever Jacob has written:

    args, kwds = (yield <expr>)
    (a1, a2, a3, *args), kwds = (lambda a1, a2, a3=3, *args, **kwds:
                                 ((a1, a2, a3) + args, kwds))(*args, **kwds)

On Tue, Sep 13, 2011 at 6:00 AM, H. Krishnan <hetchkay@gmail.com> wrote:
In the particular situation of send() --> yield, one could do something like (forgetting the backward compatibility with respect to the return value):
gen.send(2, a2=3, e=5)
and in the gen: (a1, a2, a3 = 3, *args, **kwds) = (yield <expr>)
What on earth is this syntax supposed to mean? Never mind that there's a yield on the RHS; it is just a parenthesized expression so it could be any other function. What on earth do you expect to happen with the syntax on the left, i.e. with the part

    (a1, a2, a3 = 3, *args, **kwds) = ........whatever........

???????

It seems to me that changes to send() could be proposed that allow .send() without argument or with multiple (positional) arguments, but this would just make the yield expression return a tuple. It is not unprecedented that foo() == foo(None), nor is it inconceivable that bar(x, y) == bar((x, y)). This is not so different from the equivalence between "x = 1, 2" and "x = (1, 2)". Do note that there's perhaps a bit of an odd case where g.send((1,)) should yield a 1-tuple, whereas g.send(1,) would be equivalent to g.send(1) and hence just send the value 1, not wrapped in a tuple.

That this wasn't proposed in the original PEP 342 was probably just a matter of keeping things simple -- which I am still in favor of, and barring a lot more evidence of how incredibly useful the proposed enhancement would be (with *real* use cases, not made-up examples!) I will remain -0 on the proposal of allowing different argument counts to .send().

But this business with argument unpacking syntax is different. It seems poorly thought through. Syntax proposals like this often fail because the proposer does not actually understand how Python's parser works and how it is constrained, intentionally, to limited look-ahead and no compile-time knowledge of the types of the values being passed around at runtime.

Also keep in mind orthogonality -- (yield <expr>) should be usable anywhere: in an expression, in an if-statement, in a return statement, in a function argument, in an index, as an operand of a built-in operator, etc. You can't have a grammar where the left-hand side of the assignment symbol has a different syntax depending on whether the right-hand side contains a (yield <expr>) or not.

--
--Guido van Rossum (python.org/~guido)
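For reference, today's behaviour that the paragraph above appeals to, in a minimal sketch of my own (the echo generator is illustrative):

    def echo():
        received = None
        while True:
            received = yield received

    g = echo()
    next(g)
    print(g.send((1,)))    # a 1-tuple goes through as a single value: (1,)
    x = 1, 2               # same as x = (1, 2)
    print(x)
    try:
        g.send(1, 2)       # today: TypeError, send() takes exactly one argument
    except TypeError as exc:
        print(exc)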

*(a1, a2, a3 = 3, *args, **kwds) = (yield expr)

and with g.send(*a, **k) being called, a1, a2, a3, args, kwds can be inferred using the same semantics that is used to decipher a1, a2, a3, args, kwds in a call to the following function with *a and **k as arguments:

    def func(a1, a2, a3=3, *args, **kwds):
        ...

That (yield expr) can be used in expressions does not (I feel) affect this (particularly if we go with Jacob's suggestion of this being a general unpacking option), just in the same way that "a, b, *args, c = <tuple>" does not affect using tuples in expressions.

But after reading your comment in another thread that those who don't know how Python works should keep quiet, I guess it is best if I end this here :-)

On Tue, Sep 13, 2011 at 10:21 AM, H. Krishnan <hetchkay@gmail.com> wrote:
It's not a backwards compatibility issue. Maybe I'm dense, but *I do not understand what it means in your proposal.*
Suppose we use Jacob's syntax (for argument's sake)
*(a1, a2, a3 = 3, *args, **kwds) = (yield expr)
I still don't follow. Can you show a specific argument list to send, e.g.

    g.send(1, 2, 3, foo='a', bar='b')

and then tell me what the values of a1, a2, a3, args and kwds will be? And, since X = Y should always be equivalent to

    tmp = Y
    X = tmp

can you also tell me what value is supposed to be produced by (yield expr)? I.e. if I did this:

    tmp = (yield expr)
    *(a1, a2, a3 = 3, *args, **kwds) = tmp

what would the value of tmp be?
I understand "a, b, *c = <something>". I do not understand three things in your example:

    *(....) = <something>                    # What is the meaning of the prefix *?
    *(.... = .....) = <something>            # What does an = inside parentheses mean?
    *(.... = ......, **kwds) = <something>   # What does **kwds mean in this context?

--
--Guido van Rossum (python.org/~guido)

On Tue, Sep 13, 2011 at 10:57 AM, Georg Brandl <g.brandl@gmx.net> wrote:
That sounds impossible to implement. If there should be special-case treatment, it should use some other form of syntax (e.g. "yield <expr> as <var>"). But I really don't see why this is so important that we should change so much of the syntax of the language. -- --Guido van Rossum (python.org/~guido)

On 2011-09-13 20:07, Guido van Rossum wrote:
The '*(argument list) = <expression>' syntax was my attempt at fixing some of the problems in the original proposal, i.e. to remove the restriction that it should only work with "= yield", and to distinguish the "function argument unpacking" from the regular tuple unpacking.

My idea was to allow

    *(argument list) = (args, kwds)

where the LHS follows all the rules of a function argument list (the part of a function definition between the name and the colon), and the RHS may be any expression that returns a 2-tuple consisting of a sequence and a dictionary. The intended semantics was to compute the arguments that a function with the given argument list would see when called with (*args, **kwds), and make the necessary assignments in the local scope. Any way I can think of that works today violates DRY.

A simple example:

    *(a, b, c=42, *args, **kwds) = ((1,), {'b':2, 'd':3})

assigns a=1, b=2, c=42, args=(), kwds={'d':3}. And my best hack so far to get the same effect today is:

    args, kwds = ((1,), {'b':2, 'd':3})
    a, b, c, args, kwds = \
        (lambda a, b, c=42, *args, **kwds: (a, b, c, args, kwds))(*args, **kwds)

I am not wed to the particular suggested syntax, but I would like to see the functionality available *somehow* in a way where you don't have to repeat the sequence of variable names. And I don't think that is possible without *some* syntax change.

- Jacob
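For what it's worth, the intended semantics can be approximated with the inspect module in current Python (inspect.signature and apply_defaults postdate this thread); the helper below is a sketch of mine, and as Jacob notes, the parameter names still have to be repeated wherever the resulting values are used:

    import inspect

    def unpack_as_arguments(spec, args, kwds):
        """Bind (args, kwds) against spec's parameter list, apply defaults,
        and return a name -> value mapping -- roughly what
        *(a, b, c=42, *args, **kwds) = (args, kwds) was meant to assign."""
        bound = inspect.signature(spec).bind(*args, **kwds)
        bound.apply_defaults()
        return dict(bound.arguments)

    values = unpack_as_arguments(lambda a, b, c=42, *args, **kwds: None,
                                 (1,), {'b': 2, 'd': 3})
    print(values)   # {'a': 1, 'b': 2, 'c': 42, 'args': (), 'kwds': {'d': 3}}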

On Tue, Sep 13, 2011 at 11:35 AM, Jacob Holm <jh@improva.dk> wrote:
Thanks, I finally understand. Part of the reason why it was unclear to me was that I was reading the form inside *(...) as having the same priority rules as a regular assignment, where "," binds tighter than "="; whereas in a function argument list "=" binds tighter than ",".
I agree that you can't do this today. But so what? I am not so convinced that DRY is always the most important rule to invoke. Apart from the extremely esoteric example of wanting to call g.send() with a mix of positional and keyword arguments that are somehow interpreted as a completely general function parameter list by the receiving yield expression, what is the use case? -- --Guido van Rossum (python.org/~guido)

On 2011-09-13 20:53, Guido van Rossum wrote:
I knew it had to be something like that. :)
I am sure it isn't. I would also have used "readability counts", but your earlier comments have shown that that argument is probably not carrying much weight either in this case. :)
See that's my main problem. I know I have run into this need several times in the past, but for the life of me I can't remember the details. I remember it had something to do with decorators, but nothing beyond that. At the moment I just like the idea because it seems like this is a case where something should be really easy, but isn't. I'll let it rest until and unless I remember what the actual use was. - Jacob

On Sep 13, 2011 8:53 PM, "Guido van Rossum" <guido@python.org> wrote:
I've only skimmed this so I might be way off track, but isn't this capability (although not the syntax) already covered to an extent by PEP 362?
But so what? I am not so convinced that DRY is always the most important rule to invoke.

On Wed, Sep 14, 2011 at 8:26 AM, David Townshend <aquavitae69@gmail.com> wrote:
Yep, I was going to mention PEP 362's Signature.bind() as well - the function signature objects in that PEP are closely tied in to the issues described in this thread. Sure, you can't easily unpack them into local variables, but you could fairly easily use them to pass formatted data into a coroutine.

As far as non-coroutine use cases go, the main benefit (explicitly mentioned in PEP 362) lies in generating signature details for wrapper functions that go beyond the current typical "*args, **kwds". A tool like functools.partial(), for instance, could take the signature object for the function being wrapped, remove the values already supplied, and then set the result as the signature for the created object.

Another thing Signature.bind() in particular is useful for is prevalidating arguments for delayed calls - you can check that the parameters at least match the function being called immediately, while deferring the actual invocation of the function until later.

Cheers, Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
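A rough sketch (my own; the worker and delayed names are illustrative) of the "prevalidate now, call later" pattern using the Signature.bind() machinery that eventually landed in the inspect module:

    import inspect
    from functools import partial

    def worker(task_id, payload, *, retries=3):
        return (task_id, payload, retries)

    def delayed(func, *args, **kwds):
        # bind() raises TypeError immediately if the arguments don't match,
        # while the actual call is deferred until the partial is invoked.
        bound = inspect.signature(func).bind(*args, **kwds)
        return partial(func, *bound.args, **bound.kwargs)

    call = delayed(worker, 42, payload='data', retries=5)   # checked now
    print(call())                                           # invoked later -> (42, 'data', 5)

    try:
        delayed(worker, 42)       # missing 'payload' -> TypeError raised immediately
    except TypeError as exc:
        print(exc)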

My idea was to allow
*(argument list) = (args, kwds)
May be tuple unpacking can be extended to support: (a,b,c,*args,d,**kwds) = p,q,*a1,r,s,**k1 where "p,q,*a,r,s" is equivalent to itertools.chain((p,q),a,(r,s)) (something like scheme's ",@") And in parallel, function argument unpacking could be extended to support def func(a,b,c,*args,d,**kwds): pass But I guess this would not fly: 1. I am not sure about how default values in function arguments would be handled 2. PEP 3113 disallows tuple unpacking as part of function arguments 3. Currently, *args is populated with a list (and not tuple) in tuple unpacking and possibly a dozen other issues. Krishnan

In general, I wonder if

    a, b, *args, c, *args2, d

could be syntactic sugar for

    itertools.chain((a, b), args, (c,), args2, (d,))

Function calls could then support multiple *args and intersperse *args with other positional arguments.

Krishnan
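A tiny illustration (mine, not from the message) of what the proposed sugar would expand to with itertools.chain:

    import itertools

    a, b, c, d = 1, 2, 5, 8
    args = (3, 4)
    args2 = (6, 7)

    # What "a, b, *args, c, *args2, d" would desugar to:
    flat = tuple(itertools.chain((a, b), args, (c,), args2, (d,)))
    print(flat)   # -> (1, 2, 3, 4, 5, 6, 7, 8)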

Jacob Holm, 13.09.2011 14:32:
Ah, ok, I thought you were *proposing* to add this. That happens on python-ideas a lot more often than you seem to expect.
Well, I really don't see how this is a wide-spread use case (I certainly never stumbled over it), but if you feel like needing it, write a utility function that does the unpacking for you in a couple of lines and wrap the call with that. I have my doubts that it would make your code much clearer.

Especially default values for keyword dict return values do not appear to be of any use to me, given that you'd most likely unpack them one by one anyway. So you could just use d.get() with a default argument there, thus making it explicit and obvious in your code what is going on.

Stefan

On 2011-09-13 15:00, Stefan Behnel wrote:
I remember running into this exact problem at one point when working with decorators, but don't remember the details.

IIRC you really can't do much in terms of writing a helper function. You can get far, but not all the way.

    a, b, c, args, kwds = helper(
        (lambda a, b, c=3, *args, **kwds: locals()),
        (args, kwds)
        )

is IIRC the best you can do. You need the tuple-unpacking assignment to get the names into locals, and you need an actual function (the lambda) to specify the arguments in a natural way. You still end up repeating the names of each argument, which is a DRY violation. No helper function can get you even close to the readability and non-DRYness of

    *(a, b, c=42, *args, **kwds) = (args, kwds)
The point is to be able to get from a (args, kwds) tuple to some actual locally assigned names, based on the rules we already know and love from function arguments. I know this is a weak argument, based on purity rather than practicality, but until I run into the issue again I really can't give you an example. - Jacob
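The helper Jacob mentions above is never spelled out in the thread; here is a minimal sketch of what it might look like (the implementation, and its use of inspect.signature, are my assumptions):

    import inspect

    def helper(spec, call):
        """Call spec with the (args, kwds) pair and return its locals()
        in parameter-declaration order, ready for tuple unpacking."""
        args, kwds = call
        ns = spec(*args, **kwds)                   # spec's body returns locals()
        names = inspect.signature(spec).parameters
        return tuple(ns[name] for name in names)

    args, kwds = (1, 2), {'d': 4}
    a, b, c, args, kwds = helper(
        (lambda a, b, c=3, *args, **kwds: locals()),
        (args, kwds)
        )
    print(a, b, c, args, kwds)   # -> 1 2 3 () {'d': 4}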

Hi

On 2011-09-13 15:03, H. Krishnan wrote:
Not quite sure I understand the question, but I'll try to answer anyway.

You need to be able to use "var1 = yield var2" in the generator independently of what send is called with, or you can't wrap a generator using another generator (different ways of doing that were a large part of the PEP380 discussion). So what should "var1" be in each of the following cases?

    g.send()            # currently illegal
    g.send(None)        # None
    g.send((), {})      # currently illegal
    g.send(((), {}))    # ((), {})

I suppose you could change it so passing exactly one argument did something different from passing zero or multiple arguments, but then you have a problem distinguishing when you actually want to use the value in the other end.

Hope this helps.

- Jacob

Let us consider it from a function point of view. Suppose there was no support for *args and **kwds and you wrote:

    def func(a):
        do_something_with_a

and this was called with:

    func((1,2))

If *args, **kwds support was subsequently added to functions, would anything related to 'func' need to change?

Thus, if "var = yield <expr>" is used, send() needs to be called with only one argument (or as send(var=x)). If "var1, var2 = yield <expr>" is used, send() needs to be called with two arguments. If "yield <expr>" is used, send() can be called with any number of arguments/keywords etc.

Krishnan

On 2011-09-13 17:09, H. Krishnan wrote:
I think I get your point. I don't think you are getting mine. Consider this example:

    def foo():
        *(a, b) = yield

    f = foo()
    next(f)
    f.send(1, 2)   # Ok

    def wrapper(g):
        r = next(g)
        while 1:
            args = (yield r)
            r = g.send(args)   # BOOM!

    f = wrapper(foo())
    next(f)
    f.send(1, 2)   # BOOM!

That wrapper works today, and does nothing (except messing up throw handling, but that is a different story). With your suggested change, such a wrapper would break unless it was changed to use

    *(*args, **kwargs) = (yield r)

instead. IOW a backwards incompatible change.

- Jacob
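For contrast, the same wrapper with today's single-value send and plain tuple unpacking inside foo (my own runnable rendering of the point above):

    def foo():
        a, b = yield          # today's tuple-unpacking form
        print(a, b)
        yield

    def wrapper(g):
        r = next(g)
        while True:
            args = (yield r)
            r = g.send(args)  # the single value passes through untouched

    f = wrapper(foo())
    next(f)
    f.send((1, 2))            # prints "1 2"; the wrapper never has to know the shape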

On Tue, Sep 13, 2011 at 8:57 AM, Jacob Holm <jh@improva.dk> wrote:
What did you mean here? It is not valid syntax. It says:

    SyntaxError: starred assignment target must be in a list or tuple
What on earth would **kwds in a list/tuple unpack mean? -- --Guido van Rossum (python.org/~guido)

On Tue, 2011-09-13 at 13:02 +0200, Jacob Holm wrote:
Instead of extending tuple unpacking, I think I'd prefer to go the other way and improve function data sharing. What if we could get the function's arguments when a function is done, instead of getting the return value? (The modified function arguments object is the return value in this case.)

    fargs = &foo(a, b, c, d=4)

Then someplace else, use that function argument object directly.

    bar.__with_args__(fargs)   # fargs becomes bar's argument object

It could avoid unpacking and repacking data when it's passed between compatible functions.

Cheers, Ron

On Wed, Sep 14, 2011 at 3:23 AM, Ron Adam <ron3200@gmail.com> wrote:
This use case is covered by the bind() method in PEP 362 [1]:

    foo_signature = inspect.signature(foo)
    fargs = foo_signature.bind(a, b, c, d=4)

[1] http://www.python.org/dev/peps/pep-0362/#signature-object

Cheers, Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
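For reference, a small round trip with the bind() API as it eventually shipped in the inspect module (the foo defined here is just an illustration):

    import inspect

    def foo(a, b, c, d=1):
        return a + b + c + d

    foo_signature = inspect.signature(foo)
    fargs = foo_signature.bind(10, 20, 30, d=4)

    print(foo(*fargs.args, **fargs.kwargs))   # the bound object turned back into a call -> 64
    print(dict(fargs.arguments))              # {'a': 10, 'b': 20, 'c': 30, 'd': 4}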

On Wed, 2011-09-14 at 13:41 +1000, Nick Coghlan wrote:
It doesn't quite do the same thing, and it's not nearly as easy to use.

Regarding PEP 362: the signature function gets the signature object from the target function, ok. The use case in the PEP describes binding and introspecting the signature object for various purposes, which is a nice thing to be able to do. But after that, it just calls the functions with the same *args and **kwds that were used to bind the signature. So it's unpacking args and kwds a second time.

Will there be a way to reuse the bound signature object with a function directly, without unpacking it?

Cheers, Ron

(My replies are still taking more than 6 to 8 hours to show up on the list? I changed my email address recently; the old one no longer works as I chose not to renew the domain name.)

On Wed, Sep 14, 2011 at 3:01 PM, Ron Adam <ron3200@gmail.com> wrote:
And magic syntax is better? This is an incredibly niche use case, so there is zero justification for giving it special syntax when functions, methods and classes will do the job just fine (it's also worth keeping in mind that Guido hasn't even officially given PEP 362 itself the nod at this point).
No such micro-optimisation is planned (it would require a complete redesign of the way arguments are passed to functions). If the performance hit of explicit validation is unacceptable, just don't bother with it and accept that you won't get an error until the function is actually used. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

On Wed, 2011-09-14 at 17:30 +1000, Nick Coghlan wrote:
I don't know about it being a niche use case. How many times is (*args, **kwds) used to pass the signature data forward or back? I think this might be a lateral issue that could complement PEP 362.

Thinking about it, I am able (with a bit of work) to get the final signature data mapping from within a function and return it.

    def xupdate_map(map1, map2):
        """ Only update values in map1 from map2. """
        for k in map1.keys():
            map1[k] = map2[k]
        return map1

    def foo(v, count=2):
        sig_vals = locals()
        # Do stuff that may change v and/or count.
        for n in range(count):
            v += 1
        return xupdate_map(sig_vals, locals())

That's pretty limiting, and I can't call foo again with the result. It's also not very efficient either.

I need to play with this a bit more before I can really explain what I seem to be thinking of. If I get it pinned down with some real world examples of how it would help, then maybe I'll post it here later. The general idea is to be able to use the signature mapping "such as what bind returns" in a more direct way, with more freedom than *args and **kwds allows.

Cheers, Ron

On Thu, Sep 15, 2011 at 9:26 PM, Ron Adam <ron3200@gmail.com> wrote:
I think you're on the wrong path. Structured data ought to be represented as a class instance, or some other suitable data structure, not as an argument list to a function. Otherwise before you know it you will have reinvented namedtuple. I feel pretty strongly that adding syntax so that you can unpack an expression the way function arguments are unpacked is a bad way to evolve Python. -- --Guido van Rossum (python.org/~guido)

participants (9)
- David Townshend
- Georg Brandl
- Guido van Rossum
- H. Krishnan
- Jacob Holm
- Matt Joiner
- Nick Coghlan
- Ron Adam
- Stefan Behnel