Python's iterable unpacking is what Lispers might call a destructuring bind.
py> iterable = 1, 2, 3, 4, 5
py> a, b, *rest = iterable
py> a, b, rest
(1, 2, [3, 4, 5])
Clojure also supports mapping destructuring. Let's add that to Python!
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
py> {"a": x, "b": y} = mapping
Traceback:
ValueError: too many keys to unpack
This will be approximately as helpful as iterable unpacking was before PEP 3132 (https://www.python.org/dev/peps/pep-3132/).
I hope to keep discussion in this thread focused on the most basic form of dict unpacking, but we could extend mapping unpacking similarly to how PEP 3132 extended iterable unpacking. Just brainstorming...
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, **rest} = mapping
py> x, rest
(1, {"b": 2, "c": 3})
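A rough emulation of these semantics in today's Python, with a hypothetical helper named ``unpack`` (the name and signature are illustrative, not part of the proposal):

```python
def unpack(mapping, *keys):
    # Mirror the proposed strictness: every key in the mapping must be
    # consumed, otherwise raise like the proposed ValueError.
    extra = mapping.keys() - set(keys)
    if extra:
        raise ValueError("too many keys to unpack: %s" % sorted(extra))
    # A missing key raises KeyError, just as plain subscripting would.
    return tuple(mapping[k] for k in keys)

x, y, z = unpack({"a": 1, "b": 2, "c": 3}, "a", "b", "c")
print(x, y, z)  # 1 2 3
```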
On 25 May 2016 at 14:11, Michael Selik michael.selik@gmail.com wrote:
Clojure also supports mapping destructuring. Let's add that to Python!
The part of me that likes new shiny things says "ooh, yes!" But the part of me that has to support software wants to know what improvements this would make to real-world code. Personally, I can't think of anywhere I'd have actually used a construct like this.
Paul
On 25.05.2016 15:11, Michael Selik wrote:
Clojure also supports mapping destructuring. Let's add that to Python!
+1 for the general idea.
Let's keep Python simple by providing flexible tools like the one you proposed instead of a monolithic switch-case. :)
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
py> {"a": x, "b": y} = mapping
Traceback:
ValueError: too many keys to unpack
Nice! I like the error message.
I could imagine how this can be generalized to attribute access BUT I think its easier to discuss dict unpacking to the end first.
This will be approximately as helpful as iterable unpacking was before PEP 3132 (https://www.python.org/dev/peps/pep-3132/).
I'd add that all benefits for iterable unpacking apply as well. So, what's true for iterable unpacking is true for dict unpacking, too.
I hope to keep discussion in this thread focused on the most basic form of dict unpacking, but we could extend mapping unpacking similarly to how PEP 3132 extended iterable unpacking. Just brainstorming...
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, **rest} = mapping
py> x, rest
(1, {"b": 2, "c": 3})
That's basically suppressing the ValueError above with the same justification as for PEP 3132.
I for one find this one of the shortest proposals compared to other recent proposals I have seen. :)
Best, Sven
I can see this being useful if I have a Django Rest Framework validate method (http://www.django-rest-framework.org/api-guide/serializers/#object-level-val...) which takes a dictionary argument.
def validate(self, data):
    {'name': name, 'address': address} = data
Currently, this would look like:
def validate(self, data):
    name, address = data['name'], data['address']
It does get more useful with the extension:
def validate(self, data):
    {'name': name, 'address': address, **rest} = data
instead of:
def validate(self, data):
    rest = data.copy()
    name = rest.pop('name')
    address = rest.pop('address')
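Written out as a runnable sketch of that pop-based idiom (the field names come from the example above; everything else is illustrative):

```python
def validate(data):
    # Copy first so the caller's dict is never mutated.
    rest = dict(data)
    name = rest.pop('name')
    address = rest.pop('address')
    return name, address, rest

name, address, rest = validate(
    {'name': 'Ada', 'address': 'London', 'phone': '555-0100'})
# rest now holds only the leftover items; the input dict is untouched.
```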
In the rest framework case, mutating data directly might not be a problem, but this does feel like a nice syntax when avoiding mutation is required.
Regards, Ian
On 25/05/16 14:11, Michael Selik wrote:
Python's iterable unpacking is what Lispers might call a destructuring bind.
py> iterable = 1, 2, 3, 4, 5
py> a, b, *rest = iterable
py> a, b, rest
(1, 2, [3, 4, 5])
Clojure also supports mapping destructuring. Let's add that to Python!
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
py> {"a": x, "b": y} = mapping
Traceback:
ValueError: too many keys to unpack
This will be approximately as helpful as iterable unpacking was before PEP 3132 (https://www.python.org/dev/peps/pep-3132/).
I hope to keep discussion in this thread focused on the most basic form of dict unpacking, but we could extend mapping unpacking similarly to how PEP 3132 extended iterable unpacking. Just brainstorming...
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, **rest} = mapping
py> x, rest
(1, {"b": 2, "c": 3})
Python-ideas mailing list Python-ideas@python.org https://mail.python.org/mailman/listinfo/python-ideas Code of Conduct: http://python.org/psf/codeofconduct/
On 25.05.2016 16:06, Ian Foote wrote:
I can see this being useful if I have a Django Rest Framework validate method (http://www.django-rest-framework.org/api-guide/serializers/#object-level-val...) which takes a dictionary argument.
def validate(self, data):
    {'name': name, 'address': address} = data
Currently, this would look like:
def validate(self, data):
    name, address = data['name'], data['address']
Now, that you mention it (the restframework), I remember a concrete usecase for our systems as well. We receive a YAML resource (via the restframework -- that's the association) and we know it must be a dict and contain several items. Additionally, we need the remaining items to store somewhere else depending on some of the received data.
{'needed1': needed1, 'needed2': needed2, **rest} = yaml_data
store_there(needed1, rest)
store_somewhere(needed2, rest)
Please note, that needed1 and needed2 are not allowed to be part of the data which is supposed to be stored. These are mere flags.
It does get more useful with the extension:
def validate(self, data):
    {'name': name, 'address': address, **rest} = data
@Michael Does using ``**`` instead of ``*`` seem more appropriate?
instead of:
def validate(self, data):
    rest = data.copy()
    name = rest.pop('name')
    address = rest.pop('address')
In the rest framework case, mutating data directly might not be a problem, but this does feel like a nice syntax when avoiding mutation is required.
Regards, Ian
Best, Sven
On Thu, May 26, 2016 at 12:32 PM Sven R. Kunze srkunze@mail.de wrote:
On 25.05.2016 16:06, Ian Foote wrote:
def validate(self, data):
    {'name': name, 'address': address, **rest} = data
@Michael Does using ``**`` instead of ``*`` seem more appropriate?
It does. ``*args`` creates a tuple. ``**kwargs`` creates a dict.
On Wed, May 25, 2016, at 09:11, Michael Selik wrote:
Python's iterable unpacking is what Lispers might call a destructuring bind.
py> iterable = 1, 2, 3, 4, 5
py> a, b, *rest = iterable
py> a, b, rest
(1, 2, [3, 4, 5])
Clojure also supports mapping destructuring. Let's add that to Python!
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
py> {"a": x, "b": y} = mapping
Traceback:
ValueError: too many keys to unpack
How is this better than:
mapping = {"a": 1, "b": 2, "c": 3}
x, y, z = [mapping[k] for k in ("a", "b", "c")]
I'm responding here to Sven, Random832, and Ethan.
On Wed, May 25, 2016 at 10:08 AM Sven R. Kunze srkunze@mail.de wrote:
I for one find this one of the shortest proposals compared to other recent proposals I have seen. :)
If it's easy to explain, it might be a good idea :-)
On Wed, May 25, 2016 at 10:40 AM Random832 random832@fastmail.com wrote:
On Wed, May 25, 2016, at 09:11, Michael Selik wrote:
Clojure also supports mapping destructuring. Let's add that to Python!
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
How is this better than:
py> mapping = {"a": 1, "b": 2, "c": 3}
py> x, y, z = [mapping[k] for k in ("a", "b", "c")]
I think the thread has formed a consensus that there are at least 2 clear use cases for unpacking. Not surprisingly, they're the same use cases for both tuple unpacking and dict unpacking.
In your example, what if the dict has more keys than you are looping over? Look at the other part of my proposal:
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y} = mapping
Traceback:
ValueError: too many keys to unpack
I really like Sven's example.
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": 2} = mapping
Traceback:
ValueError: key 'c' does not match value 2
Even if we don't implement this feature in the first version of dict unpacking, we should keep the option open.
On Wed, May 25, 2016 at 4:14 PM Ethan Furman ethan@stoneleaf.us wrote:
The proposal is this: a, b = **mapping
The advantages:
Why doesn't that work for tuple unpacking?
py> a, b = *iterable
SyntaxError
Whatever the reasons, that syntax wasn't chosen for tuple unpacking. Dict unpacking should mimic tuple unpacking. If I saw ``a, b = **mapping``, I would expect ``a, b = *iterable``.
Unpacking a tuple mirrors a tuple display.
py> (a, b) = (1, 2)
py> (a, b) = (1, 2, 3)
ValueError: too many values to unpack, expected 2
Unpacking a dict should mirror a dict display.
py> {'x': a, 'y': b} = {'x': 1, 'y': 2}
py> {'x': a, 'y': b} = {'x': 1, 'y': 2, 'z': 3}
ValueError: too many keys to unpack, expected {'x', 'y'}
As Brendan and others have mentioned, the more concise syntax you're proposing will not support non-string keys and cannot be enhanced to support Erlang/Clojure/etc-style matching on values.
On Wed, May 25, 2016 at 11:18 AM Paul Moore p.f.moore@gmail.com wrote:
get a set of elements and ignore the rest:
If it's just one or two, that's easy. Use an underscore to indicate you don't care. Again, I'm trying to mirror a dict display. This is the same way that tuple unpacking solves the problem.
py> (a, b, _) = (1, 2, 3)
py> {'x': a, 'y': b, 'z': _} = {'x': 1, 'y': 2, 'z': 3}
If you need to ignore many, we need to extend dict unpacking the same way that tuple unpacking was extended in PEP 3132.
py> (a, b, *rest) = (1, 2, 3, 4)
py> a, b, rest
(1, 2, [3, 4])
py> {'x': a, 'y': b, **rest} = {'x': 1, 'y': 2, 'z': 3, 'w': 4}
py> a, b, rest
(1, 2, {'w': 4, 'z': 3})
Sure, *rest isn't valid in a dict display, but it's the same with tuple unpacking: rest isn't valid in a tuple display.
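The ``**rest`` form can likewise be emulated in today's Python; this sketch uses a hypothetical ``unpack_rest`` helper that pops the named keys and returns the remainder:

```python
def unpack_rest(mapping, *keys):
    rest = dict(mapping)  # shallow copy, so the argument is untouched
    values = tuple(rest.pop(k) for k in keys)
    return values + (rest,)

a, b, rest = unpack_rest({'x': 1, 'y': 2, 'z': 3, 'w': 4}, 'x', 'y')
print(a, b, rest)  # 1 2 {'z': 3, 'w': 4}
```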
On 05/25/2016 03:52 PM, Michael Selik wrote:
I really like Sven's example.
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": 2} = mapping
Traceback:
ValueError: key 'c' does not match value 2
Even if we don't implement this feature in the first version of dict unpacking, we should keep the option open.
Ugh, no. That is not dict unpacking, it's dict matching and unpacking, which is way beyond the simplicity of just unpacking.
On Wed, May 25, 2016 at 4:14 PM Ethan Furman wrote: >
The proposal is this: a, b = **mapping
The advantages:
Why doesn't that work for tuple unpacking? py> a, b = *iterable SyntaxError
Whatever the reasons, that syntax wasn't chosen for tuple unpacking. Dict unpacking should mimic tuple unpacking. If I saw ``a, b = **mapping``, I would expect ``a, b = *iterable``.
Good point. So it should just be:
a, b = mapping
or a, b, **_ = mapping # when you don't care about the rest
Unpacking a tuple mirrors a tuple display.
py> (a, b) = (1, 2)
py> (a, b) = (1, 2, 3)
ValueError: too many values to unpack, expected 2
I think that most Pythonistas would say:
a, b = 1, 2 # look ma! no round brackets!
Unpacking a dict should mirror a dict display.
py> {'x': a, 'y': b} = {'x': 1, 'y': 2}
Absolutely not, at least not for the simple case. A simple tuple/list unpack looks like
a, b, c = an_iterable
while a more complicated one looks like
a, b, c = an_iterable[7], an_iterable[3], an_iterable[10]
So a simple dict unpack should look like
some_dict = dict(a=99, b=44, c=37)
a, b, c = some_dict
and if we don't need all the items
a, b, **_ = some_dict
and if we want to rename the keys on extraction
x, y, z = some_dict['a'], some_dict['b'], some_dict['c']
or, as Random pointed out
x, y, z = [some_dict[k] for k in ('a', 'b', 'c')]
which has a nice symmetry to it.
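For completeness, the standard library already offers a compact spelling of this extraction pattern via ``operator.itemgetter``:

```python
from operator import itemgetter

some_dict = dict(a=99, b=44, c=37)
# itemgetter called with several keys returns a tuple, ready for unpacking.
a, b, c = itemgetter('a', 'b', 'c')(some_dict)
print(a, b, c)  # 99 44 37
```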
py> {'x': a, 'y': b} = {'x': 1, 'y': 2, 'z': 3} ValueError: too many keys to unpack, expected {'x', 'y'}
As Brendan and others have mentioned, the more concise syntax you're proposing will not support non-string keys and cannot be enhanced to support Erlang/Clojure/etc-style matching on values.
And as I have mentioned, matching syntax is out-of-scope for an unpacking proposal. At most, it should be a "let's not paint ourselves into a corner" type of concern -- and I must admit I don't see why
a, b, c = some_dict
rules out
{'x': a, 'y':b} = some_dict
as a pattern-matching construct.
On Wed, May 25, 2016 at 11:18 AM Paul Moore wrote: >
get a set of elements and ignore the rest:
If it's just one or two, that's easy. Use an underscore to indicate you don't care. Again, I'm trying to mirror a dict display. This is the same way that tuple unpacking solves the problem.
py> (a, b, _) = (1, 2, 3)
py> {'x': a, 'y': b, 'z': _} = {'x': 1, 'y': 2, 'z': 3}
Since you've repeated yourself, I will too. ;)
The parenthesis are legal, but usually unnecessary noise, when creating/unpacking a tuple.
-- ~Ethan~
On 2016-05-26 00:41, Ethan Furman wrote:
On 05/25/2016 03:52 PM, Michael Selik wrote:
I really like Sven's example.
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": 2} = mapping
Traceback:
ValueError: key 'c' does not match value 2
Even if we don't implement this feature in the first version of dict unpacking, we should keep the option open.
Ugh, no. That is not dict unpacking, it's dict matching and unpacking, which is way beyond the simplicity of just unpacking.
On Wed, May 25, 2016 at 4:14 PM Ethan Furman wrote: >
The proposal is this: a, b = **mapping
The advantages:
Why doesn't that work for tuple unpacking? py> a, b = *iterable SyntaxError
Whatever the reasons, that syntax wasn't chosen for tuple unpacking. Dict unpacking should mimic tuple unpacking. If I saw ``a, b = **mapping``, I would expect ``a, b = *iterable``.
Good point. So it should just be:
a, b = mapping
or a, b, **_ = mapping # when you don't care about the rest
Unpacking a tuple mirrors a tuple display.
py> (a, b) = (1, 2)
py> (a, b) = (1, 2, 3)
ValueError: too many values to unpack, expected 2
I think that most Pythonistas would say:
a, b = 1, 2 # look ma! no round brackets!
Unpacking a dict should mirror a dict display.
py> {'x': a, 'y': b} = {'x': 1, 'y': 2}
Absolutely not, at least not for the simple case. A simple tuple/list unpack looks like
a, b, c = an_iterable
while a more complicated one looks like
a, b, c = an_iterable[7], an_iterable[3], an_iterable[10]
So a simple dict unpack should look like
some_dict = dict(a=99, b=44, c=37)
a, b, c = some_dict
and if we don't need all the items
a, b, **_ = some_dict
and if we want to rename the keys on extraction
x, y, z = some_dict['a'], some_dict['b'], some_dict['c']
or, as Random pointed out
x, y, z = [some_dict[k] for k in ('a', 'b', 'c')]
which has a nice symmetry to it.
Could we use 'as', which is already used for renaming in imports?
a as x, b as y, c as z = some_dict
or, perhaps:
'a' as x, 'b' as y, 'c' as z = some_dict
which would cater for keys that aren't valid as identifiers.
py> {'x': a, 'y': b} = {'x': 1, 'y': 2, 'z': 3} ValueError: too many keys to unpack, expected {'x', 'y'}
As Brendan and others have mentioned, the more concise syntax you're proposing will not support non-string keys and cannot be enhanced to support Erlang/Clojure/etc-style matching on values.
And as I have mentioned, matching syntax is out-of-scope for an unpacking proposal. At most, it should be a "let's not paint ourselves into a corner" type of concern -- and I must admit I don't see why
a, b, c = some_dict
rules out
{'x': a, 'y':b} = some_dict
as a pattern-matching construct.
On Wed, May 25, 2016 at 11:18 AM Paul Moore wrote: >
get a set of elements and ignore the rest:
If it's just one or two, that's easy. Use an underscore to indicate you don't care. Again, I'm trying to mirror a dict display. This is the same way that tuple unpacking solves the problem.
py> (a, b, _) = (1, 2, 3)
py> {'x': a, 'y': b, 'z': _} = {'x': 1, 'y': 2, 'z': 3}
Since you've repeated yourself, I will too. ;)
The parenthesis are legal, but usually unnecessary noise, when creating/unpacking a tuple.
On 05/25/2016 04:58 PM, MRAB wrote:
On 2016-05-26 00:41, Ethan Furman wrote:
or, as Random pointed out
x, y, z = [some_dict[k] for k in ('a', 'b', 'c')]
which has a nice symmetry to it.
Could we use 'as', which is already used for renaming in imports?
a as x, b as y, c as z = some_dict
I'm okay with that.
or, perhaps:
'a' as x, 'b' as y, 'c' as z = some_dict
which would cater for keys that aren't valid as identifiers.
That probably makes more sense, especially since it's already the rare(r) case of needing/wanting to rename the keys.
-- ~Ethan~
On 26.05.2016 02:14, Ethan Furman wrote: >
or, perhaps:
'a' as x, 'b' as y, 'c' as z = some_dict
which would cater for keys that aren't valid as identifiers.
That probably makes more sense, especially since it's already the rare(r) case of needing/wanting to rename the keys.
What do you think about?
'a': x, 'b': y, 'c': z = some_dict
Best, Sven
On May 26, 2016 7:52 PM, "Sven R. Kunze" srkunze@mail.de wrote: >
On 26.05.2016 02:14, Ethan Furman wrote: > >
or, perhaps:
'a' as x, 'b' as y, 'c' as z = some_dict
which would cater for keys that aren't valid as identifiers.
That probably makes more sense, especially since it's already the rare(r) case of needing/wanting to rename the keys.
What do you think about?
'a': x, 'b': y, 'c': z = some_dict
Maybe keep that for potential non-comment type hints in the future...
-- Koos (mobile)
Best, Sven
On Wed, May 25, 2016 at 7:41 PM Ethan Furman ethan@stoneleaf.us wrote:
a, b = mapping
a, b, **_ = mapping  # when you don't care about the rest
I'd grudgingly accept this as a second-best syntax. It doesn't support keys that aren't strs of valid identifiers, but I suppose ``def foo(**kwargs)`` doesn't either. I frequently have keys that are strs with punctuation, keys that are tuples of ints, etc. Being restricted to identifiers feels cramped.
Also, I wonder if the implementation will be more difficult. In tuple unpacking, it's the LHS that provides the semantic meaning. The RHS is just duck-typed. What's the appropriate error message if you meant to do dict unpacking and didn't have a mapping on the RHS?
py> a, b = 42
TypeError: 'int' object is not iterable
py> {'a': x, 'b': y} = 42
TypeError: 'int' object is not subscriptable
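The tuple half of that comparison can be checked in any current interpreter (the exact message wording varies between Python versions):

```python
# Unpacking a non-iterable RHS fails before any target name is bound.
try:
    a, b = 42
except TypeError as exc:
    print(type(exc).__name__)  # TypeError
```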
I think that most Pythonistas would say:
a, b = 1, 2 # look ma! no round brackets!
You need brackets for nested/recursive destructuring.
py> a, b, (c, d) = 1, 2, (3, 4)
py> {'a': x, 'b': {'c': y, 'd': z}} = {'a': 1, 'b': {'c': 2, 'd': 3}}
matching syntax is out-of-scope for an
unpacking proposal. At most, it should be a "let's not paint ourselves into a corner" type of concern -- and I must admit I don't see why a, b, c = some_dict rules out {'x': a, 'y':b} = some_dict as a pattern-matching construct.
Yep. I just want to look ahead a bit. While you're considering dict unpacking syntax options, keep in mind that tuple matching will need to parallel dict matching. Also, our future selves will want one-way-to-do-it when considering matching syntaxes.
py> a, b, c, 0 = 1, 2, 3, 4
ValueError: index 3 does not match value 0
py> {'x': a, 'y': 0} = {'x': 1, 'y': 2}
ValueError: key 'y' does not match value 0
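Pending any syntax, the matching behaviour sketched above can be prototyped as a helper; ``CAPTURE`` and ``match`` are hypothetical names invented for this sketch, not part of the proposal:

```python
CAPTURE = object()  # sentinel: "bind this key" rather than "check it"

def match(mapping, pattern):
    bound = {}
    for key, expected in pattern.items():
        value = mapping[key]
        if expected is CAPTURE:
            bound[key] = value       # behaves like a target name
        elif value != expected:      # behaves like a literal to match
            raise ValueError(
                "key %r does not match value %r" % (key, expected))
    return bound

print(match({'x': 1, 'y': 0}, {'x': CAPTURE, 'y': 0}))  # {'x': 1}
try:
    match({'x': 1, 'y': 2}, {'y': 0})
except ValueError as exc:
    print(exc)  # key 'y' does not match value 0
```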
On Wed, May 25, 2016 at 11:18 AM Paul Moore wrote:
get a set of elements and ignore the rest:
Since you've repeated yourself, I will too. ;)
I figured repetition was more clear than "see above". :-)
[Including replies to
On 05/25/2016 05:29 PM, Michael Selik wrote:
On Wed, May 25, 2016 at 7:41 PM Ethan Furman wrote:
a, b = mapping
a, b, **_ = mapping # when you don't care about the rest
I'd grudgingly accept this as a second-best syntax. It doesn't support keys that aren't strs of valid identifiers, but I suppose ``def foo(**kwargs)`` doesn't either. I frequently have keys that are strs with punctuation, keys that are tuples of ints, etc. Being restricted to identifiers feels cramped.
Doesn't have to be the only syntax, just the easiest.
Breaks backwards compatibility.
py> a, b = {'a': 1, 'b': 2}
py> a, b
('b', 'a')
Ah. That is a very good point.
I'll meet you halfway:
{a, b} = some_dict
which is currently a SyntaxError, so little backwards compatibility concerns, plus it clearly state the mapping should have two elements, similarly to
[a] = some_iterable
and
(b) = some_iterable
both clearly state that a one-element iterable is being unpacked.
I think that most Pythonistas would say: a, b = 1, 2 # look ma! no round brackets!
You need brackets for nested/recursive destructuring.
py> a, b, (c, d) = 1, 2, (3, 4)
Sure, but that isn't the simple case.
py> {'a': x, 'b': {'c': y, 'd': z}} = {'a': 1, 'b': {'c': 2, 'd': 3}}
With the above syntax:
{a, b {c, d}} = a_mapping
Yep. I just want to look ahead a bit. While you're considering dict unpacking syntax options, keep in mind that tuple matching will need to parallel dict matching. Also, our future selves will want one-way-to-do-it when considering matching syntaxes.
py> a, b, c, 0 = 1, 2, 3, 4
ValueError: index 3 does not match value 0
py> {'x': a, 'y': 0} = {'x': 1, 'y': 2}
ValueError: key 'y' does not match value 0
The adage is actually one /obvious/ way to do it -- which can change depending on the circumstances.
On 05/25/2016 07:16 PM, David Mertz wrote:
I can see how this spelling might be intuitive at first brush. But the more I think about it, the more I recoil against the violation of a relatively uniform semantic principle in Python.
In no other case in Python, does the RHS of an assignment "probe into" the LHS to figure out how to determine its value. Moreover, the idea that variable names are not just bindings, but also pseudo- literals, or maybe something akin to a Lisp 'symbol', feels enormously unpythonic to me.
On 05/25/2016 07:18 PM, Guido van Rossum wrote:
I have to warn here. This looks cool but it does something that AFAIK no other Python syntax uses -- it takes variable names and does something to those variables but also uses their actual names as string literals. I agree that the use cases for this seem pretty sweet, but perhaps we should give it a somewhat different syntax just so it's clear that the names on the LHS matter. The precedent that the targets must be actual names rather than anything you can assign to is also kind of scary.
Very good points. I'm glad we had the discussion, though, since it elicited Random's contribution -- which is what I will use from now on for dict unpacking. :)
-- ~Ethan~
On Thu, May 26, 2016 at 12:43 PM, Ethan Furman ethan@stoneleaf.us wrote:
I'll meet you halfway:
{a, b} = some_dict
which is currently a SyntaxError, so little backwards compatibility concerns, plus it clearly state the mapping should have two elements, similarly to
[a] = some_iterable
and
(b) = some_iterable
both clearly state that a one-element iterable is being unpacked.
Careful - the second one doesn't:
(b) = [1, 2, 3]
Parens don't make a tuple, and that includes with unpacking. You'd be correct if you had a comma in there, though.
I hope there doesn't end up being a confusion between mapping unpacking and set display. Sets are a bit of an odd duck; are they like lists only unordered, or like mappings only without values? I've seen them used both ways, and the syntax is somewhere between the two. Having a syntax that constructs a set if used on the RHS but unpacks a dict if used on the LHS seems to violate syntactic purity, but I'd be happy to let practicality trump that.
ChrisA
On 05/25/2016 07:56 PM, Chris Angelico wrote:
On Thu, May 26, 2016 at 12:43 PM, Ethan Furman wrote:
I'll meet you halfway:
{a, b} = some_dict
which is currently a SyntaxError, so little backwards compatibility concerns, plus it clearly state the mapping should have two elements, similarly to
[a] = some_iterable
and
(b) = some_iterable
both clearly state that a one-element iterable is being unpacked.
Careful - the second one doesn't:
--> (b) = [1, 2, 3]
Ack. Right you are, which is why I always use the list form, even on tuples.
I hope there doesn't end up being a confusion between mapping unpacking and set display. Sets are a bit of an odd duck; are they like lists only unordered, or like mappings only without values? I've seen them used both ways, and the syntax is somewhere between the two. Having a syntax that constructs a set if used on the RHS but unpacks a dict if used on the LHS seems to violate syntactic purity, but I'd be happy to let practicality trump that.
Given the other issues, I'm happy to let this one die. Good discussion.
-- ~Ethan~
Then is it too late to point out that the desired goal (of dumping part of a dict into the local namespace) is really similar to importing?
Something I am most definitely not suggesting:
    from some_dict dict_import a, b, c
Related, a version that works for Enums: from some_Enum attr_import from vars(some_enum) dict_import
Again, I'm not suggesting these syntaxes (and I hope everyone else hates them), just pointing out that it's more similar to imports than destructuring.
On 05/25/2016 11:24 PM, Greg Ewing wrote:
Chris Angelico wrote:
I hope there doesn't end up being a confusion between mapping unpacking and set display. Sets are a bit of an odd duck; are they like lists only unordered, or like mappings only without values?
Let's just hope nobody says they want set unpacking...
We already have that:
--> a, b, c = {7, 99, 22}
--> a, b, c
(99, 22, 7)
Of course, the order gets all messed up... ;)
-- ~Ethan~
On Wed, May 25, 2016 at 7:41 PM Ethan Furman ethan@stoneleaf.us wrote:
it should just be: a, b = mapping
Breaks backwards compatibility.
py> a, b = {'a': 1, 'b': 2}
py> a, b
('b', 'a')
Sorry for the double-post. I should have pondered longer before sending...
On Thu, May 26, 2016 at 1:52 AM, Michael Selik michael.selik@gmail.com wrote: [...]
On Wed, May 25, 2016 at 4:14 PM Ethan Furman ethan@stoneleaf.us wrote: >
The proposal is this: a, b = **mapping
The advantages:
Why doesn't that work for tuple unpacking? py> a, b = *iterable SyntaxError
Whatever the reasons, that syntax wasn't chosen for tuple unpacking. Dict unpacking should mimic tuple unpacking. If I saw ``a, b = **mapping``, I would expect ``a, b = *iterable``.
I think it would make sense to allow
a, b = *iterable
That would be more explicit about unpacking the iterable. Still, in clear cases like ``a, b = 1, 2``, one could omit the asterisk.
After all, this already works:
a, b, c = 1, *(2, 3)
Some more examples:
a, b = 1, 2     # this is clear
a, b = *(1, 2)  # could be legal and equivalent to the above
a = *(1, 2)     # would fail, and should fail!
a = 1, 2        # does not fail
So why not allow being more explicit about unpacking?
-- Koos
On 25.05.2016 15:11, Michael Selik wrote:
I hope to keep discussion in this thread focused on the most basic form of dict unpacking, but we could extend mapping unpacking similarly to how PEP 3132 extended iterable unpacking. Just brainstorming...
Another idea (borrowed from Erlang):
mapping = {"a": 1, "b": 2, "c": 3}
{"a": x, "b": y, "c": 3} = mapping
x, y
(1, 2)
mapping = {"a": 1, "b": 2, "c": 3}
{"a": x, "b": y, "c": 2} = mapping
Traceback:
ValueError: key 'c' does not match value 2
Best, Sven
On 25 May 2016 at 14:11, Michael Selik michael.selik@gmail.com wrote:
Clojure also supports mapping destructuring. Let's add that to Python!
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
py> {"a": x, "b": y} = mapping
Traceback:
ValueError: too many keys to unpack
This will be approximately as helpful as iterable unpacking was before PEP 3132 (https://www.python.org/dev/peps/pep-3132/).
Neither this, nor the *rest extension you proposed, does what I think would be the most common requirement - get a set of elements and *ignore* the rest:
mapping = {"a": 1, "b": 2, "c": 3}
{"a": x, "b": y} = mapping
# Note no error!
x, y
(1, 2)
I'd still like to see some real-world use cases though - there are lots of options as the above example demonstrates, and nothing to guide us in deciding which would be the most useful one to choose.
Paul
On 05/25/2016 08:18 AM, Paul Moore wrote:
Neither this, nor the *rest extension you proposed, does what I think would be the most common requirement - get a set of elements and *ignore* the rest:
--> mapping = {"a": 1, "b": 2, "c": 3}
--> {"a": x, "b": y} = mapping
--> # Note no error!
--> x, y  # z discarded, should be NameError
(1, 2)
I'd still like to see some real-world use cases though - there are lots of options as the above example demonstrates, and nothing to guide us in deciding which would be the most useful one to choose.
Here's a real-world use-case: The main software app I support passes a ``values`` dict around like the plague. When doing interesting stuff I often unpack some of the values into local variables as that reads better, types better, and makes it easier to reason about the code (it's on 2.7, so for that I'll have to use Random's example).
So for me, unpacking a dict with friendly syntax would be useful, and unpacking four or five keys from a 20-element dict will be far more useful than having to unpack them all.
-- ~Ethan~
On 25 May 2016 at 17:08, Ethan Furman ethan@stoneleaf.us wrote:
Here's a real-world use-case: The main software app I support passes a ``values`` dict around like the plague. When doing interesting stuff I often unpack some of the values into local variables as that reads better, types better, and makes it easier to reason about the code (it's on 2.7, so for that I'll have to use Random's example).
So for me, unpacking a dict with friendly syntax would be useful, and unpacking four or five keys from a 20-element dict will be far more useful than having to unpack them all.
Thanks. That's the sort of use case I thought might exist - and it sounds to me as if you'd get much more benefit from a syntax that allowed "partial" unpacking:
{"a": x, "b": y} = dict(a=1, b=2, c=3)
gives x=1, y=2 with no error.
I can't think of a good example where Michael's original proposal that this would give a ValueError would be a better approach. Paul
On 05/25/2016 11:12 AM, Paul Moore wrote:
On 25 May 2016 at 17:08, Ethan Furman wrote:
Here's a real-world use-case: The main software app I support passes a ``values`` dict around like the plague. When doing interesting stuff I often unpack some of the values into local variables as that reads better, types better, and makes it easier to reason about the code (it's on 2.7, so for that I'll have to use Random's example).
So for me, unpacking a dict with friendly syntax would be useful, and unpacking four or five keys from a 20-element dict will be far more useful than having to unpack them all.
Thanks. That's the sort of use case I thought might exist - and it sounds to me as if you'd get much more benefit from a syntax that allowed "partial" unpacking:
{"a": x, "b": y} = dict(a=1, b=2, c=3)
gives x=1, y=2 with no error.
I can't think of a good example where Michael's original proposal that this would give a ValueError would be a better approach.
Agreed. The ValueError approach would make this useless for me.
-- ~Ethan~
On 25.05.2016 20:21, Ethan Furman wrote:
On 05/25/2016 11:12 AM, Paul Moore wrote:
On 25 May 2016 at 17:08, Ethan Furman wrote:
I can't think of a good example where Michael's original proposal that this would give a ValueError would be a better approach.
Agreed. The ValueError approach would make this useless for me.
I think that's why the * syntax might be useful as well.
Best, Sven
On 05/25/2016 11:44 AM, Sven R. Kunze wrote:
On 25.05.2016 20:21, Ethan Furman wrote:
On 05/25/2016 11:12 AM, Paul Moore wrote:
On 25 May 2016 at 17:08, Ethan Furman wrote:
I can't think of a good example where Michael's original proposal that this would give a ValueError would be a better approach.
Agreed. The ValueError approach would make this useless for me.
I think that's why the * syntax might be useful as well.
Maybe. Get the simple case added first; we can add more later if deemed appropriate.
-- ~Ethan~
On 05/25/2016 02:12 PM, Paul Moore wrote:
On 25 May 2016 at 17:08, Ethan Furman ethan@stoneleaf.us wrote:
Here's a real-world use-case: The main software app I support passes a values dict around like the plague. When doing interesting stuff I often unpack some of the values into local variables as that reads better, types better, and makes it easier to reason about the code (it's on 2.7 so for that I'll have to use Random's example).
So for me, unpacking a dict with friendly syntax would be useful, and unpacking four or five keys from a 20-element dict will be far more useful than having to unpack them all.
Thanks. That's the sort of use case I thought might exist - and it sounds to me as if you'd get much more benefit from a syntax that allowed "partial" unpacking:
{"a": x, "b": y} = dict(a=1, b=2, c=3)
gives x=1, y=2 with no error.
I can't think of a good example where Michael's original proposal that this would give a ValueError would be a better approach.
Paul
How is this an improvement over:
def extract(mapping, *keys):
    return [mapping[key] for key in keys]

mapping = {'a': 1, 'b': 2, 'c': 3}

x, y = extract(mapping, 'a', 'b')
print(x, y)
1, 2
Eric.
On 05/25/2016 01:08 PM, Eric V. Smith wrote:
How is this an improvement over:
def extract(mapping, *keys):
    return [mapping[key] for key in keys]

mapping = {'a': 1, 'b': 2, 'c': 3}

x, y = extract(mapping, 'a', 'b')
print(x, y)
1, 2
Let's pretend you wrote:
a, b = extract(mapping, 'a', 'b')
since that's the way I would almost always be using it.
The proposal is this:
a, b = **mapping
The advantages:
Less duplication might not seem like that big a deal, but it's one of the motivators behind decorators and in-place operators, which are both wins.
-- ~Ethan~
On 5/25/2016 4:14 PM, Ethan Furman wrote:
On 05/25/2016 01:08 PM, Eric V. Smith wrote:
x, y = extract(mapping, 'a', 'b')
print(x, y)
1, 2
Let's pretend you wrote:
a, b = extract(mapping, 'a', 'b')
since that's the way I would almost always be using it.
The proposal is this:
a, b = **mapping
The advantages:
Less duplication might not seem like that big a deal, but it's one of the motivators behind decorators and in-place operators, which are both wins.
I agree that would be a win. That's a lot of compiler magic, though.
Eric.
I actually don't think it will be very hard.
On Wednesday, May 25, 2016 at 5:02:52 PM UTC-4, Eric V. Smith wrote:
On 5/25/2016 4:14 PM, Ethan Furman wrote:
On 05/25/2016 01:08 PM, Eric V. Smith wrote:
x, y = extract(mapping, 'a', 'b')
print(x, y)
1, 2
Let's pretend you wrote:
a, b = extract(mapping, 'a', 'b')
since that's the way I would almost always be using it.
The proposal is this:
a, b = **mapping
The advantages:
Less duplication might not seem like that big a deal, but it's one of the motivators behind decorators and in-place operators, which are both wins.
I agree that would be a win. That's a lot of compiler magic, though.
Eric.
I'm currently working on a data export software using thousands of OrderedDicts. Sometimes I iterate, and sometimes I get part of it.
Currently I have a wrapper doing:
x, y , z = d(data).unpack('x', 'y', 'z', default=None)
So I can see a use for it. And it's not the first time I wish I could do this.
Although I think for a shortcut to lookup automatically the vars with the same name as the dict keys by default would be nice:
{x, y, z} = data
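The `d(...)` wrapper used above is not shown in the thread; here is a minimal sketch of what such a wrapper might look like (the class name, method name, and behaviour are reconstructed from the one-line example, so treat them as assumptions):

```python
# Hypothetical reconstruction of the `d` wrapper described above;
# the real implementation is not shown in the thread.
class d:
    def __init__(self, mapping):
        self._mapping = mapping

    def unpack(self, *keys, default=None):
        # One value per requested key, falling back to `default`
        # for keys missing from the mapping.
        return tuple(self._mapping.get(key, default) for key in keys)

data = {"x": 1, "y": 2}
x, y, z = d(data).unpack("x", "y", "z", default=None)
print(x, y, z)  # 1 2 None
```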
On 25/05/2016 17:18, Paul Moore wrote:
On 25 May 2016 at 14:11, Michael Selik michael.selik@gmail.com wrote:
Clojure also supports mapping destructuring. Let's add that to Python!
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
py> {"a": x, "b": y} = mapping
Traceback:
ValueError: too many keys to unpack
This will be approximately as helpful as iterable unpacking was before PEP 3132 (https://www.python.org/dev/peps/pep-3132/).
Neither this, nor the *rest extension you proposed, does what I think would be the most common requirement - get a set of elements and *ignore* the rest:
mapping = {"a": 1, "b": 2, "c": 3}
{"a": x, "b": y} = mapping
# Note no error!
x, y
(1, 2)
I'd still like to see some real-world use cases though - there are lots of options as the above example demonstrates, and nothing to guide us in deciding which would be the most useful one to choose.
Paul
Python-ideas mailing list Python-ideas@python.org https://mail.python.org/mailman/listinfo/python-ideas Code of Conduct: http://python.org/psf/codeofconduct/
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote:
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
I think that is too verbose and visually baffling. I'd rather see something less general and (in my opinion) more useful:
a, b, c = **mapping
being equivalent to:
a = mapping['a']
b = mapping['b']
c = mapping['c']
It's less general, because you can only use the same names as the keys in the dict. But most of the time you'll probably want to do that anyway. Think of (for example) carrying around a "settings" or "preferences" dict:
prefs = {'width': 80, 'height': 200, 'verbose': False, 'mode': PLAIN, 'name': 'Fnord', 'flags': spam|eggs|cheese, ... }
# plus many more keys:values
There's no need to unpack the entire dict, you can grab only the keys you need:
width, height = **prefs
# like
width = prefs['width']
height = prefs['height']
Sure, you are forced to use the same variable names as the keys, but that's what you will probably do most of the time. You're not likely to write:
foo = prefs['width']
bar = prefs['height']
although you might write:
zip_code = prefs['zip code']
but probably shouldn't. (Just use 'zip_code' as the key.) Another awkward case is when a key is a keyword:
except_ = prefs['except']
but I expect those cases will be relatively rare, and you can always manually unpack them the old fashioned way.
Admittedly your syntax would allow those cases, at the cost of a more verbose statement:
{'width': foo, 'height': bar, 'except': except_, 'zip code': zip_code} = mapping
but I think that's mostly an over-generalisation and too hard to grasp what is going on.
Naturally the order of the keys doesn't matter:
mode, height, width = **prefs
height, mode, width = **prefs
etc are all the same.
If you twist my arm and force me to come up with syntax for a "change of variable name", I'd consider:
height, width, zip_code:'zip code', except_:'except' = **mapping
In other words, if the target on the left is a plain name, the unpacking does:
name = mapping['name']
If the target on the left has a colon, it is an identifier followed by a key. The identifier can be any valid reference, including dots and [] subscripts. The key must be a string:
identifier:'key'
which performs:
identifier = mapping['key']
Examples of valid colon targets:
spam:'ham' # like spam = mapping['ham']
spam[1].eggs:'while' # like spam[1].eggs = mapping['while']
etc.
If there's too many targets to comfortably fit on the one line, wrap them in parentheses to allow line wrapping:
(height, flags, except_:'except', mymodule.obj.attribute[2].gamma:'gamma', alpha) = **prefs
But the simple case, the case you'll use most of the time, is simple:
height, width, verbose, flags = **prefs
-- Steve
On 05/25/2016 11:42 AM, Steven D'Aprano wrote:
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote:
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
I think that is too verbose and visually baffling. I'd rather see something less general and (in my opinion) more useful:
a, b, c = **mapping
being equivalent to:
a = mapping['a']
b = mapping['b']
c = mapping['c']
+1
Simplest, easiest to grok, probably solves 95+% of the use-cases.
-- ~Ethan~
On Thu, May 26, 2016 at 4:47 AM, Ethan Furman ethan@stoneleaf.us wrote:
On 05/25/2016 11:42 AM, Steven D'Aprano wrote:
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote:
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
I think that is too verbose and visually baffling. I'd rather see something less general and (in my opinion) more useful:
a, b, c = **mapping
being equivalent to:
a = mapping['a']
b = mapping['b']
c = mapping['c']
+1
Simplest, easiest to grok, probably solves 95+% of the use-cases.
Agreed, and also +1 on the proposal. It leaves room for future syntactic expansion in the same way as PEP 448.
ChrisA
On 25 May 2016 at 19:47, Ethan Furman ethan@stoneleaf.us wrote:
On 05/25/2016 11:42 AM, Steven D'Aprano wrote:
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote:
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
I think that is too verbose and visually baffling. I'd rather see something less general and (in my opinion) more useful:
a, b, c = **mapping
being equivalent to:
a = mapping['a']
b = mapping['b']
c = mapping['c']
+1
Simplest, easiest to grok, probably solves 95+% of the use-cases.
OTOH, you could also do
x = SimpleNamespace(**mapping)
and use x.a, x.b, x.c.
Paul
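For completeness, the SimpleNamespace spelling Paul mentions works today and needs no new syntax:

```python
from types import SimpleNamespace

mapping = {"a": 1, "b": 2, "c": 3}

# Wrap the dict once, then use plain attribute access.
x = SimpleNamespace(**mapping)
print(x.a, x.b, x.c)  # 1 2 3
```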
On 05/25/2016 11:52 AM, Paul Moore wrote:
On 25 May 2016 at 19:47, Ethan Furman wrote:
On 05/25/2016 11:42 AM, Steven D'Aprano wrote:
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote:
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
I think that is too verbose and visually baffling. I'd rather see something less general and (in my opinion) more useful:
a, b, c = **mapping
being equivalent to:
a = mapping['a']
b = mapping['b']
c = mapping['c']
+1
Simplest, easiest to grok, probably solves 95+% of the use-cases.
OTOH, you could also do
x = SimpleNamespace(**mapping)
and use x.a, x.b, x.c.
Beside visual clarity, the other big reason for using local variables instead of constant dict access is speed -- which you lose by using another object.
-- ~Ethan~
I am -1 on the whole idea. It is either asking for the identifier being assigned to have semantic meaning in the language, something we do not have anywhere else (yes, we have special names, but the language does not actually care about what object you assign to the special name, and the name itself does not change the behavior of the assignment), OR it is totally redundant with something we can already do:
a, b, c, *r = mapping.values()
-----Original Message-----
From: Python-ideas On Behalf Of Ethan Furman
Sent: Wednesday, May 25, 2016 2:47 PM
To: python-ideas@python.org
Subject: Re: [Python-ideas] Unpacking a dict
On 05/25/2016 11:42 AM, Steven D'Aprano wrote:
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote:
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
I think that is too verbose and visually baffling. I'd rather see something less general and (in my opinion) more useful:
a, b, c = **mapping
being equivalent to:
a = mapping['a']
b = mapping['b']
c = mapping['c']
+1
Simplest, easiest to grok, probably solves 95+% of the use-cases.
-- ~Ethan~
On Thu, May 26, 2016 at 12:39 AM tritium-list@sdamon.com wrote:
I am -1 on the whole idea. What is either asking for the identifier being assigned to having semantic meaning in the language, something we do not have anywhere else.
No? Tuple unpacking has semantic meaning for the left-hand side. Or did I misunderstand you?
a, b, c, *r = mapping.values()
Unless it's an OrderedDict and I know the order, I wouldn't want to do that.
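A short sketch of the pitfall Michael points at: since dicts preserve insertion order (guaranteed from CPython 3.7), unpacking .values() binds by position in the literal, not by name, so two logically equal mappings can yield different bindings:

```python
# Same logical mapping, different insertion order.
m1 = {"a": 1, "b": 2, "c": 3}
m2 = {"b": 2, "a": 1, "c": 3}

a1, b1, c1 = m1.values()
a2, b2, c2 = m2.values()

print(a1, b1)  # 1 2
print(a2, b2)  # 2 1 -- "a" and "b" silently swapped
```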
On 2016-05-25 11:42, Steven D'Aprano wrote:
If the target on the left has a colon, it is an identifier followed by key. The identifier can be any valid reference, including dots and [] subscripts. The key must be a string:
identifier:'key'
which performs:
identifier = mapping['key']
Why does the key have to be a string? I agree that the common case is
where you want to assign to a local variable with the same name as the string key, but if you do allow specifying how keys map to assignment targets, it seems like you might as well allow non-string keys.
-- Brendan Barnwell "Do not follow where the path may lead. Go, instead, where there is no path, and leave a trail." --author unknown
On 25.05.2016 20:42, Steven D'Aprano wrote:
There's no need to unpack the entire dict, you can grab only the keys you need:
width, height = **prefs
# like
width = prefs['width'] height = prefs['height']
That's not the same behavior as it is for tuple unpacking. As a new user, I would expect them to work the same way.
If I want to dismiss the remainder of the dict, I'd rather consider an explicit approach:
width, height = prefs  # just those values
width, height, *r = prefs  # r captures the rest
This gives me two benefits:
1) I can further work with r from which width and height are extracted
2) a convenient way of verifying that a dict only contains certain values and extracting those at the same time
Sure, you are forced to use the same variable names as the keys, but that's what you will probably do most of the time. You're not likely to write:
foo = prefs['width']
bar = prefs['height']
although you might write:
zip_code = prefs['zip code']
but probably shouldn't. (Just use 'zip_code' as the key.)
"just use 'zip_code' as the key" won't do it when you have no influence on the data. I might remind you that most engineers are not gods who can change everything at their whim.
[...] If you twist my arm and force me to come up with syntax for a "change of variable name", I'd consider:
Rest assured I will twist your arm very hard. ;)
Also rest assured that we use non-string keys on a regular basis.
Maybe, it's just me but I still tend to think that using unquoted strings should be reserved for attribute unpacking.
[...]
If the target on the left has a colon, it is an identifier followed by key. The identifier can be any valid reference, including dots and [] subscripts. The key must be a string:
identifier:'key'
I would rather turn it around. It feels weird to have the "key" on the right side.
One additional drawback of this solution is the fact that the "value" part of the colon is already taken. So, value matching like done in Erlang is not possible. OTOH, this applies to tuple unpacking in its current form as well.
Best, Sven
On 05/25/2016 12:38 PM, Sven R. Kunze wrote:
On 25.05.2016 20:42, Steven D'Aprano wrote:
There's no need to unpack the entire dict, you can grab only the keys you need:
width, height = **prefs
# like
width = prefs['width']
height = prefs['height']
That's not the same behavior as it is for tuple unpacking. As a new user, I would expect them to work the same way.
If I want to dismiss the remainder of the dict, I'd rather consider an explicit approach:
width, height = prefs  # just those values
width, height, *r = prefs  # r captures the rest
This gives me two benefits:
1) I can further work with r from which width and height are extracted
2) a convenient way of verifying that a dict only contains certain values and extracting those at the same time
Okay, those are good benefits. +1
Maybe, it's just me but I still tend to think that using unquoted strings should be reserved for attribute unpacking.
lists, tuples, and dicts are basic containers -- having syntax to unpack them easily is a clear win; attribute unpacking not so much:
--> some_long_object_name = MyClass(blah, blah)
--> s = some_long_object_name
--> s.name
some value
--> s.value
another value
which is sufficiently readable.
One additional drawback of this solution is the fact that the "value" part of the colon is already taken. So, value matching like done in Erlang is not possible. OTOH, this applies to tuple unpacking in its current form as well
This is about unpacking, not matching.
-- ~Ethan~
On 25.05.2016 22:11, Ethan Furman wrote:
On 05/25/2016 12:38 PM, Sven R. Kunze wrote:
Maybe, it's just me but I still tend to think that using unquoted strings should be reserved for attribute unpacking.
lists, tuples, and dicts are basic containers -- having syntax to unpack them easily is a clear win; attribute unpacking not so much:
Wait a second. Reading your example below, do you imply that dicts actually should provide something like attribute access to their keys?
d = {'a': 1, 'b': 2}
d.b
2
Wouldn't this solution be equally readable/useful? I am just asking as it seems that you find accessing a dict to be straining. So, unpacking it would be one solution, but providing attribute-like access to it would be another solution to that problem.
--> some_long_object_name = MyClass(blah, blah)
--> s = some_long_object_name
--> s.name
some value
--> s.value
another value
which is sufficiently readable.
I see your point.
But let me explain how I approached the problem by considering it from the "new users" perspective. Really fast, he will build up the following associations:
attributes <-> .abc
dict keys <-> ['abc'] or [other stuff]
list/tuple keys <-> [123]
So, from his perspective everything is clearly separated: lists/tuples use integers, dicts use mostly strings with quotes and objects have attributes which can be used like normal variables without quotes.
Mixing things up (by removing/mixing the visual indicators), makes me feel nervous about the consistent perception of Python. Maybe, it's just FUD on my side but I can't help this feeling.
Best, Sven
On 05/25/2016 04:03 PM, Sven R. Kunze wrote:
On 25.05.2016 22:11, Ethan Furman wrote:
On 05/25/2016 12:38 PM, Sven R. Kunze wrote:
Maybe, it's just me but I still tend to think that using unquoted strings should be reserved for attribute unpacking.
lists, tuples, and dicts are basic containers -- having syntax to unpack them easily is a clear win; attribute unpacking not so much:
Wait a second. Reading your example below, do you imply that dicts actually should provide something like attribute access to their keys?
d = {'a': 1, 'b': 2}
d.b
2
No. I'm saying attribute access is already extremely easy, so we don't need to try and make it easier.
But let me explain how I approached the problem by considering it from the the "new users" perspective. Really fast, he will build up the following associations:
attributes <-> .abc
dict keys <-> ['abc'] or [other stuff]
list/tuple keys <-> [123]
So, from his perspective everything is clearly separated: lists/tuples use integers, dicts use mostly strings with quotes and objects have attributes which can be used like normal variables without quotes.
And this newbie would be wrong, as [other stuff] for a mapping can easily be integers just like lists/tuples.
Mixing things up (by removing/mixing the visual indicators), makes me feel nervous about the consistent perception of Python. Maybe, it's just FUD on my side but I can't help this feeling.
We already have that situation:
some_var = [1, 2]
a = some_var[0]
b = some_var[1]
is exactly the same as
a, b = some_var
and the visual indicators (the integers, the item access) are nowhere to be seen.
-- ~Ethan~
On 26.05.2016 01:25, Ethan Furman wrote:
We already have that situation:
some_var = [1, 2]
a = some_var[0]
b = some_var[1]
is exactly the same as
a, b = some_var
and the visual indicators (the integers, the item access) are nowhere to be seen.
Good point. However, don't you think this case is different?
A human (at least the ones I know) can easily relate position and index number. So, 1 is first, 2 is second and so on. That's pretty straightforward. So, the item access is quite naturally mapped.
With keys of a dictionary it is less clear. I prefer the explicit key variant over some implicitly working one. Maybe there will come up a better argument for or against this. We will see.
Best, Sven
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote:
Clojure also supports mapping destructuring. Let's add that to Python!
py> mapping = {"a": 1, "b": 2, "c": 3}
py> {"a": x, "b": y, "c": z} = mapping
py> x, y, z
(1, 2, 3)
py> {"a": x, "b": y} = mapping
Traceback:
ValueError: too many keys to unpack
This will be approximately as helpful as iterable unpacking was before PEP 3132 (https://www.python.org/dev/peps/pep-3132/).
What is your evidence for this claim? So far I've only seen one real- world use-case for this, and that single use-case would be well served by a simpler syntax:
a, b, c = **mapping
which just requires that a, b, c etc are legal names (not general identifiers). The dict is then unpacked:
a = mapping['a']
etc. Two questions:
(1) For the use-case we've already seen, a "preferences" or "settings" mapping, do you think users will be prepared to use your syntax?
(2) Do you have any other concrete use cases for this, and if so, what are they?
For (1), I've taken a prefs dict from a rather small command line script I've written. This is taken from actual code in use. To unpack the dict using your syntax, I would have to write:
{'sort': sort, 'reverse': reverse, 'classify': classify,
 'showlinks': showlinks, 'showsize': showsize, 'spacer': spacer,
 'style': style, 'width': width} = prefs
# next line is optional, but I might not wish to pollute the namespace
del sort, reverse, showlinks, showsize, style
in order to unpack the three keys I actually want. Here's my suggestion:
classify, spacer, width = **prefs
I don't know about you, but there's no way I'd use your suggested syntax as-shown. I'd rather unpack manually:
classify, spacer, width = [prefs[key] for key in 'classify spacer width'.split()]
or variations of same.
For (2), here's another use-case I can think of. Unpacking **kwargs in functions/methods.
def spam(self, a, b, **kwargs): ...
I'd like to unpack a small number of keys:values from kwargs, extract them from the dict, and pass that on to another method:
fnord = kwargs.pop('fnord', 'default')
wibble = kwargs.pop('wibble', 42)
super().spam(a, b, **kwargs)
I don't have a concise syntax for this use-case, and yours won't work either. There's no way to supply defaults, nor can you list all the keys because you don't know what they will be. (The caller can provide arbitrary keyword arguments.)
So I don't think this use-case can be handled by either your syntax or mine for dict unpacking.
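For reference, the pop-with-default pattern sketched above in runnable form; a free function stands in for the method and super() call so the example is self-contained (the names fnord/wibble are from the example above):

```python
# Sketch of the kwargs-extraction pattern described above; `base_spam`
# is a stand-in for the superclass method being forwarded to.
def base_spam(a, b, **kwargs):
    return ("base", a, b, kwargs)

def spam(a, b, **kwargs):
    # Extract (and remove) the keys we care about, with defaults,
    # then forward the remaining keyword arguments unchanged.
    fnord = kwargs.pop("fnord", "default")
    wibble = kwargs.pop("wibble", 42)
    return fnord, wibble, base_spam(a, b, **kwargs)

print(spam(1, 2, fnord="x", other=3))  # ('x', 42, ('base', 1, 2, {'other': 3}))
```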
I think your syntax is too verbose and repetitive for the simple case. It does have the advantage that it can deal with keys which aren't identifiers:
{'while': while_, 'foo bar': foobar} = mapping
but it only looks good in toy examples. In real code, I wouldn't use it, it would be too painful and repetitive.
If we add syntax to collect all the unused items, your syntax will be a bit less painful, but still repetitive:
{'classify': classify, 'spacer': spacer, 'width': width, **whocares} = prefs
but that has no advantage over what we already have:
classify, spacer, width = [prefs[key] for key in ('classify', 'spacer', 'width')]
(The two are almost the same length, and equally repetitive.)
As far as changing names, we can already do that, and use arbitrary references:
myobj.attr['key'][1], while_, foobar = [ mapping[key] for key in ('something', 'while', 'foo bar')]
So I think that the problems your syntax solve are already easy to solve, and the things which are annoying to solve now, your syntax is too painful to use.
I'd rather have syntax which is less general but more useful in practice, than something which solves dict unpacking in its full generality but a pain to use.
-- Steve
On Wed, May 25, 2016 at 6:56 PM, Steven D'Aprano steve@pearwood.info wrote:
What is your evidence for this claim? So far I've only seen one real- world use-case for this, and that single use-case would be well served by a simpler syntax:
a, b, c = **mapping
which just requires that a, b, c etc are legal names (not general identifiers). The dict is then unpacked:
a = mapping['a']
I can see how this spelling might be intuitive at first brush. But the more I think about it, the more I recoil against the violation of a relatively uniform semantic principle in Python.
In no other case in Python, does the RHS of an assignment "probe into" the LHS to figure out how to determine its value. Moreover, the idea that variable names are not just bindings, but also pseudo-literals, or maybe something akin to a Lisp 'symbol', feels enormously unpythonic to me.
Moreover, given that comprehensions are already available, and can express every variation we might want simply, I see no point of having this mild syntax sugar. This includes binding in the usual style to arbitrary names, but also all the expected mechanisms of derived values and conditionals. You can write:
a, b, c = (mapping[x] for x in ['a','b','c'])
But equally you can write a natural extension like:
x, y, z = (2*mapping[x] for x in get_keys() if x.isupper())
Special casing the very simplest thing to save a minimal number of characters does not seem worthwhile.
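The comprehension spelling referred to above, in runnable form:

```python
mapping = {"a": 1, "b": 2, "c": 3}

# Binds by explicit key name, independent of insertion order, and
# raises KeyError for a missing key instead of silently misbinding.
a, b, c = (mapping[k] for k in ["a", "b", "c"])
print(a, b, c)  # 1 2 3
```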
-- Keeping medicines from the bloodstreams of the sick; food from the bellies of the hungry; books from the hands of the uneducated; technology from the underdeveloped; and putting advocates of freedom in prisons. Intellectual property is to the 21st century what the slave trade was to the 16th.
On Wed, May 25, 2016 at 6:56 PM, Steven D'Aprano steve@pearwood.info wrote:
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote: What is your evidence for this claim? So far I've only seen one real- world use-case for this, and that single use-case would be well served by a simpler syntax:
a, b, c = **mapping
I have to warn here. This looks cool but it does something that AFAIK no other Python syntax uses -- it takes variable names and does something to those variables but also uses their actual names as string literals. I agree that the use cases for this seem pretty sweet, but perhaps we should give it a somewhat different syntax just so it's clear that the names on the LHS matter. The precedent that the targets must be actual names rather than anything you can assign to is also kind of scary.
-- --Guido van Rossum (python.org/~guido)
Guido van Rossum wrote:
it does something that AFAIK no other Python syntax uses -- it takes variable names and does something to those variables but also uses their actual names as string literals.
The names in def and class statements also end up in the __name__ attributes of the created objects -- does that count?
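That precedent is easy to check:

```python
# def and class statements already capture the bound name as a string
# in the created object's __name__ attribute.
def frobnicate():
    pass

class Widget:
    pass

print(frobnicate.__name__, Widget.__name__)  # frobnicate Widget
```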
-- Greg
On Thu, May 26, 2016 at 5:18 AM, Guido van Rossum guido@python.org wrote:
On Wed, May 25, 2016 at 6:56 PM, Steven D'Aprano steve@pearwood.info wrote:
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote: What is your evidence for this claim? So far I've only seen one real- world use-case for this, and that single use-case would be well served by a simpler syntax:
a, b, c = **mapping
I have to warn here. This looks cool but it does something that AFAIK no other Python syntax uses -- it takes variable names and does something to those variables but also uses their actual names as string literals. I agree that the use cases for this seem pretty sweet, but perhaps we should give it a somewhat different syntax just so it's clear that the names on the LHS matter. The precedent that the targets must be actual names rather than anything you can assign to is also kind of scary.
I understand the concern, and maybe you are right. However, this:
def func(y, z, x):
    print(x, y, z)
func(**dict(x=1, y=2, z=3))
prints "1 2 3", and so does func(x=1, y=2, z=3)
and
func(z=3, x=1, y=2)
So
a, b, c = **mapping
would be perfectly in line with this. Of course there may still be confusion, but that would mean the user would probably already be confused about whether dicts are ordered or not, so that confusion would need to be fixed anyway. I think the key is that ** should _never_ be interpreted as unpack/repack by order. Or in other words, it always means unpack/repack _by name_.
That said, here's a couple of suggestions:
**(a, b, c) = **mapping
**{a, b, c} = **mapping
Although
a, b, c = **mapping
would still be more convenient.
-- Koos
PS. For even more explicitness:
a from 'a', b from 'b', c from 'c' = **mapping
Which would allow even
b from 1, a from 0, x from 2 = **iterable
Or
a, b, c from 'a', 'b', 'c' in mapping
b, a, c from 1, 0, 2 in mapping
So
a, b, c = **mapping
would be perfectly in line with this.

Your func example is a great connection to have made, but I would not reach the same conclusion. When calling func(**some_dict) we are not performing variable assignment, but with a, b, c = **mapping we would be. Whereas I accept that func(**some_dict) imposes some constraints on the nature of the keys in the dict because functions must have proper variable names as input parameters, I find it difficult to accept a similar constraint on general dict unpacking.
Davin
On Thu, May 26, 2016 at 7:27 AM, Koos Zevenhoven k7hoven@gmail.com wrote:
On Thu, May 26, 2016 at 5:18 AM, Guido van Rossum guido@python.org wrote:
On Wed, May 25, 2016 at 6:56 PM, Steven D'Aprano steve@pearwood.info wrote:
On Wed, May 25, 2016 at 01:11:35PM +0000, Michael Selik wrote: What is your evidence for this claim? So far I've only seen one real- world use-case for this, and that single use-case would be well served by a simpler syntax:
a, b, c = **mapping
I have to warn here. This looks cool but it does something that AFAIK no other Python syntax uses -- it takes variable names and does something to those variables but also uses their actual names as string literals. I agree that the use cases for this seem pretty sweet, but perhaps we should give it a somewhat different syntax just so it's clear that the names on the LHS matter. The precedent that the targets must be actual names rather than anything you can assign to is also kind of scary.
I understand the concern, and maybe you are right. However , this:
def func(y, z, x) print(x, y, z)
func(**dict(x=1, y=2, z=3))
prints "1 2 3" , a nd so doe s func(x=1, y=2, z=3)
and
func(z=3, x=1, y=2)
So a, b, c = **mapping
would be perfectly in line with this. Of course
there may still be confusion, but that would mean the user would probably
already be confused about whether dicts are ordered or not, so that
confusion would need to be fixed anyway. I think the key is that ** should
_never_ be interpreted as unpack/repack by order. Or in other words, it
always means unpack/repack _by name_.
That said, here's a couple of suggestions:
**(a, b, c) = **mapping
**{a, b, c} = **mapping
Although a, b, c = **mapping
would still be more convenient.
-- Koos
PS. For even more explicitness:
a from 'a', b from 'b', c from 'c' = **mapping
Which would allow even
b from 1, a from 0, x from 2 = **iterable
Or
a, b, c from 'a', 'b', 'c' in mapping
b, a, c from 1, 0, 2 in mapping
-- --Guido van Rossum (python.org/~guido)
Python-ideas mailing list Python-ideas@python.org https://mail.python.org/mailman/listinfo/python-ideas Code of Conduct: http://python.org/psf/codeofconduct/
On 26.05.2016 14:27, Koos Zevenhoven wrote:
That said, here's a couple of suggestions:
**(a, b, c) = **mapping
**{a, b, c} = **mapping
Although a, b, c = **mapping
would still be more convenient.
-- Koos
Do you see what you just did there?
{a, b, c} = mapping
That seems like a mathematical equation where the ** appear to be superfluous (besides I don't really like special characters ;-) ). So it yields:
{a, b, c} = mapping
However and additionally, here the LHS (and the LHS in your suggestion) reminds me of a set. That's not good, I guess. If one now adds the keys back in, we go back to
{'a': s1, 'b': s2, 'c': s3} = mapping
So far, all proposals which deviate from Michael's one are just "optimizations in terms of characters". The only one I would find not necessarily too restrictive were:
'a': s1, 'b': s2, 'c': s3 = mapping # no braces :)
That looks quite good to me. What do you think?
Best, Sven
On Thu, May 26, 2016 at 06:50:12PM +0200, Sven R. Kunze wrote:
So far, all proposals which deviate from Michael's one are just "optimizations in terms of characters". The only one I would find not necessarily too restrictive were:
'a': s1, 'b': s2, 'c': s3 = mapping # no braces :)
That looks quite good to me. What do you think?
I think that if you submitted code to me with keys 'a', 'b', 'c' and variables s1, s2, s3, I'd probably reject it and tell you to use descriptive, meaningful keys and names.
I wish people would stop giving toy examples as examples of how nice the syntax looks, and instead try to use it with descriptive names taken from real code. I believe that, by far the majority of the time, you will be repeating the same names twice, and likely exceeding most reasonable line lengths:
'referer': referer, 'useragent': useragent, 'use_proxy': use_proxy, 'follow_links': follow_links, 'clobber': clobber, 'timeout': timeout = mapping
Still think it looks quite good? If you do, that's your right, of course, it's a matter of personal taste. But using toy examples with one or two letter variable names is not a fair or realistic test of what it will be like to use this syntax in real code.
-- Steve
On 05/26/2016 10:40 AM, Steven D'Aprano wrote:
On Thu, May 26, 2016 at 06:50:12PM +0200, Sven R. Kunze wrote:
So far, all proposals which deviate from Michael's one are just "optimizations in terms of characters". The only one I would find not necessarily too restrictive were:
'a': s1, 'b': s2, 'c': s3 = mapping # no braces :)
That looks quite good to me. What do you think?
I think that if you submitted code to me with keys 'a', 'b', 'c' and variables s1, s2, s3, I'd probably reject it and tell you to use descriptive, meaningful keys and names.
I wish people would stop giving toy examples as examples of how nice the syntax looks, and instead try to use it with descriptive names taken from real code. I believe that, by far the majority of the time, you will be repeating the same names twice, and likely exceeding most reasonable line lengths:
'referer': referer, 'useragent': useragent, 'use_proxy': use_proxy, 'follow_links': follow_links, 'clobber': clobber, 'timeout': timeout = mapping
Still think it looks quite good? If you do, that's your right, of course, it's a matter of personal taste. But using toy examples with one or two letter variable names is not a fair or realistic test of what it will be like to use this syntax in real code.
With the simple syntax that I could live with, a real example could be:
{active_id, active_ids, active_model} = context
or
{partner_id, product_id, ship_to, product_ids} = values
which is more readable than
partner_id, product_id, ship_to, product_ids = (values[k] for k in ['partner_id', 'product_id', 'ship_to', 'product_ids'])
Wow. That's a lot of room for typos and wrong order.
-- ~Ethan~
On 26 May 2016 at 18:55, Ethan Furman ethan@stoneleaf.us wrote:
With the simple syntax that I could live with, a real example could be:
{active_id, active_ids, active_model} = context
or
{partner_id, product_id, ship_to, product_ids} = values
The behaviour of using the names of the variables from the LHS to introspect the value on the RHS is, to me, extremely magical and unlike anything I've seen in any other language. I don't think it sits well in Python, even though it is certainly a very readable idiom for the sort of unpacking we're talking about here.
One other disadvantage of this syntax is that it would break badly if someone refactored the code and renamed one of the variables. Of course, the semantics of this construct means that renaming the variables changes the meaning - but once again, that's not something I can recall ever having seen in any language.
Having said all that...
which is more readable than
partner_id, product_id, ship_to, product_ids = (values[k] for k in ['partner_id', 'product_id', 'ship_to', 'product_ids'])
Wow. That's a lot of room for typos and wrong order.
I agree - this is pretty horrible. Although marginally better than the proposed {'partner_id': partner_id, ...} form with explicit naming of the keys.
Personally, though, I don't see that much wrong with
partner_id = values['partner_id']
product_id = values['product_id']
ship_to = values['ship_to']
product_ids = values['product_ids']
It's a bit repetitive, and maybe a little verbose, but nothing a good editor or IDE (or anything better than gmail's web interface :-)) wouldn't make straightforward to manage.
Paul
On 2016-05-26 20:25, Paul Moore wrote: [snip]
Personally, though, I don't see that much wrong with
partner_id = values['partner_id']
product_id = values['product_id']
ship_to = values['ship_to']
product_ids = values['product_ids']
It's a bit repetitive, and maybe a little verbose, but nothing a good editor or IDE (or anything better than gmail's web interface :-)) wouldn't make straightforward to manage.
Could we use semicolons in the subscript to create a tuple? They could be used for packing or unpacking:
partner_id, product_id, ship_to, product_ids = values['partner_id';
'product_id'; 'ship_to'; 'product_ids']
my_dict['partner_id'; 'product_id'; 'ship_to'; 'product_ids'] =
partner_id, product_id, ship_to, product_ids
Or would they be too easily confused with commas?
On 26 May 2016 at 20:48, MRAB python@mrabarnett.plus.com wrote:
On 2016-05-26 20:25, Paul Moore wrote: [snip]
Personally, though, I don't see that much wrong with
partner_id = values['partner_id']
product_id = values['product_id']
ship_to = values['ship_to']
product_ids = values['product_ids']
It's a bit repetitive, and maybe a little verbose, but nothing a good editor or IDE (or anything better than gmail's web interface :-)) wouldn't make straightforward to manage.
Could we use semicolons in the subscript to create a tuple? They could be used for packing or unpacking:
partner_id, product_id, ship_to, product_ids = values['partner_id';
'product_id'; 'ship_to'; 'product_ids']
my_dict['partner_id'; 'product_id'; 'ship_to'; 'product_ids'] =
partner_id, product_id, ship_to, product_ids
Or would they be too easily confused with commas?
I'd imagine it would be confusing. And personally, I still find that syntax less readable than the sequence of assignments.
Full disclosure - I've written that sort of sequence of assignments quite a few times, and it's annoyed me every time I have. So I sympathise with the desire for "something better". But now that we're using real-world names, I'm finding that none of the proposed options are actually qualifying as "better" - just "different"...
Paul
On Thu, May 26, 2016 at 8:48 PM, MRAB python@mrabarnett.plus.com wrote:
On 2016-05-26 20:25, Paul Moore wrote: [snip]
Personally, though, I don't see that much wrong with
partner_id = values['partner_id']
product_id = values['product_id']
ship_to = values['ship_to']
product_ids = values['product_ids']
It's a bit repetitive, and maybe a little verbose, but nothing a good editor or IDE (or anything better than gmail's web interface :-)) wouldn't make straightforward to manage.
Could we use semicolons in the subscript to create a tuple? They could be used for packing or unpacking:
partner_id, product_id, ship_to, product_ids = values['partner_id';
'product_id'; 'ship_to'; 'product_ids']
my_dict['partner_id'; 'product_id'; 'ship_to'; 'product_ids'] =
partner_id, product_id, ship_to, product_ids
Instead of special syntax, what if dict.values() returned a tuple when given keys as arguments:
partner_id, product_id, ship_to, product_ids = my_dict.values('partner_id', 'product_id', 'ship_to', 'product_ids')
That avoids repeating the dict variable, at least. And as there is dict.update(), I don't see the need for a new syntax for assigning to multiple keys.
Nathan
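Nathan's idea can be tried today as a dict subclass; a minimal sketch (the class name `MultiDict` is hypothetical, and this is for illustration only, not a proposal to change the builtin):

```python
# Sketch of dict.values() accepting keys, as a subclass.
class MultiDict(dict):
    def values(self, *keys):
        if not keys:                       # no args: keep normal dict behaviour
            return super().values()
        return tuple(self[k] for k in keys)  # KeyError if a key is missing

values = MultiDict(partner_id=7, product_id=3, ship_to='NYC', product_ids=[1, 2])
partner_id, product_id, ship_to, product_ids = values.values(
    'partner_id', 'product_id', 'ship_to', 'product_ids')
```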
I think this is important enough to get a change in subject line, lest it be lost in the dict unpacking thread.
On Thu, May 26, 2016 at 11:28:25PM +0100, Nathan Schneider wrote:
Instead of special syntax, what if dict.values() returned a tuple when given keys as arguments:
partner_id, product_id, ship_to, product_ids = my_dict.values( 'partner_id', 'product_id', 'ship_to', 'product_ids')
That avoids repeating the dict variable, at least. And as there is dict.update(), I don't see the need for a new syntax for assigning to multiple keys.
I like this idea. I think it beats the status quo:
partner_id = my_dict['partner_id']
product_id = my_dict['product_id']
# etc.
and the various "dict unpacking" syntax suggested, e.g.:
{'partner_id': partner_id, 'product_id': product_id, **catch_all} = my_dict
and it's less magical than variants that extract the names from the left hand side:
partner_id, product_id, ship_to, product_ids = **my_dict
It naturally and trivially supports the case where assignment targets aren't names, and where keys are not identifiers:
obj.partner_id, products[the_id] = my_dict.values('partner id', 'is')
It's still a bit repetitive in the simple case where the keys are the same as the variable names, but without compiler magic, what else are you going to do?
+1
-- Steve
On Thursday, May 26, 2016, Steven D'Aprano steve@pearwood.info wrote:
I think this is important enough to get a change in subject line, lest it be lost in the dict unpacking thread.
On Thu, May 26, 2016 at 11:28:25PM +0100, Nathan Schneider wrote:
Instead of special syntax, what if dict.values() returned a tuple when given keys as arguments:
partner_id, product_id, ship_to, product_ids = my_dict.values( 'partner_id', 'product_id', 'ship_to', 'product_ids')
That avoids repeating the dict variable, at least. And as there is dict.update(), I don't see the need for a new syntax for assigning to multiple keys.
I like this idea. I think it beats the status quo:
partner_id = my_dict['partner_id']
product_id = my_dict['product_id']
# etc.
and the various "dict unpacking" syntax suggested, e.g.:
{'partner_id': partner_id, 'product_id': product_id, **catch_all} = my_dict
and it's less magical than variants that extract the names from the left hand side:
partner_id, product_id, ship_to, product_ids = **my_dict
It naturally and trivially supports the case where assignment targets aren't names, and where keys are not identifiers:
obj.partner_id, products[the_id] = my_dict.values('partner id', 'is')
It's still a bit repetitive in the simple case where the keys are the same as the variable names, but without compiler magic, what else are you going to do?
+1
Interesting. It should probably have a different name. What type should it return? Iterator? Sequence? It can't really be a ValuesView because that class is just a view on the hash table. Even though you technically could combine this functionality into values(), I don't think it would be helpful to do so -- if only because of the surprising edge case where, if you were to pass it a list of keys to extract using *args and the list is empty, values() would default to its original behavior of returning all values, in hash table order.
--Guido
On 05/26/2016 09:26 PM, Guido van Rossum wrote:
On Thursday, May 26, 2016, Steven D'Apranowrote:
On Thu, May 26, 2016 at 11:28:25PM +0100, Nathan Schneider wrote:
I think this is important enough to get a change in subject line, lest it be lost in the dict unpacking thread.
Instead of special syntax, what if dict.values() returned a tuple when given keys as arguments:
partner_id, product_id, ship_to, product_ids = my_dict.values( 'partner_id', 'product_id', 'ship_to', 'product_ids')
That avoids repeating the dict variable, at least. And as there is dict.update(), I don't see the need for a new syntax for assigning to multiple keys.
I like this idea. I think it beats the status quo:
+1
Interesting. It should probably have a different name. What type should it return? Iterator? Sequence? It can't really be a ValuesView because that class is just a view on the hash table. Even though you technically could combine this functionality into values(), I don't think it would be helpful to do so -- if only because of the surprising edge case where, if you were to pass it a list of keys to extract using *args and the list is empty, values() would default to its original behavior of returning all values, in hash table order.
Good point. The time bomb would be even worse if sometimes the dict had the same number of elements as were being asked for, as then it would be an intermittent problem.
However, if we make a new method we could just as easily make a new function:
def get_values_from(a_dict, keys=()):
    if not keys:
        raise ValueError('no keys given')
    for k in keys:
        yield a_dict[k]
Hmmm. That could even be handy for a list/tuple:
offset, name = get_values_from(a_list, [1, 7])
;)
At any rate, the return type should be an iterator.
-- ~Ethan~
On Fri, May 27, 2016 at 7:57 AM, Ethan Furman ethan@stoneleaf.us wrote:
On 05/26/2016 09:26 PM, Guido van Rossum wrote:
On Thursday, May 26, 2016, Steven D'Apranowrote:
On Thu, May 26, 2016 at 11:28:25PM +0100, Nathan Schneider wrote:
I think this is important enough to get a change in subject line, lest
it be lost in the dict unpacking thread.
Instead of special syntax, what if dict.values() returned a tuple
when given keys as arguments:
partner_id, product_id, ship_to, product_ids = my_dict.values( 'partner_id', 'product_id', 'ship_to', 'product_ids')
That avoids repeating the dict variable, at least. And as there is dict.update(), I don't see the need for a new syntax for assigning to multiple keys.
I like this idea. I think it beats the status quo:
+1
Interesting. It should probably have a different name. What type should it return? Iterator? Sequence? It can't really be a ValuesView because that class is just a view on the hash table. Even though you technically could combine this functionality into values(), I don't think it would be helpful to do so -- if only because of the surprising edge case where, if you were to pass it a list of keys to extract using *args and the list is empty, values() would default to its original behavior of returning all values, in hash table order.
Good point. The time bomb would be even worse if sometimes the dict had the same number of elements as were being asked for, as then it would be an intermittent problem.
However, if we make a new method we could just as easily make a new function:
def get_values_from(a_dict, keys=()):
    if not keys:
        raise ValueError('no keys given')
    for k in keys:
        yield a_dict[k]
Hmmm. That could even be handy for a list/tuple:
offset, name = get_values_from(a_list, [1, 7])
;)
getitems(obj, subscripts) ?
We almost have this:
from operator import itemgetter
itemgetter(1,7)(a_list)
-- Koos
At any rate, the return type should be an iterator.
-- ~Ethan~
Neat!
My bikeshed is coloured dict.getmany(*keys, default=None)
, and while
returning a namedtuple might be cool, an iterator is probably the way to go.
But +1 regardless of colour.
Top-posted from my Windows Phone
-----Original Message----- From: "Koos Zevenhoven" k7hoven@gmail.com Sent: 5/26/2016 22:27 To: "Ethan Furman" ethan@stoneleaf.us Cc: "python-ideas" python-ideas@python.org Subject: Re: [Python-ideas] Enhancing dict.values
On Fri, May 27, 2016 at 06:43:35AM -0700, Steve Dower wrote:
Neat!
My bikeshed is coloured dict.getmany(*keys, default=None)
,
"getmany" doesn't tell you, many of what?
and while returning a namedtuple might be cool, an iterator is probably the way to go.
The disadvantage of a namedtuple is that every invocation would create a new class, which is then used once and once only for a singleton instance. Could get very expensive.
I don't think an iterator would be needed. The motivating use-case is for sequence unpacking:
# apologies for breaking my own rule about realistic names
fee, fi, fo, fum = mydict.getmany('fee', 'fi', 'fo', 'fum')
so I don't think the lazy aspect of an iterator is useful. It's going to be consumed eagerly, and immediately. I think a regular tuple is better. That also matches the behaviour of itemgetter:
py> d = {'fe': 1, 'fi': 2, 'fo': 3, 'fum': 4}
py> from operator import itemgetter
py> f = itemgetter('fe', 'fi', 'fo', 'fum')
py> f(d)
(1, 2, 3, 4)
So the core functionality already exists, it's just hidden away in the operator module. (Guido's time machine strikes again.)
Open questions:
Is it worth making this a dict method? +1 from me.
Name? "getvalues"?
Any other functionality?
Possibly a keyword-only "default" argument:
mydict.getvalues(*keys, default=None)
On more shaky ground, how about a "pop" argument?
mydict.getvalues(*keys, pop=True)
will delete the keys as well as return the values. Use-case: methods (particularly __init__ or __new__) which take extra keyword args which need to be popped before calling super.
def __init__(self, **kwargs):
colour, height = kwargs.getvalues('colour', 'height', pop=True)
super().__init__(**kwargs)
self.process(colour, height)
-- Steve
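Steven's proposed getvalues() behaviour, including the keyword-only `default` and the more speculative `pop` argument, can be sketched as a plain function so it can be tried without changing dict itself (the name `getvalues` and both keyword arguments come from the proposal above; the sentinel technique is my own assumption):

```python
# Sketch of the proposed getvalues(), as a free function.
_MISSING = object()  # sentinel: distinguishes "no default given" from default=None

def getvalues(mapping, *keys, default=_MISSING, pop=False):
    if default is _MISSING:
        values = tuple(mapping[k] for k in keys)          # KeyError on missing key
    else:
        values = tuple(mapping.get(k, default) for k in keys)
    if pop:
        for k in keys:                                     # delete keys as well
            mapping.pop(k, None)
    return values

# The __init__ use-case from the proposal:
kwargs = {'colour': 'red', 'height': 10, 'width': 3}
colour, height = getvalues(kwargs, 'colour', 'height', pop=True)
# kwargs now only holds {'width': 3}, ready for super().__init__(**kwargs)
```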
On Fri, May 27, 2016 at 9:17 AM, Steven D'Aprano steve@pearwood.info wrote:
# apologies for breaking my own rule about realistic names
fee, fi, fo, fum = mydict.getmany('fee', 'fi', 'fo', 'fum')
so how again is that much better than:
Isn't this the status quo? a, b, c = [mapping[k] for k in ('a', 'b', 'c')]
the objection to that was that it gets ugly when you've got longer, more realistic names (and maybe more of them.)
fee, fi, fo, fum = [mapping[k] for k in ('fee', 'fi', 'fo', 'fum')]
only a little more typing, and in both cases, you need to specify the names twice.
(of course, they don't need to be the same names...)
-CHB
--
Christopher Barker, Ph.D. Oceanographer
Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception
Chris.Barker@noaa.gov
On Fri, May 27, 2016 at 9:36 AM, Chris Barker chris.barker@noaa.gov wrote:
On Fri, May 27, 2016 at 9:17 AM, Steven D'Aprano steve@pearwood.info wrote:
# apologies for breaking my own rule about realistic names
fee, fi, fo, fum = mydict.getmany('fee', 'fi', 'fo', 'fum')
so how again is that much better than:
Isn't this the status quo? a, b, c = [mapping[k] for k in ('a', 'b', 'c')]
the objection to that was that it gets ugly when you've got longer, more realistic names (and maybe more of them.)
fee, fi, fo, fum = [mapping[k] for k in ('fee', 'fi', 'fo', 'fum')]
only a little more typing, and in both cases, you need to specify the names twice.
(of course, they don't need to be the same names...)
Honestly I don't like the comprehension version much; it reeks of cleverness. The advantage of the getmany() spelling is that you can Google for it more easily. (But I don't know how it would handle nested unpacking, which some have said is an important use case to warrant adding complexity to the language.)
If this was a Dropbox code review I'd say rewrite it as
fee = mapping['fee']
fi = mapping['fi']
etc. -- then nobody will have any trouble understanding what it does.
-- --Guido van Rossum (python.org/~guido)
On 27 May 2016 at 17:17, Steven D'Aprano steve@pearwood.info wrote:
Open questions:
Quite possibly, but maybe a 3rd-party function implementing this would be a worthwhile test of how useful it is in practice (although conceded, a big part of its usefulness would be that it doesn't need a dependency).
Nested unpacking:
location = {'name': 'The north pole', 'position': {'x': 0, 'y': 0}}
location_name, x, y = location.getvalues('name', 'position.x', 'position.y')
Paul.
On Fri, May 27, 2016 at 05:42:44PM +0100, Paul Moore wrote:
Nested unpacking:
location = {'name': 'The north pole', 'position': {'x': 0, 'y': 0}}
location_name, x, y = location.getvalues('name', 'position.x', 'position.y')
How do you distinguish between a key 'position.x' and a key 'position' with a dict with a key 'x'?
Nested unpacking seems too Javascripty for my liking.
-- Steve
On 27 May 2016 at 18:42, Steven D'Aprano steve@pearwood.info wrote:
On Fri, May 27, 2016 at 05:42:44PM +0100, Paul Moore wrote:
Nested unpacking:
location = {'name': 'The north pole', 'position': {'x': 0, 'y': 0}}
location_name, x, y = location.getvalues('name', 'position.x', 'position.y')
How do you distinguish between a key 'position.x' and a key 'position' with a dict with a key 'x'?
Nested unpacking seems too Javascripty for my liking.
Good point.
The problem is that without handling nested unpacking (and in particular, the error checking needed - what if location['position'] isn't a dict? I'd like a uniform error I can trap or report on) I don't see any practical advantage over
location_name = location['name'] x = location['position']['x'] y = location['position']['y']
So let's turn the question round. What advantages does the getvalues() proposal have over the above sequence of assignments? I'm genuinely no longer sure.
Paul
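One way a getvalues() could sidestep Steven's key-ambiguity objection is to spell nested lookups as tuple paths rather than dotted strings, so a literal key 'position.x' stays distinct from nesting, and a non-mapping intermediate level raises a uniform error. A sketch of my own devising (not part of the proposal above):

```python
# Sketch: nested lookup with tuple paths instead of dotted strings.
def getvalues(mapping, *paths):
    def follow(path):
        if not isinstance(path, tuple):
            path = (path,)          # a bare key is a path of length one
        obj = mapping
        for key in path:
            try:
                obj = obj[key]
            except TypeError:       # intermediate level isn't subscriptable
                raise KeyError(path)  # uniform, trappable error
        return obj
    return tuple(follow(p) for p in paths)

location = {'name': 'The north pole', 'position': {'x': 0, 'y': 0}}
location_name, x, y = getvalues(
    location, 'name', ('position', 'x'), ('position', 'y'))
```

A dict that happens to contain a literal 'position.x' key is then unambiguous: `getvalues(d, 'position.x')` looks up exactly that key.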
""getmany" doesn't tell you, many of what?"
Eh, neither does "get", but like I said, I like the idea regardless of colour.
Top-posted from my Windows Phone
-----Original Message----- From: "Steven D'Aprano" steve@pearwood.info Sent: 5/27/2016 9:24 To: "python-ideas@python.org" python-ideas@python.org Subject: Re: [Python-ideas] Enhancing dict.values
Also it probably plays on the dbapi convention fetchmany().
--Guido (mobile)
On Sat, May 28, 2016 at 2:41 AM, Greg Ewing greg.ewing@canterbury.ac.nz wrote:
Steven D'Aprano wrote:
"getmany" doesn't tell you, many of what?
I think the idea is that it would get the same kind of things that get() gets, i.e. items by their keys.
My slight hesitation about "many" is that it's a subjective quantity. (Are 2 or 3 keys enough to count as "many"?)
Another option would be 'geteach'—i.e., for each key provided, get a value. Or 'getmult' (multiple), but that could be mistaken as multiplication.
Nathan
Worrying about how many "many" in .getmany() is seems silly. As Guido notes, it follows the pattern of .fetchmany() in the DBAPI. That "many" might be one, or even zero, which is fine.
On May 28, 2016 3:23 AM, "Nathan Schneider" neatnate@gmail.com wrote:
On Sat, May 28, 2016 at 2:41 AM, Greg Ewing greg.ewing@canterbury.ac.nz wrote:
Steven D'Aprano wrote:
"getmany" doesn't tell you, many of what?
I think the idea is that it would get the same kind of things that get() gets, i.e. items by their keys.
My slight hesitation about "many" is that it's a subjective quantity. (Are 2 or 3 keys enough to count as "many"?)
Another option would be 'geteach'—i.e., for each key provided, get a value. Or 'getmult' (multiple), but that could be mistaken as multiplication.
Nathan
So, apart from whatever idea you find more suitable, as I was playing around with a module for dictionary utils just this week, I made a proof of concept thingy that uses context managers and import from -
Whoever want to try it is welcome, of course:
(env)[gwidion@localhost tmp30]$ pip install extradict ... Successfully installed extradict-0.1.9
(env)[gwidion@localhost tmp30]$ python ...
>>> from extradict import MapGetter
>>> a = dict(b="test", c="another test")
>>> with MapGetter(a) as a:
...     from a import b, c
...
>>> print(b, c)
test another test
For the time being it works as a naive implementation monkey patching "__import__" - When (and if) I get a proper thing using the importlib machinery, I will upgrade "extradict" to 0.2
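For readers curious how such a MapGetter could work, here is a rough sketch of the __import__-patching approach described above. The details are my guesses at one possible implementation, not extradict's actual code:

```python
# Sketch: a context manager that makes "from <name> import key1, key2"
# pull values out of a mapping by temporarily patching builtins.__import__.
# NOT how the real extradict package necessarily does it.
import builtins
import types

class MapGetter:
    def __init__(self, mapping, name):
        self.mapping = mapping
        self.name = name            # the module name the from-import will use

    def __enter__(self):
        self._orig = builtins.__import__
        def fake_import(name, *args, **kwargs):
            if name == self.name:
                # Build a throwaway module whose attributes are the dict items.
                mod = types.ModuleType(name)
                mod.__dict__.update(self.mapping)
                return mod
            return self._orig(name, *args, **kwargs)
        builtins.__import__ = fake_import
        return self.mapping

    def __exit__(self, *exc):
        builtins.__import__ = self._orig  # always restore the real import

d = dict(b="test", c="another test")
with MapGetter(d, name='d'):
    from d import b, c
```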
On 28 May 2016 at 15:03, David Mertz mertz@gnosis.cx wrote:
Worrying about how many .getmany() is seems silly. As Guido notes, it follows the pattern of .fetchmany() in the DBAPI. That "many" might be one, or even zero, which is fine.
On May 28, 2016 3:23 AM, "Nathan Schneider" neatnate@gmail.com wrote:
On Sat, May 28, 2016 at 2:41 AM, Greg Ewing greg.ewing@canterbury.ac.nz wrote:
Steven D'Aprano wrote:
"getmany" doesn't tell you, many of what?
I think the idea is that it would get the same kind of things that get() gets, i.e. items by their keys.
My slight hesitation about "many" is that it's a subjective quantity. (Are 2 or 3 keys enough to count as "many"?)
Another option would be 'geteach'—i.e., for each key provided, get a value. Or 'getmult' (multiple), but that could be mistaken as multiplication.
Nathan
On Thu, May 26, 2016 at 11:39 PM Steven D'Aprano steve@pearwood.info wrote:
On Thu, May 26, 2016 at 11:28:25PM +0100, Nathan Schneider wrote:
Instead of special syntax, what if dict.values() returned a tuple when given keys as arguments:
partner_id, product_id, ship_to, product_ids = my_dict.values( 'partner_id', 'product_id', 'ship_to', 'product_ids')
I like this idea. I think it beats the status quo:
Isn't this the status quo? a, b, c = [mapping[k] for k in ('a', 'b', 'c')]
That was already someone's argument that a special dict unpacking syntax is unnecessary.
On 05/27/2016 08:02 AM, Michael Selik wrote:
On Thu, May 26, 2016 at 11:39 PM Steven D'Aprano wrote:
On Thu, May 26, 2016 at 11:28:25PM +0100, Nathan Schneider wrote:
Instead of special syntax, what if dict.values() returned a tuple when given keys as arguments:
partner_id, product_id, ship_to, product_ids = my_dict.values( 'partner_id', 'product_id', 'ship_to', 'product_ids')
I like this idea. I think it beats the status quo:
Isn't this the status quo? a, b, c = [mapping[k] for k in ('a', 'b', 'c')]
Yes.
That was already someone's argument that a special dict unpacking syntax is unnecessary.
It looks really cool at first blush, but when substituting real names in for the place-holders a, b, and c it gets ugly fast.
-- ~Ethan~
On 27.05.2016 18:12, Ethan Furman wrote:
On 05/27/2016 08:02 AM, Michael Selik wrote:
On Thu, May 26, 2016 at 11:39 PM Steven D'Aprano wrote:
On Thu, May 26, 2016 at 11:28:25PM +0100, Nathan Schneider wrote:
Instead of special syntax, what if dict.values() returned a tuple when given keys as arguments:
partner_id, product_id, ship_to, product_ids = my_dict.values( 'partner_id', 'product_id', 'ship_to', 'product_ids')
I like this idea. I think it beats the status quo:
Isn't this the status quo? a, b, c = [mapping[k] for k in ('a', 'b', 'c')]
Yes.
Uhm, what about...
from operator import itemgetter
a, b, c = itemgetter('a', 'b', 'c')(mapping)
(itemgetter is not the best name, but it does get the job done)
That was already someone's argument that a special dict unpacking syntax is unnecessary.
It looks really cool at first blush, but when substituting real names in for the place-holders a, b, and c it gets ugly fast.
-- Marc-Andre Lemburg eGenix.com
Professional Python Services directly from the Experts (#1, May 27 2016)
Python Projects, Coaching and Consulting ... http://www.egenix.com/ Python Database Interfaces ... http://products.egenix.com/ Plone/Zope Database Interfaces ... http://zope.egenix.com/
::: We implement business ideas - efficiently in both time and costs :::
eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ http://www.malemburg.com/
On 05/27/2016 09:52 AM, M.-A. Lemburg wrote:
On 27.05.2016 18:12, Ethan Furman wrote:
On 05/27/2016 08:02 AM, Michael Selik wrote:
On Thu, May 26, 2016 at 11:39 PM Steven D'Aprano wrote:
On Thu, May 26, 2016 at 11:28:25PM +0100, Nathan Schneider wrote:
Instead of special syntax, what if dict.values() returned a tuple when given keys as arguments:
partner_id, product_id, ship_to, product_ids = my_dict.values(
'partner_id', 'product_id', 'ship_to', 'product_ids')
I like this idea. I think it beats the status quo:
Isn't this the status quo? a, b, c = [mapping[k] for k in ('a', 'b', 'c')]
Yes.
Uhm, what about...
from operator import itemgetter
a, b, c = itemgetter('a', 'b', 'c')(mapping)
I may have misspoken. Probably the most commonly used status quo would be:
partner_id = values['partner_id']
state = values['state']
due_date = values['due_date']
...
So far, the only syntax I've seen that is both readable and has even a remote shot at acceptance would be:
{partner_id, state, due_date, **rest} = values
The consensus is that that is too magical (which I can't argue with ;).
itemgetter, comprehensions, name changes, etc., are all (extremely) verbose.
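For comparison, the least verbose of those spellings today is probably a tiny helper around itemgetter. A sketch: `pluck` is a hypothetical name, not part of the stdlib.

```python
from operator import itemgetter


def pluck(mapping, *keys):
    """Return the values for *keys* as a tuple (always a tuple,
    even for a single key, unlike bare itemgetter)."""
    getter = itemgetter(*keys)
    result = getter(mapping)
    return result if len(keys) > 1 else (result,)


values = {'partner_id': 7, 'state': 'open', 'due_date': '2016-06-01'}
partner_id, state, due_date = pluck(values, 'partner_id', 'state', 'due_date')
print(partner_id, state, due_date)  # 7 open 2016-06-01
```

This still repeats each name twice, which is exactly the DRY complaint driving the thread.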
-- ~Ethan~
On 05/26/2016 12:25 PM, Paul Moore wrote:
On 26 May 2016 at 18:55, Ethan Furman wrote:
With the simple syntax that I could live with, a real example could be:
{active_id, active_ids, active_model} = context
or
{partner_id, product_id, ship_to, product_ids} = values
The behaviour of using the names of the variables from the LHS to introspect the value on the RHS is, to me, extremely magical and unlike anything I've seen in any other language. I don't think it sits well in Python, even though it is certainly a very readable idiom for the sort of unpacking we're talking about here.
One other disadvantage of this syntax is that it would break badly if someone refactored the code and renamed one of the variables. Of course, the semantics of this construct means that renaming the variables changes the meaning - but once again, that's not something I can recall ever having seen in any language.
Having said all that...
which is more readable than
partner_id, product_id, ship_to, product_ids = (
    values[k] for k in ['partner_id', 'product_id', 'ship_to', 'product_ids'])
Wow. That's a lot of room for typos and wrong order.
I agree - this is pretty horrible. Although marginally better than the proposed {'partner_id': partner_id, ...} form with explicit naming of the keys.
Personally, though, I don't see that much wrong with
partner_id = values['partner_id']
product_id = values['product_id']
ship_to = values['ship_to']
product_ids = values['product_ids']
And this is what I currently do. But a fellow can dream, right? :)
-- ~Ethan~
On May 26 2016, Paul Moore p.f.moore-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org wrote:
On 26 May 2016 at 18:55, Ethan Furman ethan-gcWI5d7PMXnvaiG9KC9N7Q@public.gmane.org wrote:
With the simple syntax that I could live with, a real example could be:
{active_id, active_ids, active_model} = context
or
{partner_id, product_id, ship_to, product_ids} = values
The behaviour of using the names of the variables from the LHS to introspect the value on the RHS is, to me, extremely magical and unlike anything I've seen in any other language. I don't think it sits well in Python, even though it is certainly a very readable idiom for the sort of unpacking we're talking about here.
Very true. But as someone else already said (I can't find the email right now), we have a different construct that everyone is familiar with and that's easily adapted for this situation:
from dict context import active_id, active_ids, active_model
or more general:
"from dict" <expr> "import" <identifier list>
Everyone knows that "from .. import .." modifies the local namespace. We just have to extend it to work not just on modules, but also on dictionaries.
Best, -Nikolaus
-- GPG encrypted emails preferred. Key id: 0xD113FCAC3C4E599F Fingerprint: ED31 791B 2C5C 1613 AF38 8B8A D113 FCAC 3C4E 599F
»Time flies like an arrow, fruit flies like a Banana.«
On Fri, May 27, 2016 at 10:09 AM, Nikolaus Rath Nikolaus@rath.org wrote:
Everyone knows that "from .. import .." modifies the local namespace. We just have to extend it to work not just on modules, but also on dictionaries.
Here's a crazy thought that might be best dismissed out of hand: what about extending 'from name import other_names' to accept any object for <name>? First try to get values via __getitem__() (possibly only for dict/dict subclasses?), next try getattr(), finally try to import the module and pull values from it as per usual.
Pros:
Cons:
The two pros are nice, but I'm not sure they beat the four cons.
FTR, I've not seen anything else in this thread that excites me, but I have only been skimming.
-- Zach
On Fri, May 27, 2016 at 11:28 AM Zachary Ware zachary.ware+pyideas@gmail.com wrote:
Here's a crazy thought that might be best dismissed out of hand: what about extending 'from name import other_names' to accept any object for <name>? First try to get values via __getitem__() (possibly only for dict/dict subclasses?), next try getattr(), finally try to import the module and pull values from it as per usual.
Pros:
Would it solve nested dict unpacking?
On Fri, May 27, 2016 at 10:32 AM, Michael Selik michael.selik@gmail.com wrote:
On Fri, May 27, 2016 at 11:28 AM Zachary Ware zachary.ware+pyideas@gmail.com wrote:
Here's a crazy thought that might be best dismissed out of hand: what about extending 'from name import other_names' to accept any object for <name>? First try to get values via __getitem__() (possibly only for dict/dict subclasses?), next try getattr(), finally try to import the module and pull values from it as per usual.
Pros:
Would it solve nested dict unpacking?
How do you mean? Replacing
some_name = some_dict['some_key']['some_name']
with
from some_dict['some_key'] import some_name
?
Sure, why not? :)
-- Zach
On Fri, May 27, 2016 at 11:38 AM Zachary Ware zachary.ware+pyideas@gmail.com wrote:
On Fri, May 27, 2016 at 10:32 AM, Michael Selik michael.selik@gmail.com wrote:
On Fri, May 27, 2016 at 11:28 AM Zachary Ware zachary.ware+pyideas@gmail.com wrote:
Here's a crazy thought that might be best dismissed out of hand: what about extending 'from name import other_names' to accept any object for <name>? First try to get values via __getitem__() (possibly only for dict/dict subclasses?), next try getattr(), finally try to import the module and pull values from it as per usual.
Pros:
Would it solve nested dict unpacking?
How do you mean? Replacing
some_name = some_dict['some_key']['some_name']
with
from some_dict['some_key'] import some_name
?
No, more like
py> {'name': name, 'location': {0: row, 1: col} = mapping
py> name, row, col
('Mike', 3, 5)
On Fri, May 27, 2016 at 10:41 AM, Michael Selik michael.selik@gmail.com wrote:
On Fri, May 27, 2016 at 11:38 AM Zachary Ware
zachary.ware+pyideas@gmail.com wrote:
On Fri, May 27, 2016 at 10:32 AM, Michael Selik michael.selik@gmail.com wrote:
Would it solve nested dict unpacking?
How do you mean? Replacing
some_name = some_dict['some_key']['some_name']
with
from some_dict['some_key'] import some_name
?
No, more like
py> {'name': name, 'location': {0: row, 1: col} = mapping
py> name, row, col
('Mike', 3, 5)
That reads as gibberish to me (and also as SyntaxError, you're missing a '}'), so probably not :)
-- Zach
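For what it's worth, the nested example (with the brace fixed) can be approximated today with plain indexing, at the usual cost of repeating each name:

```python
mapping = {'name': 'Mike', 'location': {0: 3, 1: 5}}

# What the proposed nested unpacking would spell in one statement:
name = mapping['name']
row = mapping['location'][0]
col = mapping['location'][1]

print(name, row, col)  # Mike 3 5
```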
On 27 May 2016 at 12:37, Zachary Ware zachary.ware+pyideas@gmail.com wrote:
On Fri, May 27, 2016 at 10:32 AM, Michael Selik michael.selik@gmail.com wrote:
On Fri, May 27, 2016 at 11:28 AM Zachary Ware zachary.ware+pyideas@gmail.com wrote:
Here's a crazy thought that might be best dismissed out of hand: what about extending 'from name import other_names' to accept any object for <name>? First try to get values via __getitem__() (possibly only for dict/dict subclasses?), next try getattr(), finally try to import the module and pull values from it as per usual.
Pros:
Would it solve nested dict unpacking?
How do you mean? Replacing
some_name = some_dict['some_key']['some_name']
with
from some_dict['some_key'] import some_name
?
Sure, why not? :)
That is the best idea I've seen on this thread.
And then, why have to specify the keys at all, just to violate DRY? Maybe just allowing Mappings to be used with from ... import ... syntax will work nicely, unambiguously, with no new weird syntaxes introduced - and the syntax even allows one to rename the dict keys to other variables, with the `from mymapping import a as c, b as d` variant.
That would certainly be nice.
-- Zach
Python-ideas mailing list Python-ideas@python.org https://mail.python.org/mailman/listinfo/python-ideas Code of Conduct: http://python.org/psf/codeofconduct/
On 28 May 2016 at 14:34, Joao S. O. Bueno jsbueno@python.org.br wrote:
On 27 May 2016 at 12:37, Zachary Ware zachary.ware+pyideas@gmail.com wrote:
On Fri, May 27, 2016 at 10:32 AM, Michael Selik michael.selik@gmail.com wrote:
On Fri, May 27, 2016 at 11:28 AM Zachary Ware zachary.ware+pyideas@gmail.com wrote:
Here's a crazy thought that might be best dismissed out of hand: what about extending 'from name import other_names' to accept any object for <name>? First try to get values via __getitem__() (possibly only for dict/dict subclasses?), next try getattr(), finally try to import the module and pull values from it as per usual.
Pros:
Would it solve nested dict unpacking?
How do you mean? Replacing
some_name = some_dict['some_key']['some_name']
with
from some_dict['some_key'] import some_name
?
Sure, why not? :)
That is the best idea I've seen on this thread.
And then, why have to specify the keys at all, just to violate DRY? Maybe just allowing Mappings to be used with from ... import ... syntax will work nicely, unambiguously, with no new weird syntaxes introduced - and the syntax even allows one to rename the dict keys to other variables, with the `from mymapping import a as c, b as d` variant.
That would certainly be nice.
Well, I just replied upon hitting the "import" suggestion for the first time. Distinguishing it from module imports, of course, is a must.
And then, even if using another keyword than "import" (and requiring a specific name after from), I still find it much better than the proposals introducing brackets on the LHS and loaded with DRY violations.
-- Zach
On 5/28/2016 1:34 PM, Joao S. O. Bueno wrote:
On 27 May 2016 at 12:37, Zachary Ware zachary.ware+pyideas@gmail.com wrote:
On Fri, May 27, 2016 at 10:32 AM, Michael Selik michael.selik@gmail.com wrote:
On Fri, May 27, 2016 at 11:28 AM Zachary Ware zachary.ware+pyideas@gmail.com wrote:
Here's a crazy thought that might be best dismissed out of hand: what about extending 'from name import other_names' to accept any object for <name>? First try to get values via __getitem__() (possibly only for dict/dict subclasses?), next try getattr(), finally try to import the module and pull values from it as per usual.
Pros:
Would it solve nested dict unpacking?
How do you mean? Replacing
some_name = some_dict['some_key']['some_name']
with
from some_dict['some_key'] import some_name
?
Sure, why not? :)
That is the best idea I've seen on this thread.
And then, why have to specify the keys at all, just to violate DRY? Maybe just allowing Mappings to be used with from ... import ... syntax will work nicely, unambiguously, with no new weird syntaxes introduced - and the syntax even allows one to rename the dict keys to other variables, with the `from mymapping import a as c, b as d` variant.
That would certainly be nice.
Wouldn't this approach require that the keys be constants? That is, you couldn't implement a replacement for:
val = d[key+'bar']
I'm not sure that's a reasonable restriction.
Eric.
On 29 May 2016 at 09:35, Eric V. Smith eric@trueblade.com wrote:
On 5/28/2016 1:34 PM, Joao S. O. Bueno wrote:
On 27 May 2016 at 12:37, Zachary Ware zachary.ware+pyideas@gmail.com wrote:
On Fri, May 27, 2016 at 10:32 AM, Michael Selik michael.selik@gmail.com wrote:
On Fri, May 27, 2016 at 11:28 AM Zachary Ware zachary.ware+pyideas@gmail.com wrote:
Here's a crazy thought that might be best dismissed out of hand: what about extending 'from name import other_names' to accept any object for <name>? First try to get values via __getitem__() (possibly only for dict/dict subclasses?), next try getattr(), finally try to import the module and pull values from it as per usual.
Pros:
Would it solve nested dict unpacking?
How do you mean? Replacing
some_name = some_dict['some_key']['some_name']
with
from some_dict['some_key'] import some_name
?
Sure, why not? :)
That is the best idea I've seen on this thread.
And then, why have to specify the keys at all, just to violate DRY? Maybe just allowing Mappings to be used with from ... import ... syntax will work nicely, unambiguously, with no new weird syntaxes introduced - and the syntax even allows one to rename the dict keys to other variables, with the `from mymapping import a as c, b as d` variant.
That would certainly be nice.
Wouldn't this approach require that the keys be constants? That is, you couldn't implement a replacement for:
val = d[key+'bar']
I'm not sure that's a reasonable restriction.
As I posted on the other thread, I've implemented a proof of concept for this in a somewhat toyish package I started earlier.
So right now, one can do:
$ pip install extradict
$ python
>>> from extradict import MapGetter
>>> with MapGetter({"a": 1, "b": 2}) as mydict:
...     from mydict import a, b
...     print(a, b)
The code is at http://github.com/jsbueno/extradict -
(btw, since that e-mail, I've studied the import hook mechanisms and decided to keep my first design of temporarily replacing __import__. )
If more people decide to use it, it would be easy to include some more mapping parameters to the call to MapGetter to overcome static restrictions of the "import" syntax
with MapGetter(mapping, keysuffix="bar"): ...
or rather:
with MapGetter(mapping, keytransform=lambda key: key + 'bar'): ...
js -><-
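The __import__-replacement trick mentioned above can be sketched in a few lines. All names here are hypothetical; this is not extradict's actual implementation, and the `from ... import` inside the with block has to run at module level for the sketch to work:

```python
import builtins
import types
from contextlib import contextmanager


@contextmanager
def map_getter(name, mapping):
    """Sketch (hypothetical API): while the block runs, `from <name>
    import key` pulls values out of *mapping* instead of performing a
    real module import."""
    real_import = builtins.__import__

    def fake_import(mod_name, *args, **kwargs):
        if mod_name == name:
            # Wrap the mapping in a throwaway module object so the
            # normal "from X import y" attribute lookup just works.
            module = types.ModuleType(mod_name)
            module.__dict__.update(mapping)
            return module
        return real_import(mod_name, *args, **kwargs)

    builtins.__import__ = fake_import
    try:
        yield
    finally:
        builtins.__import__ = real_import


with map_getter('mydict', {'a': 1, 'b': 2}):
    from mydict import a, b

print(a, b)  # 1 2
```

Restoring the real __import__ in the finally clause matters: a raised exception inside the block must not leave the import machinery patched.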
Zachary Ware wrote:
Here's a crazy thought that might be best dismissed out of hand: what about extending 'from name import other_names' to accept any object for <name>?
Indeed. It would be better to have some syntactic marker to distinguish this new kind of import from a normal import.
-- Greg
On 05/27/2016 08:09 AM, Nikolaus Rath wrote:
On May 26 2016, Paul Moore wrote:
On 26 May 2016 at 18:55, Ethan Furman wrote:
With the simple syntax that I could live with, a real example could be:
{active_id, active_ids, active_model} = context
or
{partner_id, product_id, ship_to, product_ids} = values
The behaviour of using the names of the variables from the LHS to introspect the value on the RHS is, to me, extremely magical and unlike anything I've seen in any other language. I don't think it sits well in Python, even though it is certainly a very readable idiom for the sort of unpacking we're talking about here.
Very true. But as someone else already said (I can't find the email right now), we have a different construct that everyone is familiar with and that's easily adapted for this situation:
from dict context import active_id, active_ids, active_model
or more general:
"from dict" <expr> "import" <identifier list>
Everyone knows that "from .. import .." modifies the local namespace. We just have to extend it to work not just on modules, but also on dictionaries.
-1
import works with modules. Having it work with other things would muddy the concept, plus make module/object naming conflicts an even bigger hassle.
-- ~Ethan~
On May 27 2016, Ethan Furman ethan-gcWI5d7PMXnvaiG9KC9N7Q@public.gmane.org wrote:
On 05/27/2016 08:09 AM, Nikolaus Rath wrote:
On May 26 2016, Paul Moore wrote:
On 26 May 2016 at 18:55, Ethan Furman wrote:
With the simple syntax that I could live with, a real example could be:
{active_id, active_ids, active_model} = context
or
{partner_id, product_id, ship_to, product_ids} = values
The behaviour of using the names of the variables from the LHS to introspect the value on the RHS is, to me, extremely magical and unlike anything I've seen in any other language. I don't think it sits well in Python, even though it is certainly a very readable idiom for the sort of unpacking we're talking about here.
Very true. But as someone else already said (I can't find the email right now), we have a different construct that everyone is familiar with and that's easily adapted for this situation:
from dict context import active_id, active_ids, active_model
or more general:
"from dict" <expr> "import" <identifier list>
Everyone knows that "from .. import .." modifies the local namespace. We just have to extend it to work not just on modules, but also on dictionaries.
-1
import works with modules.
You don't think of it as "importing something from another namespace into the local namespace"? That's the first thing that I associate with it.
Having it work with other things would muddy the concept, plus make module/object naming conflicts an even bigger hassle.
You did see that I proposed "from dict <> import ..", instead of "from <> import ..", right? The latter would continue to work only for modules. The former would be new syntax and only work for dicts.
Best, -Nikolaus
I am against this. Try something else please. Keep import for modules.
--Guido (mobile) On May 27, 2016 11:08 AM, "Nikolaus Rath" Nikolaus@rath.org wrote:
On May 27 2016, Ethan Furman ethan-gcWI5d7PMXnvaiG9KC9N7Q@public.gmane.org wrote:
On 05/27/2016 08:09 AM, Nikolaus Rath wrote:
On May 26 2016, Paul Moore wrote:
On 26 May 2016 at 18:55, Ethan Furman wrote:
With the simple syntax that I could live with, a real example could be:
{active_id, active_ids, active_model} = context
or
{partner_id, product_id, ship_to, product_ids} = values
The behaviour of using the names of the variables from the LHS to introspect the value on the RHS is, to me, extremely magical and unlike anything I've seen in any other language. I don't think it sits well in Python, even though it is certainly a very readable idiom for the sort of unpacking we're talking about here.
Very true. But as someone else already said (I can't find the email right now), we have a different construct that everyone is familiar with and that's easily adapted for this situation:
from dict context import active_id, active_ids, active_model
or more general:
"from dict" <expr> "import" <identifier list>
Everyone knows that "from .. import .." modifies the local namespace. We just have to extend it to work not just on modules, but also on dictionaries.
-1
import works with modules.
You don't think of it as "importing something from another namespace into the local namespace"? That's the first thing that I associate with it.
Having it work with other things would muddy the concept, plus make module/object naming conflicts an even bigger hassle.
You did see that I proposed "from dict <> import ..", instead of "from <> import ..", right? The latter would continue to work only for modules. The former would be new syntax and only work for dicts.
Best, -Nikolaus
I see a backward compatibility problem with import:
sys = dict(version_info=4.0)
from sys import version_info
This is legal today, and using import for unpacking dicts could change this behavior.
If we give higher priority to importing from a module, then an unexpected module on PYTHONPATH could change the unpacked value. That sort of problem is not easy to find, and from a library maintainer's point of view, not easy to avoid either.
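The first half of that concern is easy to verify today: the import statement consults the import machinery, never local bindings, so the shadowing name is simply ignored:

```python
sys = dict(version_info=4.0)    # a name shadowing the module name
from sys import version_info    # today: still reads the real sys module

print(version_info != 4.0)      # True -- not the dict's 4.0
```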
Any idea using import syntax or even syntax similar to import is dead. Import is about modules and needs to stay about that.
On Sun, May 29, 2016 at 10:10 AM, Pavol Lisy pavol.lisy@gmail.com wrote:
I see a backward compatibility problem with import:
sys = dict(version_info=4.0)
from sys import version_info
This is legal today, and using import for unpacking dicts could change this behavior.
If we give higher priority to importing from a module, then an unexpected module on PYTHONPATH could change the unpacked value. That sort of problem is not easy to find, and from a library maintainer's point of view, not easy to avoid either.
-- --Guido van Rossum (python.org/~guido)
Sorry, I was just trying to help show why it is a wrong idea. :)
But (and sorry, it is another story) what surprised me during my analysis of this problem is that importing a variable from a module is more like unpacking a value than getting access to the module's variable.
# my_test.py
VAR = 1

def setter(a):
    global VAR
    VAR = a

def getter():
    return VAR

# interactive session:
from my_test import VAR, setter, getter
print(VAR)       # 1
print(getter())  # 1
setter(7)
print(VAR)       # 1 !
print(getter())  # 7
from my_test import VAR
print(VAR)       # 7 !
The more I understand python the more I see that I don't understand enough. :)
2016-05-29 19:23 GMT+02:00, Guido van Rossum guido@python.org:
Any idea using import syntax or even syntax similar to import is dead. Import is about modules and needs to stay about that.
On Sun, May 29, 2016 at 10:10 AM, Pavol Lisy pavol.lisy@gmail.com wrote:
I see a backward compatibility problem with import:
sys = dict(version_info=4.0)
from sys import version_info
This is legal today, and using import for unpacking dicts could change this behavior.
If we give higher priority to importing from a module, then an unexpected module on PYTHONPATH could change the unpacked value. That sort of problem is not easy to find, and from a library maintainer's point of view, not easy to avoid either.
-- --Guido van Rossum (python.org/~guido)
On Sun, May 29, 2016 at 1:49 PM Pavol Lisy pavol.lisy@gmail.com wrote:
from my_test import VAR, setter, getter
The more I understand python the more I see that I don't understand enough.
Seems like you expected the VAR in your __main__ module would be the same variable as the one in the my_test module. Each module's globals are separate namespaces.
from module import name
is roughly equivalent to 3 lines:
import module
name = module.name
del module
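That equivalence can be demonstrated with a synthetic module object (types.ModuleType stands in for a real file here):

```python
import types

module = types.ModuleType('module')
module.name = 42

# "from module import name" boils down to binding a *local* name:
name = module.name

module.name = 99   # rebinding the module attribute afterwards...
print(name)        # ...leaves the earlier local binding untouched: 42
```

This is exactly why the my_test session above prints 1 for the imported VAR even after setter(7) has changed the module's own VAR.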
On May 29 2016, Guido van Rossum guido-+ZN9ApsXKcEdnm+yROfE0A@public.gmane.org wrote:
Any idea using import syntax or even syntax similar to import is dead.
Does that mean the whole idea is dead? Because as I see it, there are only two ways this could possibly be implemented:
1. As a statement similar to import, which you declared dead.
2. As an assignment, which you declared dead IIRC primarily because the RHS should not be "peeking" into the LHS.
Or am I interpreting "similar to import" too broadly? Is a statement using other words still on the table?
d = {"foo": 42}
<something> d <something> 42  # or the other way around
assert foo == 42
Best, Nikolaus
On Sun, May 29, 2016 at 1:05 PM, Nikolaus Rath Nikolaus@rath.org wrote:
On May 29 2016, Guido van Rossum guido-+ZN9ApsXKcEdnm+yROfE0A@public.gmane.org wrote:
Any idea using import syntax or even syntax similar to import is dead.
Does that mean the whole idea is dead? Because as I see it, there are only two ways this could possibly be implemented:
Any use of the word 'import' (even in combination with other words) is verboten. Anything starting with 'from' also sounds like a bad idea.
Right.
Or am I interpreting "similar to import" too broadly? Is a statement using other words still on the table?
d = {"foo": 42}
<something> d <something> 42  # or the other way around
assert foo == 42
This I don't understand -- why would the '42' appear in the extraction syntax? I guess you meant "foo"?
Maybe we can riff on
extract foo from d
? Though honestly that looks like it would be extracting d.foo, not d['foo'].
-- --Guido van Rossum (python.org/~guido)
On May 29 2016, Guido van Rossum guido-+ZN9ApsXKcEdnm+yROfE0A@public.gmane.org wrote:
using other words still on the table?
d = {"foo": 42}
<something> d <something> 42  # or the other way around
assert foo == 42
This I don't understand -- why would the '42' appear in the extraction syntax? I guess you meant "foo"?
Yes, sorry.
Maybe we can riff on
extract foo from d
? Though honestly that looks like it would be extracting d.foo, not d['foo'].
Yeah, but that might be useful too :-). How about:
extract key foo from d
extract attribute foo import d
or
export key foo from d
export attribute foo import d
As for "import", with both foo and d required to be identifiers.
Best, Nikolaus
On 30 May 2016 at 16:07, Nikolaus Rath Nikolaus@rath.org wrote:
Yeah, but that might be useful too :-). How about:
extract key foo from d
extract attribute foo import d
or
export key foo from d
export attribute foo import d
As for "import", with both foo and d required to be identifiers.
At this point, the question has to be, how is this any better than
foo = d.foo
foo = d['foo']
???
Paul
On May 30 2016, Paul Moore p.f.moore-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org wrote:
On 30 May 2016 at 16:07, Nikolaus Rath Nikolaus-BTH8mxji4b0@public.gmane.org wrote:
Yeah, but that might be useful too :-). How about:
extract key foo from d
extract attribute foo import d
or
export key foo from d
export attribute foo import d
As for "import", with both foo and d required to be identifiers.
At this point, the question has to be, how is this any better than
foo = d.foo
foo = d['foo']
Huh? I thought this has been discussed at length. It's not better in toy examples, but consider this:
r = query_result
product_id = r['product_id']
quantity = r['quantity']
distributor = r['distributor']
description = r['description']
Compare to
export key (product_id, quantity, distributor, description) from query_result
Best, -Nikolaus
On 31 May 2016 at 17:10, Nikolaus Rath Nikolaus@rath.org wrote:
On May 30 2016, Paul Moore p.f.moore-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org wrote:
On 30 May 2016 at 16:07, Nikolaus Rath Nikolaus-BTH8mxji4b0@public.gmane.org wrote:
Yeah, but that might be useful too :-). How about:
extract key foo from d
extract attribute foo import d
or
export key foo from d
export attribute foo import d
As for "import", with both foo and d required to be identifiers.
At this point, the question has to be, how is this any better than
foo = d.foo
foo = d['foo']
Huh? I thought this has been discussed at length. It's not better in toy examples, but consider this:
r = query_result
product_id = r['product_id']
quantity = r['quantity']
distributor = r['distributor']
description = r['description']
Compare to
export key (product_id, quantity, distributor, description) from query_result
Not saying the current idiom can't be improved, but I don't like this approach - it seems to be focused on compressing the extraction into one line, which isn't a benefit to me. I typically find it easier to line up, and scan, information vertically rather than horizontally. So the "export" approach for me would need to be written:
export key (
    product_id,
    quantity,
    distributor,
    description
) \
from query_result
That backslash after the close bracket is ugly. But not having a "blank" line before the "from query_result" line is also difficult to read. The list of repeated assignments is too repetitive, but nevertheless can be lined up more neatly.
Technically, the "export key" approach could probably be made readable in a way I'd be OK with, for example:
export key from query_result (
    product_id,
    quantity,
    distributor,
    description
)
but I'm not at all sure that needing 2 new keywords ("export" and "key") and a reuse of an existing one ("from") is going to fly - it's too wordy, feels like SQL or COBOL to me. Maybe if someone comes up with a one-word option for "export key from" then it would be viable...
Paul
On May 31 2016, Paul Moore p.f.moore-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org wrote:
Technically, the "export key" approach could probably be made readable in a way I'd be OK with, for example:
export key from query_result (
    product_id,
    quantity,
    distributor,
    description
)
but I'm not at all sure that needing 2 new keywords ("export" and "key") and a reuse of an existing one ("from") is going to fly - it's too wordy, feels like SQL or COBOL to me. Maybe if someone comes up with a one-word option for "export key from" then it would be viable...
How about
unravel query_result (this, that, something)
Best, -Nikolaus
On 31 May 2016 at 20:57, Nikolaus Rath Nikolaus@rath.org wrote:
On May 31 2016, Paul Moore p.f.moore-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org wrote:
Technically, the "export key" approach could probably be made readable in a way I'd be OK with, for example:
export key from query_result (
    product_id,
    quantity,
    distributor,
    description
)
but I'm not at all sure that needing 2 new keywords ("export" and "key") and a reuse of an existing one ("from") is going to fly - it's too wordy, feels like SQL or COBOL to me. Maybe if someone comes up with a one-word option for "export key from" then it would be viable...
How about
unravel query_result (this, that, something)
Meh. Don't assume that if you come up with a good word, I'll be in favour. At best, I'll go from -1 to -0 or maybe +0. Should we get to something that looks reasonably attractive to me, there's still issues with the whole thing being a niche problem, limited applicability (by leaping at "unravel" you lost the ability to extract attributes from an object - did you mean to do that?), etc.
Basically, don't waste too much time trying to convince me. A better bet would be to get sufficient support from others that my opinion is irrelevant (which it may well be anyway :-))
Paul
On May 31 2016, Paul Moore p.f.moore-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org wrote:
On 31 May 2016 at 20:57, Nikolaus Rath Nikolaus-BTH8mxji4b0@public.gmane.org wrote:
On May 31 2016, Paul Moore p.f.moore-Re5JQEeQqe8AvxtiuMwx3w-XMD5yJDbdMReXY1tMh2IBg@public.gmane.org wrote:
Technically, the "export key" approach could probably be made readable in a way I'd be OK with, for example:
export key from query_result (
    product_id,
    quantity,
    distributor,
    description
)
but I'm not at all sure that needing 2 new keywords ("export" and "key") and a reuse of an existing one ("from") is going to fly - it's too wordy, feels like SQL or COBOL to me. Maybe if someone comes up with a one-word option for "export key from" then it would be viable...
How about
unravel query_result (this, that, something)
Meh. Don't assume that if you come up with a good word, I'll be in favour. At best, I'll go from -1 to -0 or maybe +0. Should we get to something that looks reasonably attractive to me, there's still issues with the whole thing being a niche problem, limited applicability (by leaping at "unravel" you lost the ability to extract attributes from an object - did you mean to do that?), etc.
Basically, don't waste too much time trying to convince me. A better bet would be to get sufficient support from others that my opinion is irrelevant (which it may well be anyway :-))
Nah, I think I don't like my own idea anymore. I think the whole issue is much better addressed by using an assignment with a placeholder on the LHS. This also solves the DRY problem when creating things like namedtuples:
d = query_result
product_id = q[$lhs]
quantity = q[$lhs]
distributor = q[$lhs]
description = q[$lhs]
# or
my_favorite_named_tuple = namedtuple($lhs, "foo bar com")
$lhs would be a special construct that is only allowed in the RHS of an assignment statement and evaluates to a string representation of the identifier on the LHS.
But since I don't have time to learn how to extend the Python parser, I will shut up about this now.
Best, -Nikolaus
On Tue, May 31, 2016 at 1:22 PM, Paul Moore p.f.moore@gmail.com wrote:
but I'm not at all sure that needing 2 new keywords ("export" and "key") and a reuse of an existing one ("from") is going to fly - it's too wordy, feels like SQL or COBOL to me. Maybe if someone comes up with a one-word option for "export key from" then it would be viable...
As two words: fromdict <expression evaluating to dict> import <valid identifier>[ as <valid identifier>][, <valid identifier>[, ...]]
IMO, "extract" and "export" don't satisfy that. ("extract key" makes me itch when I see more than one key follow, and it isn't clear without reading further that "key" isn't what's being extracted.) It's too bad "fromdict" is used in the wild (https://pythonhosted.org/dictalchemy/#dictalchemy.utils.fromdict), and "frommapping" is too long (in letters, syllables, and constituent parts: "from", "map", and "-ing").
fromdict d import *: people might expect this, for symmetry with "from ... import *". In CPy, there's no sense of dynamically-created local names (since locals are looked up in the compiler), and you couldn't use them anyway. (We don't talk about eval and exec.) In global scope, import * makes more sense, but I think you can just say, "If you need something like that, you should've created a module instead."
Possible (gross) extensions: fromdict d import 'literalstring' as name2, (evaluatedname) as name2
On Tue, May 31, 2016 at 3:58 PM, Franklin? Lee
leewangzhong+python@gmail.com wrote:
In CPy, there's no sense of dynamically-created local names (since locals are looked up in the compiler)
I mean, in an array.
On Tue, May 31, 2016 at 3:59 PM Franklin? Lee leewangzhong+python@gmail.com wrote:
The desired feature really is more like import than unpacking.
That's the feature some people desire, but not what I originally proposed. I did want unpacking, including recursive unpacking and too-many/too-few errors.
I'm satisfied with the result that I'll need to use my own module for now.
On 30 May 2016 at 17:07, Nikolaus Rath Nikolaus@rath.org wrote:
extract attribute foo import d
Does this mean that attributes, values, and properties of objects would be extractable too? If so, with this syntax, would a getter/setter property still behave as it does on the object, or would the imported name be bound to the value returned by the property's getter?
On 05/30/2016 08:07 AM, Nikolaus Rath wrote:
On May 29 2016, Guido van Rossum wrote:
using other words still on the table?
d = {"foo": 42}
<something> d <something> 42  # or the other way around
assert foo == 42
This I don't understand -- why would the '42' appear in the extraction syntax? I guess you meant "foo"?
Yes, sorry.
Maybe we can riff on
extract foo from d
? Though honestly that looks like it would be extracting d.foo, not d['foo'].
Yeah, but that might be useful too :-). How about:
extract key foo from d
extract attribute foo import d
or
export key foo from d
export attribute foo import d
As for "import", with both foo and d required to be identifiers.
The versions with "import" in them are DAAP (dead-as-a-parrot). No point in even talking about them.
-- ~Ethan~
Aeh, yeah. No idea how I ended up writing that. What I meant was
extract key foo from d
extract attribute foo from d
or
export key foo from d
export attribute foo from d
i.e, contrast "extract" with "export" for attribute access and dict access.
Best, -Nikolaus
On 2016-05-27 08:09, Nikolaus Rath wrote:
Very true. But as someone else already said (I can't find the email right now), we have a different construct that everyone is familiar with and that's easily adapted for this situation:
from dict context import active_id, active_ids, active_model
or more general:
"from dict" <expr> "import" <identifier list>
Everyone knows that "from .. import .." modifies the local namespace. We just have to extend it to work not just on modules, but also on dictionaries.
One problem with this is that currently "from" doesn't operate on names or objects. It operates on module paths. You can't do this:
import somepackage as othername
from othername import something
(You'll get "No module named othername".)
If we change this, it will be confusing, because when you see "from blah import stuff" you won't know offhand whether it's going to just read a value from a dict or go searching the filesystem, which are quite different operations.
It might be possible, though, to leverage "from" without also having to use "import":
from somedict get thiskey, thatkey
I'm not sure I actually like this idea, but I like it more than repurposing "import". "import" has a quite idiosyncratic meaning in Python that has to do with looking for files on disk and running them; overloading this with totally file-local operations like getting keys from dicts seems too wild to me.
-- Brendan Barnwell "Do not follow where the path may lead. Go, instead, where there is no path, and leave a trail." --author unknown
On May 27 2016, Brendan Barnwell wrote:
If we change this
I'm not proposing to change this. I'm proposing to add a new "from dict <foo> import <bar>" statement that provides the new functionality.
Best, -Nikolaus
Ah, I had missed that. I agree that resolves the ambiguity, but I don't care much for that way of doing it. Does the object have to be a dict? What about some other mapping type? If new syntax were to be added, it makes more sense to me to change the "verb", a la "from someobject get somename".
-- Brendan Barnwell
On May 26 2016, Ethan Furman wrote:
On 05/26/2016 10:40 AM, Steven D'Aprano wrote:
On Thu, May 26, 2016 at 06:50:12PM +0200, Sven R. Kunze wrote:
So far, all proposals which deviate from Michael's one are ju
'a': s1, 'b': s2, 'c': s3 = mapping # no braces :)
That looks quite good to me. What do you think?
I think that if you submitted code to me with keys 'a', 'b', 'c' and variables s1, s2, s3, I'd probably reject it and tell you to use descriptive, meaningful keys and names.
I wish people would stop giving toy examples as examples of how nice the syntax looks, and instead try to use it with descriptive names taken from real code. I believe that, by far the majority of the time, you will be repeating the same names twice, and likely exceeding most reasonable line lengths:
'referer': referer, 'useragent': useragent, 'use_proxy': use_proxy, 'follow_links': follow_links, 'clobber': clobber, 'timeout': timeout = mapping
Still think it looks quite good? If you do, that's your right, of course, it's a matter of personal taste. But using toy examples with one or two letter variable names is not a fair or realistic test of what it will be like to use this syntax in real code.
With the simple syntax that I could live with, a real example could be:
{active_id, active_ids, active_model} = context
or
{partner_id, product_id, ship_to, product_ids} = values
which is more readable than
partner_id, product_id, ship_to, product_ids = (values[k] for k in ['partner_id', 'product_id', 'ship_to', 'product_ids'])
Wow. That's a lot of room for typos and wrong order.
Where possible, I use
for n in ('partner_id', 'product_id', 'ship_to', 'product_ids'):
    globals()[n] = values[n]
but it would be nice to have a solution that's friendlier for static analyzers (the above code almost always produces warnings when e.g. partner_id is first accessed afterwards).
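(For reference, one current-syntax spelling that avoids globals() and keeps the assignments visible to static analyzers is operator.itemgetter; this is just a sketch, with an illustrative `values` dict:)

```python
from operator import itemgetter

values = {'partner_id': 7, 'product_id': 42, 'ship_to': 'warehouse', 'product_ids': [42]}

# Names still appear twice (once as strings, once as targets), but each
# target is an ordinary assignment, so analyzers don't warn on later use.
partner_id, product_id, ship_to, product_ids = itemgetter(
    'partner_id', 'product_id', 'ship_to', 'product_ids')(values)
```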
Best, -Nikolaus
How about
locals().update(values)
or if you just want a subset
for k in ['partner_id', 'product_id', 'ship_to', 'product_ids']:
    locals()[k] = values[k]  # Look Ma, DRY
On Fri, May 27, 2016 at 12:43 PM, Rob Cliffe rob.cliffe@btinternet.com wrote:
How about
locals().update(values)
Because locals() can't be updated unless it happens to be the same dict as globals(), in which case it's clearer to use that name.
ChrisA
On 05/26/2016 07:57 PM, Chris Angelico wrote:
Because locals() can't be updated unless it happens to be the same dict as globals(), in which case it's clearer to use that name.
Actually, the dict returned by locals() can be updated, but at least in CPython those updates don't make it back to the function's locals (although they do in some other Pythons).
-- ~Ethan~
Sorry, yeah. The dict is a dict, so of course it can be updated, but the changes aren't guaranteed to have effect. Point is, it's not a suitable way to do this. :)
ChrisA
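(A quick demonstration of the point being made here — in CPython, writing to the dict returned by locals() inside a function does not change the real local variable:)

```python
def demo():
    x = 1
    # This updates the snapshot dict that locals() returned,
    # not the function's actual fast-local slot for x.
    locals()['x'] = 99
    return x

result = demo()  # still 1 in CPython
```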
Steven D'Aprano wrote:
I believe that, by far the majority of the time, you will be repeating the same names twice, and likely exceeding most reasonable line lengths:
'referer': referer, 'useragent': useragent, 'use_proxy': use_proxy, 'follow_links': follow_links, 'clobber': clobber, 'timeout': timeout = mapping
I think a more realistic usage would be to give short local names to something having long keys.
{'referer': ref, 'useragent': ua, 'use_proxy': prox,
 'follow_links': fl, 'clobber': clob, 'timeout': to} = mapping
So tying the keys to the names of the target variables might be too restrictive.
-- Greg
On Wed, May 25, 2016 at 9:56 PM Steven D'Aprano steve@pearwood.info wrote:
This will be approximately as helpful as iterable unpacking
What is your evidence for this claim?
[For extracting from a large dictionary] I'd rather unpack manually:
classify, spacer, width = (prefs[k] for k in ('classify', 'spacer', 'width'))
I agree that comprehensions plus tuple unpacking handle many possible use cases for dict unpacking.
There are many marginally-better situations that would just drum up endless back-and-forth about aesthetics. So let's look for a situation where dict unpacking handles well what current syntax struggles with.
An example of schema validation plus binding in the current syntax:
py> mapping = {'a': 1, 'b': 2}
py> schema = ('a', 'b')
py> unexpected = mapping.keys() - set(schema)
py> if unexpected:
... raise ValueError('unexpected keys %r' % unexpected)
...
py> x, y = (mapping[key] for key in schema)
With sets and comprehensions, that's very nice. More pleasant than most if not all other mainstream languages. Yet dict unpacking can be just a bit better -- more declarative, more say-what-you-mean, less thinking about algorithms. Fewer steps, too, but that's not very important.
The proposed syntax, examples of missing and excess keys.
py> mapping = {'a': 1, 'b': 2, 'c': 3}
py> {'a': x, 'b': y, 'c': z, 'd': s, 'e': t} = mapping
ValueError: missing 2 required keys 'd' and 'e'
py> {'a': x, 'b': y} = mapping
ValueError: got an unexpected key 'c'
If you tell me that's not enough better to justify changing the language, I won't argue very much. I agree that example, if that's the only use case, would need to be backed up by extensive review of code of major projects to see how much improvement it'd really provide.
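(The strict exact-match behavior proposed above can already be sketched as a helper function in today's Python — the name `unpack_mapping` is hypothetical, and the error messages merely approximate the ones proposed:)

```python
def unpack_mapping(schema, mapping):
    """Return values for the keys in schema, insisting on an exact key match."""
    missing = [k for k in schema if k not in mapping]
    if missing:
        raise ValueError('missing %d required keys: %r' % (len(missing), missing))
    unexpected = mapping.keys() - set(schema)
    if unexpected:
        raise ValueError('got unexpected keys: %r' % sorted(unexpected))
    return tuple(mapping[k] for k in schema)

x, y = unpack_mapping(('a', 'b'), {'a': 1, 'b': 2})
```

Of course, the point of the syntax proposal is precisely that no helper or string-repetition should be needed.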
Unpacking really starts to shine when doing nested/recursive destructuring.
My proposed syntax:
py> d = {'a': 1,
... 'b': {'c': 2, 'd': 3}}
py> {'a': x, 'b': {'c': y, 'd': z}} = d
py> x, y, z
(1, 2, 3)
In current syntax, even simply specifying the schema is troublesome if the order of keys is to be preserved for binding to the desired names.
>>> mapping = {'a': 1, 'b': {'c': 2, 'd': 3}}
>>> schema = OrderedDict([('a', None),
...                       ('b', OrderedDict([('c', None), ('d', None)]))])
>>> x, y, z = ...
I tried writing out a couple comprehensions for the binding, but they were ugly. The traversal of nested mapping for validation, flattening and assignment needs a recursive function call or you'll end up with a disaster of nested loops.
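(The recursive function alluded to above can be sketched like this — the schema convention, a dict mapping each key to either None or a nested schema, is this sketch's own invention, and `unpack_nested` is a hypothetical name:)

```python
def unpack_nested(schema, mapping):
    """Yield leaf values from mapping in schema order, recursing into nested dicts."""
    for key, sub in schema.items():
        if sub is None:
            yield mapping[key]
        else:
            yield from unpack_nested(sub, mapping[key])

d = {'a': 1, 'b': {'c': 2, 'd': 3}}
x, y, z = unpack_nested({'a': None, 'b': {'c': None, 'd': None}}, d)
```

This relies on dict insertion order for the binding order (guaranteed in Python 3.7+; use OrderedDict on the Pythons of this thread's era), and it illustrates the point: the current-syntax version needs a helper and repeats every key as a string.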
And lastly, if you have your eye on the prize (pattern matching) then establishing a full-featured dict unpacking is a big step in the right direction. I may not have stated it when I started this thread, but the initial motivation for dict unpacking was the trouble we were having in our discussion of pattern matching. I wanted to break that big problem apart into smaller problems.
A possible (not proposed) syntax for dict and tuple pattern matching:
py> {'a': x, 'b': 0} = {'a': 1, 'b': 2}
ValueError: key 'b' does not match value 0
py> (a, 0) = (1, 2)
ValueError: index 1 does not match value 0
def spam(self, a, b, **kwargs):
...
I'd like to unpack a small number of keys:values from kwargs, extract them from the dict, and pass that on to another method:
fnord = kwargs.pop('fnord', 'default')
wibble = kwargs.pop('wibble', 42)
super().spam(a, b, **kwargs)
I don't have a concise syntax for this use-case, and yours won't work either. There's no way to supply defaults, nor can you list all the keys because you don't know what they will be. (The caller can provide arbitrary keyword arguments.)
You're right, I haven't thought about defaults. Hmm. Tuple unpacking doesn't support defaults either, so I guess I'll let this one go as not appropriate for dict unpacking.
I think your syntax is too verbose and repetitive for the simple case.
It's no more repetitive than str.format with keyword arguments :-)
On 5/26/2016 1:09 AM, Michael Selik wrote:
I agree that comprehensions plus tuple unpacking handle many possible use cases for dict unpacking.
There is another dict method that has been ignored in this discussion.
>>> mapping = {'a': 1, 'b': 2, 'c': 3}
>>> x = mapping.pop('a')
>>> mapping
{'c': 3, 'b': 2}
We have both subscripting and popping to handle getting a value and either leaving or removing the pair.
-- Terry Jan Reedy
We could provide several options though.
a, b, c = mapping
a, b, c = **mapping
For the simple case.
AND
{"1": a, "foo.bar": b, c} = mapping
For more complex cases.
We already have subtleties with regular unpacking, such as:
a, (b, c) = 1, range(2)
The thing with those details is that you can completely ignore them, not even know they exist, and simply look them up when you need them.
But I must say:
{"1": a, "foo.bar": b, c} = mapping
Looks very ugly.
On Thu, May 26, 2016 at 04:10:10PM +0200, Michel Desmoulin wrote:
The thing with those details is that you can completly ignore them, and don't know they exist, and simply look it up when you need it.
What Google search terms would a Python programmer use to find out what
{"1": a, "foo.bar": b, c} = mapping
does?
Please don't dismiss the effect of unfamiliar syntax on the reader. Adding more magic syntax increases the cost and difficulty of reading the code and learning the language. That cost might be justified if the new syntax is useful enough, but so far this syntax appears to be of very marginal usefulness. There's no obvious use-case where it would be an overwhelming benefit, at least not yet.
I believe this syntax comes from Clojure. Can you give some examples of real-world code using this syntax in Clojure? (Not toy demonstrations of how it works, but working code that uses it to solve real problems.)
If Clojure programmers don't use it, then I expect neither will Python programmers.
-- Steve
On May 27 2016, Steven D'Aprano wrote:
What Google search terms would a Python programmer use to find out what
{"1": a, "foo.bar": b, c} = mapping
does?
"python syntax reference"
That said, I don't like the idea either. But anything that introduces new syntax is going to be hard to Google. How do you Google what
def foo(bar) -> "something obscure"
means in Python?
Best, -Nikolaus
On 26/05/2016 18:51, Steven D'Aprano wrote:
Please don't dismiss the effect of unfamiliar syntax on the reader.
+1. With tuple and list unpacking:
(a, b) = (1, 2)
[a, b] = [1, 2]
[a, b] = (1, 2)  # or even mixing
the structure of the LHS and RHS mirror each other, making the meaning intuitive/obvious. Doing the same thing for dicts:
{'a': a, 'b': b} = {'a': 1, 'b': 2}
makes the LHS too verbose to be very useful, IMO. All the examples so far can be done in other, arguably better, ways, while a more concise syntax would break the "mirroring" and be confusing, or at least be more to learn, as Steven says.
On Thu, May 26, 2016, 10:46 PM Rob Cliffe rob.cliffe@btinternet.com wrote:
With tuple and list unpacking, (a, b) = (1, 2), the structure of the LHS and RHS mirror each other, making the meaning intuitive/obvious. Doing the same thing for dicts, {'a': a, 'b': b} = {'a': 1, 'b': 2}, makes the LHS too verbose to be very useful IMO.
But is it intuitive/obvious? Sounds like you think so.
All the examples so far can be done in other, arguably better, ways.
Not for nested destructuring. Maybe my example got lost in the shuffle. No one replied to it.
When I worked on the iterable unpacking (PEP 448), I noticed that it would be pretty easy to add dict unpacking like what you're asking for to the grammar, parsing, and source code generation. I will leave the practical language development questions to other people, but from a CPython extension standpoint, I don't think this is a lot of work.
Best,
Neil
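(For reference, the extended **rest form from the original proposal can likewise be emulated in today's Python — the helper name `unpack_star` is hypothetical:)

```python
def unpack_star(keys, mapping):
    """Return the named values plus a dict of the remaining items."""
    rest = dict(mapping)  # copy so the input mapping is untouched
    values = tuple(rest.pop(k) for k in keys)
    return values + (rest,)

x, rest = unpack_star(['a'], {'a': 1, 'b': 2, 'c': 3})
```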