Analog of PEP 448 for dicts (unpacking in assignment with dict rhs)

The following works now:
seq = [1, 2]
d = {'c': 3, 'a': 1, 'b': 2}

(el1, el2) = seq
el1, el2 = seq
head, *tail = seq

seq_new = (*seq, *tail)
dict_new = {**d, **{'c': 4}}
def f(arg1, arg2, a, b, c): pass
f(*seq, **d)
It seems like dict unpacking syntax would not be fully coherent with list unpacking syntax without something like:
{b, a, **other} = **d
Iterables have both function-call unpacking syntax and right-hand-side assignment unpacking, whereas dicts have only the function-call unpacking syntax.
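For comparison, the proposed form could be read as sugar for roughly the following. This is one hypothetical desugaring (assuming keys are taken from the target variable names); the proposal itself does not pin down these semantics:

```python
d = {'c': 3, 'a': 1, 'b': 2}

# Proposed: {b, a, **other} = **d
# One hypothetical desugaring, keys taken from the target names:
b, a = d['b'], d['a']
other = {key: value for key, value in d.items() if key not in {'a', 'b'}}

assert (a, b) == (1, 2)
assert other == {'c': 3}
```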
I was not able to find any PEPs that suggest this (search keywords: "PEP 445 dicts", "dictionary unpacking assignment"; checked PEP 0), so let me know if I am wrong.
The main use case, in my understanding, is getting shortcuts to elements of a dictionary when they are going to be used more than once later in the scope. A made-up example is using a config to initialize a bunch of things that take many config arguments with long names, with overlap in the keywords used in initialization.
One should either write long calls like
start_a(config['parameter1'], config['parameter2'], config['parameter3'], config['parameter4'])
start_b(config['parameter3'], config['parameter2'], config['parameter3'], config['parameter4'])
many times, or use a comprehension-based solution like the one shown in the Rationale below.
It becomes even worse (in terms of readability) with nested structures.
start_b(config['group2']['parameter3'], config['parameter2'], config['parameter3'], config['group2']['parameter3'])
## Rationale
Right now this problem is often solved using [list] comprehensions, but this is somewhat verbose:
a, b = (d[k] for k in ['a', 'b'])
or direct per-item assignment (which looks simple with single-character keys, but often becomes very verbose with real-world long key names):

a = d['a']
b = d['b']

Alternatively, one could have a basic method/function get_n(), or a __getitem__() accepting more than a single argument:

a, b = d.get_n('a', 'b')
a, b = get_n(d, 'a', 'b')
a, b = d['a', 'b']

All these approaches require mentioning the same key twice. It becomes even worse with nested structures of dictionaries.
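The get_n() helper mentioned above would be trivial to write today; here is a minimal sketch (get_n is a hypothetical name, not an existing dict method):

```python
def get_n(mapping, *keys):
    """Sketch of the hypothetical get_n() helper mentioned above:
    return the values for several keys at once, in the order given."""
    return tuple(mapping[key] for key in keys)

d = {'c': 3, 'a': 1, 'b': 2}
a, b = get_n(d, 'a', 'b')
```

Note the keys still have to be spelled out separately from the target names, which is exactly the double-mentioning the proposal tries to remove.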
## Concerns and questions:
0. This is the most troubling part, imho; the other questions are more like general thoughts. It seems (to put it mildly) weird that execution flow depends on the names of local variables.
For example, one cannot easily refactor these variable names. However, the same is true for dictionary keys anyway: you cannot suddenly refactor your code to expect dictionaries with keys 'c' and 'd' while your entire system still expects dictionaries with keys 'a' and 'b'. A counter-objection is that this specific scenario is usually handled with record/struct-like classes with fixed members rather than dicts, so this is not an issue.
Quite a few languages (Clojure and JavaScript, to name a few) have this feature now, and they do not seem to have suffered too much from refactoring hell. This does not mean their approach is good, just that it is "manageable".
1. This line seems coherent with the sequence syntax, but redundant:

{b, a, **other} = **d

and the following use of a "throwaway" variable just looks poor visually:

{b, a, **_} = **d

Could it be less verbose, like this?

{b, a} = **d

But that is not very coherent with list behavior: e.g., what if that line did not raise something like "ValueError: too many keys to unpack, got an unexpected key 'c'"?
2. Unpacking in other contexts:

{self.a, b, **other} = **d

Should it be interpreted as

self.a, b = d['a'], d['b']

or as

self.a, b = d['self.a'], d['b']

Probably the first, but my point is that these name-extraction rules should be strictly specified, and that might not be trivial.
--- Ben

2017-11-10 19:53 GMT-08:00 Ben Usman bigobangux@gmail.com:
I was not able to find any PEPs that suggest this (search keywords: "PEP 445 dicts", "dictionary unpacking assignment", checked PEP-0), however, let me know if I am wrong.
It was discussed at great length on Python-ideas about a year ago. There is a thread called "Unpacking a dict" from May 2016.
Python-Dev mailing list Python-Dev@python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/jelle.zijlstra%40gmail.com

Got it, thank you. I'll go and check it out!
On Nov 11, 2017 01:22, "Jelle Zijlstra" jelle.zijlstra@gmail.com wrote:

Ben, I have a small package which enables one to do:
with MapGetter(my_dictionary):
    from my_dictionary import a, b, parameter3

If this interests you, contributions so it can get hardened for mainstream acceptance are welcome. https://github.com/jsbueno/extradict
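The trick above can be sketched in a few lines. This is an illustrative simplification only (extradict's real MapGetter is more careful): the idea is to temporarily patch builtins.__import__ so that a `from ... import ...` statement inside the `with` block reads keys out of a mapping instead of importing a real module:

```python
import builtins

class MapGetter:
    """Simplified sketch of the mechanism; not the real extradict code."""

    def __init__(self, mapping):
        self.mapping = mapping

    def __enter__(self):
        self._original_import = builtins.__import__
        mapping = self.mapping

        class _MappingProxy:
            # Attribute access on the fake "module" becomes a key lookup.
            def __getattr__(self, key):
                try:
                    return mapping[key]
                except KeyError:
                    raise AttributeError(key) from None

        def _fake_import(name, globals=None, locals=None, fromlist=(), level=0):
            # Caveat: *every* import inside the block is redirected, which
            # is one reason a production version needs to be smarter.
            return _MappingProxy()

        builtins.__import__ = _fake_import
        return self

    def __exit__(self, *exc_info):
        builtins.__import__ = self._original_import
        return False


my_dictionary = {'a': 1, 'b': 2, 'parameter3': 3}
with MapGetter(my_dictionary):
    from my_dictionary import a, b, parameter3
```

The appeal is that the key names and the bound variable names are written only once, which is the core of Ben's request.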
On 11 November 2017 at 04:26, Ben Usman bigobangux@gmail.com wrote:
Got it, thank you. I'll go and check it out!

On 11 November 2017 at 16:22, Jelle Zijlstra jelle.zijlstra@gmail.com wrote:
2017-11-10 19:53 GMT-08:00 Ben Usman bigobangux@gmail.com:
I was not able to find any PEPs that suggest this (search keywords: "PEP 445 dicts", "dictionary unpacking assignment", checked PEP-0), however, let me know if I am wrong.
It was discussed at great length on Python-ideas about a year ago. There is a thread called "Unpacking a dict" from May 2016.
I tend to post this every time the topic comes up, but: it's highly unlikely we'll get syntax for this when we don't even have a builtin to extract multiple items from a mapping in a single operation.
So if folks would like dict unpacking syntax, then a suitable place to start would be a proposal for a "getitems" builtin that allowed operations like:
b, a = getitems(d, ("b", "a"))
operator.itemgetter and operator.attrgetter may provide some inspiration for possible proposals.
Cheers, Nick.

Do you mean making getitems call itemgetter?
At the moment we can already do this with itemgetter:

from operator import itemgetter
a, b = itemgetter('a', 'b')(d)
I tend to post this every time the topic comes up, but: it's highly unlikely we'll get syntax for this when we don't even have a builtin to extract multiple items from a mapping in a single operation.
You mean subitems, as attrgetter does? That would actually be quite cool!

d = dict(a=dict(b=1), b=dict(c=2))
ab, bc = itemgetter("a.b", "b.c", separator=".")(d)
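The separator behavior proposed here does not exist in operator.itemgetter today, but it can be emulated with a small helper (nested_itemgetter is a hypothetical name, not a stdlib function):

```python
from functools import reduce

def nested_itemgetter(*paths, separator='.'):
    """Hypothetical helper: like operator.itemgetter, but each path string
    may address nested mappings, so 'a.b' means mapping['a']['b'] when
    the separator is '.'."""
    key_paths = [path.split(separator) for path in paths]

    def getter(mapping):
        values = tuple(
            reduce(lambda node, key: node[key], keys, mapping)
            for keys in key_paths
        )
        # Mirror operator.itemgetter: a single path returns a bare value.
        return values[0] if len(values) == 1 else values

    return getter

d = dict(a=dict(b=1), b=dict(c=2))
ab, bc = nested_itemgetter('a.b', 'b.c')(d)
```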
I've created an issue in case something like that is desired: https://bugs.python.org/issue32010. No real strong push for it; happy to just close it if it does not get interest.
That said, I am not sure it solves Ben's request, as he seemed to be targeting a way to bind the variable names to the dictionary keys implicitly.

12.11.17 12:06, Nick Coghlan wrote:
So if folks would like dict unpacking syntax, then a suitable place to start would be a proposal for a "getitems" builtin that allowed operations like:
b, a = getitems(d, ("b", "a"))
operator.itemgetter and operator.attrgetter may provide some inspiration for possible proposals.
I don't see any relation between this getitems and operator.itemgetter or operator.attrgetter. getitems can be implemented as:

(the most obvious way)

def getitems(mapping, keys):
    for key in keys:
        yield mapping[key]

or

def getitems(mapping, keys):
    return map(functools.partial(operator.getitem, mapping), keys)

or (a simpler rough equivalent)

def getitems(mapping, keys):
    return map(mapping.__getitem__, keys)
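For illustration, the first variant in use (the def is repeated here so the snippet is self-contained). One behavioral note: being a generator, it is lazy, so a missing key only raises KeyError once iteration reaches it, e.g. during tuple unpacking:

```python
def getitems(mapping, keys):
    # The "most obvious" generator variant from above.
    for key in keys:
        yield mapping[key]

d = {'c': 3, 'a': 1, 'b': 2}
b, a = getitems(d, ('b', 'a'))

# A missing key surfaces as KeyError during unpacking, not at call time:
try:
    x, y = getitems(d, ('a', 'missing'))
except KeyError as err:
    missing_key = err.args[0]
```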

Sounds like that happens quite often.
Yep, I totally agree with your point; I think I mentioned something like this in the post as a possible partial solution: a drop-in replacement for the ugly list comprehension people seem to be using now to solve the problem. It's easy to implement, but adoption by the community is questionable. I mean, if this is a relatively rare use case, and those who need it already seem to have their own one-liners for it, is there even a need for a method or function like this in the standard library? To unify and improve readability (a single standard "getitems" instead of many different get_n, gets, get_multiple)? That is the only motivation I can think of, and even it is questionable.

Anyway, considering that this was discussed at length in the original 2016 thread, I suggest stopping any further discussion here to avoid littering the dev mailing list. Sorry for starting the thread in the first place, and thank you, Jelle, for pointing me to the original discussion.
participants (6)
- Ben Usman
- Jelle Zijlstra
- Joao S. O. Bueno
- Mario Corchero
- Nick Coghlan
- Serhiy Storchaka