[Python-checkins] bpo-35224: PEP 572 Implementation (#10497)

Emily Morehouse webhook-mailer at python.org
Thu Jan 24 18:50:01 EST 2019


https://github.com/python/cpython/commit/8f59ee01be3d83d5513a9a3f654a237d77d80d9a
commit: 8f59ee01be3d83d5513a9a3f654a237d77d80d9a
branch: master
author: Emily Morehouse <emilyemorehouse at gmail.com>
committer: GitHub <noreply at github.com>
date: 2019-01-24T16:49:56-07:00
summary:

bpo-35224: PEP 572 Implementation (#10497)

* Add tokenization of :=
- Add token to Include/token.h. Add token to documentation in Doc/library/token.rst.
- Run `./python Lib/token.py` to regenerate Lib/token.py.
- Update Parser/tokenizer.c: add case to handle `:=`.
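
A hedged illustration (assuming an interpreter built with this change): the
tokenizer now emits a single COLONEQUAL token for `:=`.

    import io
    import tokenize

    # Tokenize a small snippet and report each token's exact type.
    src = "(x := 1)\n"
    toks = tokenize.generate_tokens(io.StringIO(src).readline)
    print([(tokenize.tok_name[t.exact_type], t.string) for t in toks])
    # Expected to include ('COLONEQUAL', ':='), not COLON followed by EQUAL.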

* Add initial usage of := in grammar.

* Update Python.asdl to match the grammar updates. Regenerated Include/Python-ast.h and Python/Python-ast.c

* Update AST and compiler files in Python/ast.c and Python/compile.c. Basic functionality only; scoping is not handled properly yet

* Regenerate Lib/symbol.py using `./python Lib/symbol.py`

* Tests - Fix failing tests in test_parser.py due to changes in token numbers for internal representation

* Tests - Add simple test for := token

* Tests - Add simple tests for named expressions using expr and suite

* Tests - Update number of levels for nested expressions to prevent stack overflow

* Update symbol table to handle NamedExpr

* Update Grammar to allow assignment expressions in if statements.
Regenerate Python/graminit.c accordingly using `make regen-grammar`
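
As a usage sketch (not part of the diff below), the updated `if_stmt` rule lets
a named expression bind directly in the condition:

    import re

    line = "error: disk full"
    # The walrus binds `match` for use inside the suite.
    if match := re.match(r"error: (.*)", line):
        print(match.group(1))  # -> disk full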

* Tests - Add additional tests for named expressions in RoundtripLegalSyntaxTestCase, based on examples and information directly from PEP 572

Note: failing tests are currently commented out (4 out of 24 tests currently fail)

* Tests - Add temporary syntax test failure tests in test_parser.py

Note: There is an outstanding TODO for this -- syntax tests need to be
moved to a different file (presumably test_syntax.py), but this covers
what needs to be tested at the moment, and it is more convenient to run a
single test for the time being

* Add support for allowing assignment expressions as function argument annotations. Uncomment tests for these cases because they all pass now!
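
A small sketch of the annotation form this enables (mirroring the PEP 572
examples exercised by the tests):

    # The parenthesized named expression is legal as an annotation; evaluating
    # the `def` binds `p` at function definition time.
    def foo(answer: (p := 42) = 5):
        return answer

    print(p)      # 42
    print(foo())  # 5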

* Tests - Move existing syntax tests out of test_parser.py and into test_named_expressions.py. Refactor syntax tests to use unittest

* Add TargetScopeError exception to extend SyntaxError

Note: This simply creates the TargetScopeError exception; it is not yet
used anywhere

* Tests - Update tests per PEP 572

Continue refactoring test suite:
The named expression test suite now includes checks for invalid cases that
throw exceptions (no longer limited to SyntaxError), assignment tests that
ensure variables are properly assigned, and scope tests that ensure variable
availability and values are correct

Note:
- There are still tests that are marked to skip, as they are not yet
implemented
- There are approximately 300 lines of the PEP that have not yet been
addressed, though these may be deferred

* Documentation - Small updates to XXX/todo comments

- Remove XXX from child description in ast.c
- Add comment with number of previously supported nested expressions for
3.7.X in test_parser.py

* Fix assert in seq_for_testlist()

* Cleanup - Denote "Not implemented -- No keyword args" on failing test case. Fix PEP8 error for blank lines at beginning of test classes in test_parser.py

* Tests - Wrap all file opens in `with...as` to ensure files are closed

* WIP: handle f(a := 1)

* Tests and Cleanup - No longer skips keyword arg test. Keyword arg test now uses a simpler test case and does not rely on an external file. Remove print statements from ast.c

* Tests - Refactor the last remaining test case that relied on an external file to use a simpler test case without the dependency

* Tests - Add better descriptions of the remaining skipped tests. Add test checking scope when using an assignment expression in a function argument

* Tests - Add test for nested comprehension, testing value and scope. Fix variable name in skipped comprehension scope test

* Handle restriction of LHS for named expressions - assignment is only allowed to a target of type NAME. In particular, disallow assignment to tuples

This adds an alternative to set_context specifically for named expressions,
set_namedexpr_context. Context is now set differently for standard
assignment versus named-expression assignment so that these restrictions
can be enforced.
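
Illustrative sketch of the restriction (the error wording follows the tests
added in this commit):

    # The target of a named expression must be a plain NAME.
    (x := 0)  # fine
    try:
        compile("((a, b) := (1, 2))", "<test>", "eval")
    except SyntaxError as err:
        print(err)  # cannot use named assignment with tuple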

* Tests - Update negative test case for assigning to lambda to match new error message. Add negative test case for assigning to tuple

* Tests - Reorder test cases to group invalid syntax cases and named assignment target errors

* Tests - Update test case for named expression in function argument - check that result and variable are set correctly

* Todo - Add todo for TargetScopeError based on Guido's comment (https://github.com/python/cpython/commit/2b3acd37bdfc2d35e5094228c6684050d2aa8b0a#r30472562)

* Tests - Add named expression tests for assignment operator in function arguments

Note: One of the two tests is skipped, because function arguments currently
treat an assignment expression inside parentheses as a single child, which
neither properly catches the named expression nor counts arguments
correctly
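
A hedged sketch of the intended call-site behaviour (the unparenthesized form
is what these tests exercise):

    def foo(val, cat=None):
        return (val, cat)

    # The named expression is passed as a positional argument and also binds
    # `x` in the enclosing scope.
    print(foo(x := 3, cat='vector'))  # (3, 'vector')
    print(x)                          # 3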

* Add NamedStore to expr_context. Regenerate related code with `make regen-ast`

* Add usage of NamedStore to ast_for_named_expr in ast.c. Update occurrences of checking for Store to also handle NamedStore where appropriate

* Add ste_comprehension to _symtable_entry to track if the namespace is a comprehension. Initialize ste_comprehension to 0. Set ste_comprehension to 1 in symtable_handle_comprehension

* s/symtable_add_def/symtable_add_def_helper. Add symtable_add_def to handle grabbing st->st_cur and passing it to symtable_add_def_helper. This allows the original logic to be reused with a different ste by calling symtable_add_def_helper directly.

* Refactor symtable_record_directive to take lineno and col_offset as arguments instead of stmt_ty. This allows symtable_record_directive to be used for stmt_ty and expr_ty

* Handle elevating scope for named expressions in comprehensions.
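
A minimal sketch of the scoping rule (mirroring the new scope tests): the
target of a named expression inside a comprehension binds in the enclosing
scope, not in the comprehension's implicit function scope.

    total = 0
    partial_sums = [total := total + v for v in range(5)]
    print(partial_sums)  # [0, 1, 3, 6, 10]
    print(total)         # 10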

* Handle error for usage of named expression inside a class block

* Tests - No longer skip scope tests. Add additional scope tests

* Cleanup - Update error message for named expression within a comprehension within a class. Update comments. Add assert for symtable_extend_namedexpr_scope to validate that we always find at least a ModuleScope if we don't find a Class or FunctionScope

* Cleanup - Add missing case for NamedStore in expr_context_name. Remove unused var in set_namedexpr_context

* Refactor - Consolidate set_context and set_namedexpr_context to reduce duplicated code. Special cases for named expressions are handled by checking if ctx is NamedStore

* Cleanup - Add additional use cases for ast_for_namedexpr in usage comment. Fix multiple blank lines in test_named_expressions

* Tests - Remove unnecessary test case. Renumber test case function names

* Remove TargetScopeError for now. Will add back if needed

* Cleanup - Small comment nit for consistency

* Handle positional argument check with named expression

* Add TargetScopeError exception definition. Add documentation for TargetScopeError in c-api docs. Throw TargetScopeError instead of SyntaxError when using a named expression in a comprehension within a class scope
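
A hedged sketch of the new error (the message text follows the test added for
this case):

    code = (
        "class Foo:\n"
        "    values = [(y := x) for x in range(3)]\n"
    )
    try:
        exec(code, {})
    except SyntaxError as err:  # TargetScopeError extends SyntaxError
        print(type(err).__name__, err)
    # Expected: TargetScopeError: named expression within a comprehension
    # cannot be used in a class body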

* Increase stack size for parser by 200. This is a minimal change (approx. 5kb) and should not have an impact on any systems. Update parser test to allow 99 nested levels again

* Add TargetScopeError to exception_hierarchy.txt for test_baseexception.py

* Tests - Major update for named expression tests, both in test_named_expressions and test_parser

- Add test for TargetScopeError
- Add tests for named expressions in comprehension scope and edge cases
- Add tests for named expressions in function arguments (declarations
and call sites)
- Reorganize tests to group them more logically

* Cleanup - Remove unnecessary comment

* Cleanup - Comment nitpicks

* Explicitly disallow assignment expressions to a name inside parentheses, e.g.: ((x) := 0)

- Add check for LHS types to detect a parenthesis then a name (see note)
- Add test for this scenario
- Update tests for changed error message for named assignment to a tuple
(also, see note)

Note: This caused issues with the previous error handling for named assignment
to an LHS that contained an expression, such as a tuple. Thus, the check
for the LHS of a named expression must be made more specific if we wish to
maintain the previous error messages

* Cleanup - Wrap lines more strictly in test file

* Revert "Explicitly disallow assignment expressions to a name inside parentheses, e.g.: ((x) := 0)"

This reverts commit f1531400ca7d7a2d148830c8ac703f041740896d.

* Add NEWS.d entry

* Tests - Fix error in test_pickle.test_exceptions by adding TargetScopeError to list of exceptions

* Tests - Update error message tests to reflect improved messaging convention (s/can't/cannot)

* Remove cases that cannot be reached in compile.c. Small linting update.

* Update Grammar/Tokens to add COLONEQUAL. Regenerate all files

* Update TargetScopeError PRE_INIT and POST_INIT, as this was purposefully left out when fixing rebase conflicts

* Add NamedStore back and regenerate files

* Pass along line number and end col info for named expression

* Simplify News entry

* Fix compiler warning and explicitly mark fallthrough

files:
A Lib/test/test_named_expressions.py
A Misc/NEWS.d/next/Core and Builtins/2018-11-13-14-26-54.bpo-35224.F0B6UQ.rst
M Doc/c-api/exceptions.rst
M Doc/library/token-list.inc
M Grammar/Grammar
M Grammar/Tokens
M Include/Python-ast.h
M Include/graminit.h
M Include/pyerrors.h
M Include/symtable.h
M Include/token.h
M Lib/_compat_pickle.py
M Lib/symbol.py
M Lib/test/exception_hierarchy.txt
M Lib/test/test_parser.py
M Lib/test/test_tokenize.py
M Lib/token.py
M Objects/exceptions.c
M PC/python3.def
M Parser/Python.asdl
M Parser/parser.h
M Parser/token.c
M Python/Python-ast.c
M Python/ast.c
M Python/compile.c
M Python/graminit.c
M Python/symtable.c

diff --git a/Doc/c-api/exceptions.rst b/Doc/c-api/exceptions.rst
index dd1e026cb071..6dd0c02c6d7b 100644
--- a/Doc/c-api/exceptions.rst
+++ b/Doc/c-api/exceptions.rst
@@ -800,6 +800,7 @@ the variables:
    single: PyExc_SystemError
    single: PyExc_SystemExit
    single: PyExc_TabError
+   single: PyExc_TargetScopeError
    single: PyExc_TimeoutError
    single: PyExc_TypeError
    single: PyExc_UnboundLocalError
@@ -901,6 +902,8 @@ the variables:
 +-----------------------------------------+---------------------------------+----------+
 | :c:data:`PyExc_TabError`                | :exc:`TabError`                 |          |
 +-----------------------------------------+---------------------------------+----------+
+| :c:data:`PyExc_TargetScopeError`        | :exc:`TargetScopeError`         |          |
++-----------------------------------------+---------------------------------+----------+
 | :c:data:`PyExc_TimeoutError`            | :exc:`TimeoutError`             |          |
 +-----------------------------------------+---------------------------------+----------+
 | :c:data:`PyExc_TypeError`               | :exc:`TypeError`                |          |
diff --git a/Doc/library/token-list.inc b/Doc/library/token-list.inc
index cd6e0f26968e..3ea9439be859 100644
--- a/Doc/library/token-list.inc
+++ b/Doc/library/token-list.inc
@@ -197,6 +197,10 @@
 
    Token value for ``"..."``.
 
+.. data:: COLONEQUAL
+
+   Token value for ``":="``.
+
 .. data:: OP
 
 .. data:: ERRORTOKEN
diff --git a/Grammar/Grammar b/Grammar/Grammar
index e232df979e2d..f21fa1136432 100644
--- a/Grammar/Grammar
+++ b/Grammar/Grammar
@@ -69,7 +69,7 @@ assert_stmt: 'assert' test [',' test]
 
 compound_stmt: if_stmt | while_stmt | for_stmt | try_stmt | with_stmt | funcdef | classdef | decorated | async_stmt
 async_stmt: 'async' (funcdef | with_stmt | for_stmt)
-if_stmt: 'if' test ':' suite ('elif' test ':' suite)* ['else' ':' suite]
+if_stmt: 'if' namedexpr_test ':' suite ('elif' namedexpr_test ':' suite)* ['else' ':' suite]
 while_stmt: 'while' test ':' suite ['else' ':' suite]
 for_stmt: 'for' exprlist 'in' testlist ':' suite ['else' ':' suite]
 try_stmt: ('try' ':' suite
@@ -83,6 +83,7 @@ with_item: test ['as' expr]
 except_clause: 'except' [test ['as' NAME]]
 suite: simple_stmt | NEWLINE INDENT stmt+ DEDENT
 
+namedexpr_test: test [':=' test]
 test: or_test ['if' or_test 'else' test] | lambdef
 test_nocond: or_test | lambdef_nocond
 lambdef: 'lambda' [varargslist] ':' test
@@ -108,7 +109,7 @@ atom: ('(' [yield_expr|testlist_comp] ')' |
        '[' [testlist_comp] ']' |
        '{' [dictorsetmaker] '}' |
        NAME | NUMBER | STRING+ | '...' | 'None' | 'True' | 'False')
-testlist_comp: (test|star_expr) ( comp_for | (',' (test|star_expr))* [','] )
+testlist_comp: (namedexpr_test|star_expr) ( comp_for | (',' (namedexpr_test|star_expr))* [','] )
 trailer: '(' [arglist] ')' | '[' subscriptlist ']' | '.' NAME
 subscriptlist: subscript (',' subscript)* [',']
 subscript: test | [test] ':' [test] [sliceop]
@@ -134,6 +135,7 @@ arglist: argument (',' argument)*  [',']
 # multiple (test comp_for) arguments are blocked; keyword unpackings
 # that precede iterable unpackings are blocked; etc.
 argument: ( test [comp_for] |
+            test ':=' test |
             test '=' test |
             '**' test |
             '*' test )
diff --git a/Grammar/Tokens b/Grammar/Tokens
index 9595673a5af7..f6f303bd5292 100644
--- a/Grammar/Tokens
+++ b/Grammar/Tokens
@@ -52,6 +52,7 @@ AT                      '@'
 ATEQUAL                 '@='
 RARROW                  '->'
 ELLIPSIS                '...'
+COLONEQUAL              ':='
 
 OP
 ERRORTOKEN
diff --git a/Include/Python-ast.h b/Include/Python-ast.h
index f8394e6c26ad..3527ae8949e7 100644
--- a/Include/Python-ast.h
+++ b/Include/Python-ast.h
@@ -17,7 +17,7 @@ typedef struct _stmt *stmt_ty;
 typedef struct _expr *expr_ty;
 
 typedef enum _expr_context { Load=1, Store=2, Del=3, AugLoad=4, AugStore=5,
-                             Param=6 } expr_context_ty;
+                             Param=6, NamedStore=7 } expr_context_ty;
 
 typedef struct _slice *slice_ty;
 
@@ -214,14 +214,14 @@ struct _stmt {
     int end_col_offset;
 };
 
-enum _expr_kind {BoolOp_kind=1, BinOp_kind=2, UnaryOp_kind=3, Lambda_kind=4,
-                  IfExp_kind=5, Dict_kind=6, Set_kind=7, ListComp_kind=8,
-                  SetComp_kind=9, DictComp_kind=10, GeneratorExp_kind=11,
-                  Await_kind=12, Yield_kind=13, YieldFrom_kind=14,
-                  Compare_kind=15, Call_kind=16, FormattedValue_kind=17,
-                  JoinedStr_kind=18, Constant_kind=19, Attribute_kind=20,
-                  Subscript_kind=21, Starred_kind=22, Name_kind=23,
-                  List_kind=24, Tuple_kind=25};
+enum _expr_kind {BoolOp_kind=1, NamedExpr_kind=2, BinOp_kind=3, UnaryOp_kind=4,
+                  Lambda_kind=5, IfExp_kind=6, Dict_kind=7, Set_kind=8,
+                  ListComp_kind=9, SetComp_kind=10, DictComp_kind=11,
+                  GeneratorExp_kind=12, Await_kind=13, Yield_kind=14,
+                  YieldFrom_kind=15, Compare_kind=16, Call_kind=17,
+                  FormattedValue_kind=18, JoinedStr_kind=19, Constant_kind=20,
+                  Attribute_kind=21, Subscript_kind=22, Starred_kind=23,
+                  Name_kind=24, List_kind=25, Tuple_kind=26};
 struct _expr {
     enum _expr_kind kind;
     union {
@@ -230,6 +230,11 @@ struct _expr {
             asdl_seq *values;
         } BoolOp;
 
+        struct {
+            expr_ty target;
+            expr_ty value;
+        } NamedExpr;
+
         struct {
             expr_ty left;
             operator_ty op;
@@ -541,6 +546,10 @@ stmt_ty _Py_Continue(int lineno, int col_offset, int end_lineno, int
 #define BoolOp(a0, a1, a2, a3, a4, a5, a6) _Py_BoolOp(a0, a1, a2, a3, a4, a5, a6)
 expr_ty _Py_BoolOp(boolop_ty op, asdl_seq * values, int lineno, int col_offset,
                    int end_lineno, int end_col_offset, PyArena *arena);
+#define NamedExpr(a0, a1, a2, a3, a4, a5, a6) _Py_NamedExpr(a0, a1, a2, a3, a4, a5, a6)
+expr_ty _Py_NamedExpr(expr_ty target, expr_ty value, int lineno, int
+                      col_offset, int end_lineno, int end_col_offset, PyArena
+                      *arena);
 #define BinOp(a0, a1, a2, a3, a4, a5, a6, a7) _Py_BinOp(a0, a1, a2, a3, a4, a5, a6, a7)
 expr_ty _Py_BinOp(expr_ty left, operator_ty op, expr_ty right, int lineno, int
                   col_offset, int end_lineno, int end_col_offset, PyArena
diff --git a/Include/graminit.h b/Include/graminit.h
index bdfe821ad716..e3acff8a1e83 100644
--- a/Include/graminit.h
+++ b/Include/graminit.h
@@ -49,41 +49,42 @@
 #define with_item 302
 #define except_clause 303
 #define suite 304
-#define test 305
-#define test_nocond 306
-#define lambdef 307
-#define lambdef_nocond 308
-#define or_test 309
-#define and_test 310
-#define not_test 311
-#define comparison 312
-#define comp_op 313
-#define star_expr 314
-#define expr 315
-#define xor_expr 316
-#define and_expr 317
-#define shift_expr 318
-#define arith_expr 319
-#define term 320
-#define factor 321
-#define power 322
-#define atom_expr 323
-#define atom 324
-#define testlist_comp 325
-#define trailer 326
-#define subscriptlist 327
-#define subscript 328
-#define sliceop 329
-#define exprlist 330
-#define testlist 331
-#define dictorsetmaker 332
-#define classdef 333
-#define arglist 334
-#define argument 335
-#define comp_iter 336
-#define sync_comp_for 337
-#define comp_for 338
-#define comp_if 339
-#define encoding_decl 340
-#define yield_expr 341
-#define yield_arg 342
+#define namedexpr_test 305
+#define test 306
+#define test_nocond 307
+#define lambdef 308
+#define lambdef_nocond 309
+#define or_test 310
+#define and_test 311
+#define not_test 312
+#define comparison 313
+#define comp_op 314
+#define star_expr 315
+#define expr 316
+#define xor_expr 317
+#define and_expr 318
+#define shift_expr 319
+#define arith_expr 320
+#define term 321
+#define factor 322
+#define power 323
+#define atom_expr 324
+#define atom 325
+#define testlist_comp 326
+#define trailer 327
+#define subscriptlist 328
+#define subscript 329
+#define sliceop 330
+#define exprlist 331
+#define testlist 332
+#define dictorsetmaker 333
+#define classdef 334
+#define arglist 335
+#define argument 336
+#define comp_iter 337
+#define sync_comp_for 338
+#define comp_for 339
+#define comp_if 340
+#define encoding_decl 341
+#define yield_expr 342
+#define yield_arg 343
diff --git a/Include/pyerrors.h b/Include/pyerrors.h
index efe1c49d2d08..5c6751868df4 100644
--- a/Include/pyerrors.h
+++ b/Include/pyerrors.h
@@ -108,6 +108,7 @@ PyAPI_DATA(PyObject *) PyExc_NotImplementedError;
 PyAPI_DATA(PyObject *) PyExc_SyntaxError;
 PyAPI_DATA(PyObject *) PyExc_IndentationError;
 PyAPI_DATA(PyObject *) PyExc_TabError;
+PyAPI_DATA(PyObject *) PyExc_TargetScopeError;
 PyAPI_DATA(PyObject *) PyExc_ReferenceError;
 PyAPI_DATA(PyObject *) PyExc_SystemError;
 PyAPI_DATA(PyObject *) PyExc_SystemExit;
diff --git a/Include/symtable.h b/Include/symtable.h
index 949022bd6630..9392e6438779 100644
--- a/Include/symtable.h
+++ b/Include/symtable.h
@@ -50,6 +50,7 @@ typedef struct _symtable_entry {
                                      including free refs to globals */
     unsigned ste_generator : 1;   /* true if namespace is a generator */
     unsigned ste_coroutine : 1;   /* true if namespace is a coroutine */
+    unsigned ste_comprehension : 1; /* true if namespace is a list comprehension */
     unsigned ste_varargs : 1;     /* true if block has varargs */
     unsigned ste_varkeywords : 1; /* true if block has varkeywords */
     unsigned ste_returns_value : 1;  /* true if namespace uses return with
diff --git a/Include/token.h b/Include/token.h
index 2d491e6927d1..b87b84cd966d 100644
--- a/Include/token.h
+++ b/Include/token.h
@@ -63,9 +63,10 @@ extern "C" {
 #define ATEQUAL         50
 #define RARROW          51
 #define ELLIPSIS        52
-#define OP              53
-#define ERRORTOKEN      54
-#define N_TOKENS        58
+#define COLONEQUAL      53
+#define OP              54
+#define ERRORTOKEN      55
+#define N_TOKENS        59
 #define NT_OFFSET       256
 
 /* Special definitions for cooperation with parser */
diff --git a/Lib/_compat_pickle.py b/Lib/_compat_pickle.py
index f68496ae639f..8bb1cf80afa5 100644
--- a/Lib/_compat_pickle.py
+++ b/Lib/_compat_pickle.py
@@ -128,6 +128,7 @@
     "SystemError",
     "SystemExit",
     "TabError",
+    "TargetScopeError",
     "TypeError",
     "UnboundLocalError",
     "UnicodeDecodeError",
diff --git a/Lib/symbol.py b/Lib/symbol.py
index 40d0ed1e0355..b3fa08984d87 100644
--- a/Lib/symbol.py
+++ b/Lib/symbol.py
@@ -61,44 +61,45 @@
 with_item = 302
 except_clause = 303
 suite = 304
-test = 305
-test_nocond = 306
-lambdef = 307
-lambdef_nocond = 308
-or_test = 309
-and_test = 310
-not_test = 311
-comparison = 312
-comp_op = 313
-star_expr = 314
-expr = 315
-xor_expr = 316
-and_expr = 317
-shift_expr = 318
-arith_expr = 319
-term = 320
-factor = 321
-power = 322
-atom_expr = 323
-atom = 324
-testlist_comp = 325
-trailer = 326
-subscriptlist = 327
-subscript = 328
-sliceop = 329
-exprlist = 330
-testlist = 331
-dictorsetmaker = 332
-classdef = 333
-arglist = 334
-argument = 335
-comp_iter = 336
-sync_comp_for = 337
-comp_for = 338
-comp_if = 339
-encoding_decl = 340
-yield_expr = 341
-yield_arg = 342
+namedexpr_test = 305
+test = 306
+test_nocond = 307
+lambdef = 308
+lambdef_nocond = 309
+or_test = 310
+and_test = 311
+not_test = 312
+comparison = 313
+comp_op = 314
+star_expr = 315
+expr = 316
+xor_expr = 317
+and_expr = 318
+shift_expr = 319
+arith_expr = 320
+term = 321
+factor = 322
+power = 323
+atom_expr = 324
+atom = 325
+testlist_comp = 326
+trailer = 327
+subscriptlist = 328
+subscript = 329
+sliceop = 330
+exprlist = 331
+testlist = 332
+dictorsetmaker = 333
+classdef = 334
+arglist = 335
+argument = 336
+comp_iter = 337
+sync_comp_for = 338
+comp_for = 339
+comp_if = 340
+encoding_decl = 341
+yield_expr = 342
+yield_arg = 343
 #--end constants--
 
 sym_name = {}
diff --git a/Lib/test/exception_hierarchy.txt b/Lib/test/exception_hierarchy.txt
index 763a6c899b48..15f4491cf237 100644
--- a/Lib/test/exception_hierarchy.txt
+++ b/Lib/test/exception_hierarchy.txt
@@ -42,6 +42,7 @@ BaseException
       |    +-- NotImplementedError
       |    +-- RecursionError
       +-- SyntaxError
+      |    +-- TargetScopeError
       |    +-- IndentationError
       |         +-- TabError
       +-- SystemError
diff --git a/Lib/test/test_named_expressions.py b/Lib/test/test_named_expressions.py
new file mode 100644
index 000000000000..e49fd7de20de
--- /dev/null
+++ b/Lib/test/test_named_expressions.py
@@ -0,0 +1,415 @@
+import os
+import unittest
+
+
+class NamedExpressionInvalidTest(unittest.TestCase):
+
+    def test_named_expression_invalid_01(self):
+        code = """x := 0"""
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_02(self):
+        code = """x = y := 0"""
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_03(self):
+        code = """y := f(x)"""
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_04(self):
+        code = """y0 = y1 := f(x)"""
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_06(self):
+        code = """((a, b) := (1, 2))"""
+
+        with self.assertRaisesRegex(SyntaxError, "cannot use named assignment with tuple"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_07(self):
+        code = """def spam(a = b := 42): pass"""
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_08(self):
+        code = """def spam(a: b := 42 = 5): pass"""
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_09(self):
+        code = """spam(a=b := 'c')"""
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_10(self):
+        code = """spam(x = y := f(x))"""
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_11(self):
+        code = """spam(a=1, b := 2)"""
+
+        with self.assertRaisesRegex(SyntaxError,
+            "positional argument follows keyword argument"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_12(self):
+        code = """spam(a=1, (b := 2))"""
+
+        with self.assertRaisesRegex(SyntaxError,
+            "positional argument follows keyword argument"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_13(self):
+        code = """spam(a=1, (b := 2))"""
+
+        with self.assertRaisesRegex(SyntaxError,
+            "positional argument follows keyword argument"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_14(self):
+        code = """(x := lambda: y := 1)"""
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_15(self):
+        code = """(lambda: x := 1)"""
+
+        with self.assertRaisesRegex(SyntaxError,
+            "cannot use named assignment with lambda"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_16(self):
+        code = "[i + 1 for i in i := [1,2]]"
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_17(self):
+        code = "[i := 0, j := 1 for i, j in [(1, 2), (3, 4)]]"
+
+        with self.assertRaisesRegex(SyntaxError, "invalid syntax"):
+            exec(code, {}, {})
+
+    def test_named_expression_invalid_18(self):
+        code = """class Foo():
+            [(42, 1 + ((( j := i )))) for i in range(5)]
+        """
+
+        with self.assertRaisesRegex(TargetScopeError,
+            "named expression within a comprehension cannot be used in a class body"):
+            exec(code, {}, {})
+
+
+class NamedExpressionAssignmentTest(unittest.TestCase):
+
+    def test_named_expression_assignment_01(self):
+        (a := 10)
+
+        self.assertEqual(a, 10)
+
+    def test_named_expression_assignment_02(self):
+        a = 20
+        (a := a)
+
+        self.assertEqual(a, 20)
+
+    def test_named_expression_assignment_03(self):
+        (total := 1 + 2)
+
+        self.assertEqual(total, 3)
+
+    def test_named_expression_assignment_04(self):
+        (info := (1, 2, 3))
+
+        self.assertEqual(info, (1, 2, 3))
+
+    def test_named_expression_assignment_05(self):
+        (x := 1, 2)
+
+        self.assertEqual(x, 1)
+
+    def test_named_expression_assignment_06(self):
+        (z := (y := (x := 0)))
+
+        self.assertEqual(x, 0)
+        self.assertEqual(y, 0)
+        self.assertEqual(z, 0)
+
+    def test_named_expression_assignment_07(self):
+        (loc := (1, 2))
+
+        self.assertEqual(loc, (1, 2))
+
+    def test_named_expression_assignment_08(self):
+        if spam := "eggs":
+            self.assertEqual(spam, "eggs")
+        else: self.fail("variable was not assigned using named expression")
+
+    def test_named_expression_assignment_09(self):
+        if True and (spam := True):
+            self.assertTrue(spam)
+        else: self.fail("variable was not assigned using named expression")
+
+    def test_named_expression_assignment_10(self):
+        if (match := 10) is 10:
+            pass
+        else: self.fail("variable was not assigned using named expression")
+
+    def test_named_expression_assignment_11(self):
+        def spam(a):
+            return a
+        input_data = [1, 2, 3]
+        res = [(x, y, x/y) for x in input_data if (y := spam(x)) > 0]
+
+        self.assertEqual(res, [(1, 1, 1.0), (2, 2, 1.0), (3, 3, 1.0)])
+
+    def test_named_expression_assignment_12(self):
+        def spam(a):
+            return a
+        res = [[y := spam(x), x/y] for x in range(1, 5)]
+
+        self.assertEqual(res, [[1, 1.0], [2, 1.0], [3, 1.0], [4, 1.0]])
+
+    def test_named_expression_assignment_13(self):
+        length = len(lines := [1, 2])
+
+        self.assertEqual(length, 2)
+        self.assertEqual(lines, [1,2])
+
+    def test_named_expression_assignment_14(self):
+        """
+        Where all variables are positive integers, and a is at least as large
+        as the n'th root of x, this algorithm returns the floor of the n'th
+        root of x (and roughly doubling the number of accurate bits per
+        iteration)::
+        """
+        a = 9
+        n = 2
+        x = 3
+
+        while a > (d := x // a**(n-1)):
+            a = ((n-1)*a + d) // n
+
+        self.assertEqual(a, 1)
+
+
+class NamedExpressionScopeTest(unittest.TestCase):
+
+    def test_named_expression_scope_01(self):
+        code = """def spam():
+    (a := 5)
+print(a)"""
+
+        with self.assertRaisesRegex(NameError, "name 'a' is not defined"):
+            exec(code, {}, {})
+
+    def test_named_expression_scope_02(self):
+        total = 0
+        partial_sums = [total := total + v for v in range(5)]
+
+        self.assertEqual(partial_sums, [0, 1, 3, 6, 10])
+        self.assertEqual(total, 10)
+
+    def test_named_expression_scope_03(self):
+        containsOne = any((lastNum := num) == 1 for num in [1, 2, 3])
+
+        self.assertTrue(containsOne)
+        self.assertEqual(lastNum, 1)
+
+    def test_named_expression_scope_04(self):
+        def spam(a):
+            return a
+        res = [[y := spam(x), x/y] for x in range(1, 5)]
+
+        self.assertEqual(y, 4)
+
+    def test_named_expression_scope_05(self):
+        def spam(a):
+            return a
+        input_data = [1, 2, 3]
+        res = [(x, y, x/y) for x in input_data if (y := spam(x)) > 0]
+
+        self.assertEqual(res, [(1, 1, 1.0), (2, 2, 1.0), (3, 3, 1.0)])
+        self.assertEqual(y, 3)
+
+    def test_named_expression_scope_06(self):
+        res = [[spam := i for i in range(3)] for j in range(2)]
+
+        self.assertEqual(res, [[0, 1, 2], [0, 1, 2]])
+        self.assertEqual(spam, 2)
+
+    def test_named_expression_scope_07(self):
+        len(lines := [1, 2])
+
+        self.assertEqual(lines, [1, 2])
+
+    def test_named_expression_scope_08(self):
+        def spam(a):
+            return a
+
+        def eggs(b):
+            return b * 2
+
+        res = [spam(a := eggs(b := h)) for h in range(2)]
+
+        self.assertEqual(res, [0, 2])
+        self.assertEqual(a, 2)
+        self.assertEqual(b, 1)
+
+    def test_named_expression_scope_09(self):
+        def spam(a):
+            return a
+
+        def eggs(b):
+            return b * 2
+
+        res = [spam(a := eggs(a := h)) for h in range(2)]
+
+        self.assertEqual(res, [0, 2])
+        self.assertEqual(a, 2)
+
+    def test_named_expression_scope_10(self):
+        res = [b := [a := 1 for i in range(2)] for j in range(2)]
+
+        self.assertEqual(res, [[1, 1], [1, 1]])
+        self.assertEqual(a, 1)
+        self.assertEqual(b, [1, 1])
+
+    def test_named_expression_scope_11(self):
+        res = [j := i for i in range(5)]
+
+        self.assertEqual(res, [0, 1, 2, 3, 4])
+        self.assertEqual(j, 4)
+
+    def test_named_expression_scope_12(self):
+        res = [i := i for i in range(5)]
+
+        self.assertEqual(res, [0, 1, 2, 3, 4])
+        self.assertEqual(i, 4)
+
+    def test_named_expression_scope_13(self):
+        res = [i := 0 for i, j in [(1, 2), (3, 4)]]
+
+        self.assertEqual(res, [0, 0])
+        self.assertEqual(i, 0)
+
+    def test_named_expression_scope_14(self):
+        res = [(i := 0, j := 1) for i, j in [(1, 2), (3, 4)]]
+
+        self.assertEqual(res, [(0, 1), (0, 1)])
+        self.assertEqual(i, 0)
+        self.assertEqual(j, 1)
+
+    def test_named_expression_scope_15(self):
+        res = [(i := i, j := j) for i, j in [(1, 2), (3, 4)]]
+
+        self.assertEqual(res, [(1, 2), (3, 4)])
+        self.assertEqual(i, 3)
+        self.assertEqual(j, 4)
+
+    def test_named_expression_scope_16(self):
+        res = [(i := j, j := i) for i, j in [(1, 2), (3, 4)]]
+
+        self.assertEqual(res, [(2, 2), (4, 4)])
+        self.assertEqual(i, 4)
+        self.assertEqual(j, 4)
+
+    def test_named_expression_scope_17(self):
+        b = 0
+        res = [b := i + b for i in range(5)]
+
+        self.assertEqual(res, [0, 1, 3, 6, 10])
+        self.assertEqual(b, 10)
+
+    def test_named_expression_scope_18(self):
+        def spam(a):
+            return a
+
+        res = spam(b := 2)
+
+        self.assertEqual(res, 2)
+        self.assertEqual(b, 2)
+
+    def test_named_expression_scope_19(self):
+        def spam(a):
+            return a
+
+        res = spam((b := 2))
+
+        self.assertEqual(res, 2)
+        self.assertEqual(b, 2)
+
+    def test_named_expression_scope_20(self):
+        def spam(a):
+            return a
+
+        res = spam(a=(b := 2))
+
+        self.assertEqual(res, 2)
+        self.assertEqual(b, 2)
+
+    def test_named_expression_scope_21(self):
+        def spam(a, b):
+            return a + b
+
+        res = spam(c := 2, b=1)
+
+        self.assertEqual(res, 3)
+        self.assertEqual(c, 2)
+
+    def test_named_expression_scope_22(self):
+        def spam(a, b):
+            return a + b
+
+        res = spam((c := 2), b=1)
+
+        self.assertEqual(res, 3)
+        self.assertEqual(c, 2)
+
+    def test_named_expression_scope_23(self):
+        def spam(a, b):
+            return a + b
+
+        res = spam(b=(c := 2), a=1)
+
+        self.assertEqual(res, 3)
+        self.assertEqual(c, 2)
+
+    def test_named_expression_scope_24(self):
+        a = 10
+        def spam():
+            nonlocal a
+            (a := 20)
+        spam()
+
+        self.assertEqual(a, 20)
+
+    def test_named_expression_scope_25(self):
+        ns = {}
+        code = """a = 10
+def spam():
+    global a
+    (a := 20)
+spam()"""
+
+        exec(code, ns, {})
+
+        self.assertEqual(ns["a"], 20)
+
+
+if __name__ == "__main__":
+    unittest.main()
diff --git a/Lib/test/test_parser.py b/Lib/test/test_parser.py
index 9b58bb93c81c..ac3899baedb6 100644
--- a/Lib/test/test_parser.py
+++ b/Lib/test/test_parser.py
@@ -115,6 +115,7 @@ def test_expressions(self):
         self.check_expr("foo * bar")
         self.check_expr("foo / bar")
         self.check_expr("foo // bar")
+        self.check_expr("(foo := 1)")
         self.check_expr("lambda: 0")
         self.check_expr("lambda x: 0")
         self.check_expr("lambda *y: 0")
@@ -421,6 +422,40 @@ def test_dict_comprehensions(self):
         self.check_expr('{x**2:x[3] for x in seq if condition(x)}')
         self.check_expr('{x:x for x in seq1 for y in seq2 if condition(x, y)}')
 
+    def test_named_expressions(self):
+        self.check_suite("(a := 1)")
+        self.check_suite("(a := a)")
+        self.check_suite("if (match := pattern.search(data)) is None: pass")
+        self.check_suite("[y := f(x), y**2, y**3]")
+        self.check_suite("filtered_data = [y for x in data if (y := f(x)) is None]")
+        self.check_suite("(y := f(x))")
+        self.check_suite("y0 = (y1 := f(x))")
+        self.check_suite("foo(x=(y := f(x)))")
+        self.check_suite("def foo(answer=(p := 42)): pass")
+        self.check_suite("def foo(answer: (p := 42) = 5): pass")
+        self.check_suite("lambda: (x := 1)")
+        self.check_suite("(x := lambda: 1)")
+        self.check_suite("(x := lambda: (y := 1))")  # not in PEP
+        self.check_suite("lambda line: (m := re.match(pattern, line)) and m.group(1)")
+        self.check_suite("x = (y := 0)")
+        self.check_suite("(z:=(y:=(x:=0)))")
+        self.check_suite("(info := (name, phone, *rest))")
+        self.check_suite("(x:=1,2)")
+        self.check_suite("(total := total + tax)")
+        self.check_suite("len(lines := f.readlines())")
+        self.check_suite("foo(x := 3, cat='vector')")
+        self.check_suite("foo(cat=(category := 'vector'))")
+        self.check_suite("if any(len(longline := l) >= 100 for l in lines): print(longline)")
+        self.check_suite(
+            "if env_base := os.environ.get('PYTHONUSERBASE', None): return env_base"
+        )
+        self.check_suite(
+            "if self._is_special and (ans := self._check_nans(context=context)): return ans"
+        )
+        self.check_suite("foo(b := 2, a=1)")
+        self.check_suite("foo(b := 2, a=1)")
+        self.check_suite("foo((b := 2), a=1)")
+        self.check_suite("foo(c=(b := 2), a=1)")
 
 #
 #  Second, we take *invalid* trees and make sure we get ParserError
@@ -694,16 +729,16 @@ def test_missing_import_source(self):
     def test_illegal_encoding(self):
         # Illegal encoding declaration
         tree = \
-            (340,
+            (341,
              (257, (0, '')))
         self.check_bad_tree(tree, "missed encoding")
         tree = \
-            (340,
+            (341,
              (257, (0, '')),
               b'iso-8859-1')
         self.check_bad_tree(tree, "non-string encoding")
         tree = \
-            (340,
+            (341,
              (257, (0, '')),
               '\udcff')
         with self.assertRaises(UnicodeEncodeError):
@@ -776,8 +811,9 @@ def _nested_expression(self, level):
         return "["*level+"]"*level
 
     def test_deeply_nested_list(self):
-        # XXX used to be 99 levels in 2.x
-        e = self._nested_expression(93)
+        # This has fluctuated between 99 levels in 2.x, down to 93 levels in
+        # 3.7.X and back up to 99 in 3.8.X. Related to MAXSTACK size in Parser.h
+        e = self._nested_expression(99)
         st = parser.expr(e)
         st.compile()
 
diff --git a/Lib/test/test_tokenize.py b/Lib/test/test_tokenize.py
index 04a12542c6ae..4c90092893a2 100644
--- a/Lib/test/test_tokenize.py
+++ b/Lib/test/test_tokenize.py
@@ -1429,6 +1429,7 @@ def test_exact_type(self):
         self.assertExactTypeEqual('**=', token.DOUBLESTAREQUAL)
         self.assertExactTypeEqual('//', token.DOUBLESLASH)
         self.assertExactTypeEqual('//=', token.DOUBLESLASHEQUAL)
+        self.assertExactTypeEqual(':=', token.COLONEQUAL)
         self.assertExactTypeEqual('...', token.ELLIPSIS)
         self.assertExactTypeEqual('->', token.RARROW)
         self.assertExactTypeEqual('@', token.AT)
diff --git a/Lib/token.py b/Lib/token.py
index 5af7e6b91eac..7224eca32fe0 100644
--- a/Lib/token.py
+++ b/Lib/token.py
@@ -56,13 +56,14 @@
 ATEQUAL = 50
 RARROW = 51
 ELLIPSIS = 52
-OP = 53
+COLONEQUAL = 53
+OP = 54
 # These aren't used by the C tokenizer but are needed for tokenize.py
-ERRORTOKEN = 54
-COMMENT = 55
-NL = 56
-ENCODING = 57
-N_TOKENS = 58
+ERRORTOKEN = 55
+COMMENT = 56
+NL = 57
+ENCODING = 58
+N_TOKENS = 59
 # Special definitions for cooperation with parser
 NT_OFFSET = 256
 
@@ -96,6 +97,7 @@
     '//=': DOUBLESLASHEQUAL,
     '/=': SLASHEQUAL,
     ':': COLON,
+    ':=': COLONEQUAL,
     ';': SEMI,
     '<': LESS,
     '<<': LEFTSHIFT,
diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-11-13-14-26-54.bpo-35224.F0B6UQ.rst b/Misc/NEWS.d/next/Core and Builtins/2018-11-13-14-26-54.bpo-35224.F0B6UQ.rst
new file mode 100644
index 000000000000..fe54f362aef6
--- /dev/null
+++ b/Misc/NEWS.d/next/Core and Builtins/2018-11-13-14-26-54.bpo-35224.F0B6UQ.rst	
@@ -0,0 +1 @@
+Implement :pep:`572` (assignment expressions). Patch by Emily Morehouse.
diff --git a/Objects/exceptions.c b/Objects/exceptions.c
index 35e1df3ca1fa..75ede1c9c5de 100644
--- a/Objects/exceptions.c
+++ b/Objects/exceptions.c
@@ -1520,6 +1520,13 @@ MiddlingExtendsException(PyExc_SyntaxError, IndentationError, SyntaxError,
                          "Improper indentation.");
 
 
+/*
+ *    TargetScopeError extends SyntaxError
+ */
+MiddlingExtendsException(PyExc_SyntaxError, TargetScopeError, SyntaxError,
+                         "Improper scope target.");
+
+
 /*
  *    TabError extends IndentationError
  */
@@ -2531,6 +2538,7 @@ _PyExc_Init(void)
     PRE_INIT(AttributeError);
     PRE_INIT(SyntaxError);
     PRE_INIT(IndentationError);
+    PRE_INIT(TargetScopeError);
     PRE_INIT(TabError);
     PRE_INIT(LookupError);
     PRE_INIT(IndexError);
@@ -2671,6 +2679,7 @@ _PyBuiltins_AddExceptions(PyObject *bltinmod)
     POST_INIT(AttributeError);
     POST_INIT(SyntaxError);
     POST_INIT(IndentationError);
+    POST_INIT(TargetScopeError);
     POST_INIT(TabError);
     POST_INIT(LookupError);
     POST_INIT(IndexError);
diff --git a/PC/python3.def b/PC/python3.def
index 5d93c18af87e..e317864d0cd8 100644
--- a/PC/python3.def
+++ b/PC/python3.def
@@ -235,6 +235,7 @@ EXPORTS
   PyExc_SystemError=python38.PyExc_SystemError DATA
   PyExc_SystemExit=python38.PyExc_SystemExit DATA
   PyExc_TabError=python38.PyExc_TabError DATA
+  PyExc_TargetScopeError=python38.PyExc_TargetScopeError DATA
   PyExc_TimeoutError=python38.PyExc_TimeoutError DATA
   PyExc_TypeError=python38.PyExc_TypeError DATA
   PyExc_UnboundLocalError=python38.PyExc_UnboundLocalError DATA
diff --git a/Parser/Python.asdl b/Parser/Python.asdl
index cedf37a2d9f9..7b2a8737ab80 100644
--- a/Parser/Python.asdl
+++ b/Parser/Python.asdl
@@ -54,6 +54,7 @@ module Python
 
           -- BoolOp() can use left & right?
     expr = BoolOp(boolop op, expr* values)
+         | NamedExpr(expr target, expr value)
          | BinOp(expr left, operator op, expr right)
          | UnaryOp(unaryop op, expr operand)
          | Lambda(arguments args, expr body)
@@ -87,7 +88,7 @@ module Python
           -- col_offset is the byte offset in the utf8 string the parser uses
           attributes (int lineno, int col_offset, int? end_lineno, int? end_col_offset)
 
-    expr_context = Load | Store | Del | AugLoad | AugStore | Param
+    expr_context = Load | Store | Del | AugLoad | AugStore | Param | NamedStore
 
     slice = Slice(expr? lower, expr? upper, expr? step)
           | ExtSlice(slice* dims)
diff --git a/Parser/parser.h b/Parser/parser.h
index 95cd39d209dd..aee1c86cb044 100644
--- a/Parser/parser.h
+++ b/Parser/parser.h
@@ -7,7 +7,7 @@ extern "C" {
 
 /* Parser interface */
 
-#define MAXSTACK 1500
+#define MAXSTACK 1700
 
 typedef struct {
     int              s_state;       /* State in current DFA */
diff --git a/Parser/token.c b/Parser/token.c
index 35519aa4b611..d27f98a34d55 100644
--- a/Parser/token.c
+++ b/Parser/token.c
@@ -59,6 +59,7 @@ const char * const _PyParser_TokenNames[] = {
     "ATEQUAL",
     "RARROW",
     "ELLIPSIS",
+    "COLONEQUAL",
     "OP",
     "<ERRORTOKEN>",
     "<COMMENT>",
@@ -142,6 +143,11 @@ PyToken_TwoChars(int c1, int c2)
         case '=': return SLASHEQUAL;
         }
         break;
+    case ':':
+        switch (c2) {
+        case '=': return COLONEQUAL;
+        }
+        break;
     case '<':
         switch (c2) {
         case '<': return LEFTSHIFT;
diff --git a/Python/Python-ast.c b/Python/Python-ast.c
index e6c5bfe9b29e..a333ff95b110 100644
--- a/Python/Python-ast.c
+++ b/Python/Python-ast.c
@@ -203,6 +203,11 @@ static char *BoolOp_fields[]={
     "op",
     "values",
 };
+static PyTypeObject *NamedExpr_type;
+static char *NamedExpr_fields[]={
+    "target",
+    "value",
+};
 static PyTypeObject *BinOp_type;
 _Py_IDENTIFIER(left);
 _Py_IDENTIFIER(right);
@@ -344,7 +349,8 @@ static char *Tuple_fields[]={
 };
 static PyTypeObject *expr_context_type;
 static PyObject *Load_singleton, *Store_singleton, *Del_singleton,
-*AugLoad_singleton, *AugStore_singleton, *Param_singleton;
+*AugLoad_singleton, *AugStore_singleton, *Param_singleton,
+*NamedStore_singleton;
 static PyObject* ast2obj_expr_context(expr_context_ty);
 static PyTypeObject *Load_type;
 static PyTypeObject *Store_type;
@@ -352,6 +358,7 @@ static PyTypeObject *Del_type;
 static PyTypeObject *AugLoad_type;
 static PyTypeObject *AugStore_type;
 static PyTypeObject *Param_type;
+static PyTypeObject *NamedStore_type;
 static PyTypeObject *slice_type;
 static PyObject* ast2obj_slice(void*);
 static PyTypeObject *Slice_type;
@@ -872,6 +879,8 @@ static int init_types(void)
     if (!add_attributes(expr_type, expr_attributes, 4)) return 0;
     BoolOp_type = make_type("BoolOp", expr_type, BoolOp_fields, 2);
     if (!BoolOp_type) return 0;
+    NamedExpr_type = make_type("NamedExpr", expr_type, NamedExpr_fields, 2);
+    if (!NamedExpr_type) return 0;
     BinOp_type = make_type("BinOp", expr_type, BinOp_fields, 3);
     if (!BinOp_type) return 0;
     UnaryOp_type = make_type("UnaryOp", expr_type, UnaryOp_fields, 2);
@@ -949,6 +958,10 @@ static int init_types(void)
     if (!Param_type) return 0;
     Param_singleton = PyType_GenericNew(Param_type, NULL, NULL);
     if (!Param_singleton) return 0;
+    NamedStore_type = make_type("NamedStore", expr_context_type, NULL, 0);
+    if (!NamedStore_type) return 0;
+    NamedStore_singleton = PyType_GenericNew(NamedStore_type, NULL, NULL);
+    if (!NamedStore_singleton) return 0;
     slice_type = make_type("slice", &AST_type, NULL, 0);
     if (!slice_type) return 0;
     if (!add_attributes(slice_type, NULL, 0)) return 0;
@@ -1772,6 +1785,34 @@ BoolOp(boolop_ty op, asdl_seq * values, int lineno, int col_offset, int
     return p;
 }
 
+expr_ty
+NamedExpr(expr_ty target, expr_ty value, int lineno, int col_offset, int
+          end_lineno, int end_col_offset, PyArena *arena)
+{
+    expr_ty p;
+    if (!target) {
+        PyErr_SetString(PyExc_ValueError,
+                        "field target is required for NamedExpr");
+        return NULL;
+    }
+    if (!value) {
+        PyErr_SetString(PyExc_ValueError,
+                        "field value is required for NamedExpr");
+        return NULL;
+    }
+    p = (expr_ty)PyArena_Malloc(arena, sizeof(*p));
+    if (!p)
+        return NULL;
+    p->kind = NamedExpr_kind;
+    p->v.NamedExpr.target = target;
+    p->v.NamedExpr.value = value;
+    p->lineno = lineno;
+    p->col_offset = col_offset;
+    p->end_lineno = end_lineno;
+    p->end_col_offset = end_col_offset;
+    return p;
+}
+
 expr_ty
 BinOp(expr_ty left, operator_ty op, expr_ty right, int lineno, int col_offset,
       int end_lineno, int end_col_offset, PyArena *arena)
@@ -3062,6 +3103,20 @@ ast2obj_expr(void* _o)
             goto failed;
         Py_DECREF(value);
         break;
+    case NamedExpr_kind:
+        result = PyType_GenericNew(NamedExpr_type, NULL, NULL);
+        if (!result) goto failed;
+        value = ast2obj_expr(o->v.NamedExpr.target);
+        if (!value) goto failed;
+        if (_PyObject_SetAttrId(result, &PyId_target, value) == -1)
+            goto failed;
+        Py_DECREF(value);
+        value = ast2obj_expr(o->v.NamedExpr.value);
+        if (!value) goto failed;
+        if (_PyObject_SetAttrId(result, &PyId_value, value) == -1)
+            goto failed;
+        Py_DECREF(value);
+        break;
     case BinOp_kind:
         result = PyType_GenericNew(BinOp_type, NULL, NULL);
         if (!result) goto failed;
@@ -3464,6 +3519,9 @@ PyObject* ast2obj_expr_context(expr_context_ty o)
         case Param:
             Py_INCREF(Param_singleton);
             return Param_singleton;
+        case NamedStore:
+            Py_INCREF(NamedStore_singleton);
+            return NamedStore_singleton;
         default:
             /* should never happen, but just in case ... */
             PyErr_Format(PyExc_SystemError, "unknown expr_context found");
@@ -5895,6 +5953,45 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena)
         if (*out == NULL) goto failed;
         return 0;
     }
+    isinstance = PyObject_IsInstance(obj, (PyObject*)NamedExpr_type);
+    if (isinstance == -1) {
+        return 1;
+    }
+    if (isinstance) {
+        expr_ty target;
+        expr_ty value;
+
+        if (_PyObject_LookupAttrId(obj, &PyId_target, &tmp) < 0) {
+            return 1;
+        }
+        if (tmp == NULL) {
+            PyErr_SetString(PyExc_TypeError, "required field \"target\" missing from NamedExpr");
+            return 1;
+        }
+        else {
+            int res;
+            res = obj2ast_expr(tmp, &target, arena);
+            if (res != 0) goto failed;
+            Py_CLEAR(tmp);
+        }
+        if (_PyObject_LookupAttrId(obj, &PyId_value, &tmp) < 0) {
+            return 1;
+        }
+        if (tmp == NULL) {
+            PyErr_SetString(PyExc_TypeError, "required field \"value\" missing from NamedExpr");
+            return 1;
+        }
+        else {
+            int res;
+            res = obj2ast_expr(tmp, &value, arena);
+            if (res != 0) goto failed;
+            Py_CLEAR(tmp);
+        }
+        *out = NamedExpr(target, value, lineno, col_offset, end_lineno,
+                         end_col_offset, arena);
+        if (*out == NULL) goto failed;
+        return 0;
+    }
     isinstance = PyObject_IsInstance(obj, (PyObject*)BinOp_type);
     if (isinstance == -1) {
         return 1;
@@ -7156,6 +7253,14 @@ obj2ast_expr_context(PyObject* obj, expr_context_ty* out, PyArena* arena)
         *out = Param;
         return 0;
     }
+    isinstance = PyObject_IsInstance(obj, (PyObject *)NamedStore_type);
+    if (isinstance == -1) {
+        return 1;
+    }
+    if (isinstance) {
+        *out = NamedStore;
+        return 0;
+    }
 
     PyErr_Format(PyExc_TypeError, "expected some sort of expr_context, but got %R", obj);
     return 1;
@@ -8251,6 +8356,8 @@ PyInit__ast(void)
     if (PyDict_SetItemString(d, "expr", (PyObject*)expr_type) < 0) return NULL;
     if (PyDict_SetItemString(d, "BoolOp", (PyObject*)BoolOp_type) < 0) return
         NULL;
+    if (PyDict_SetItemString(d, "NamedExpr", (PyObject*)NamedExpr_type) < 0)
+        return NULL;
     if (PyDict_SetItemString(d, "BinOp", (PyObject*)BinOp_type) < 0) return
         NULL;
     if (PyDict_SetItemString(d, "UnaryOp", (PyObject*)UnaryOp_type) < 0) return
@@ -8306,6 +8413,8 @@ PyInit__ast(void)
         return NULL;
     if (PyDict_SetItemString(d, "Param", (PyObject*)Param_type) < 0) return
         NULL;
+    if (PyDict_SetItemString(d, "NamedStore", (PyObject*)NamedStore_type) < 0)
+        return NULL;
     if (PyDict_SetItemString(d, "slice", (PyObject*)slice_type) < 0) return
         NULL;
     if (PyDict_SetItemString(d, "Slice", (PyObject*)Slice_type) < 0) return
diff --git a/Python/ast.c b/Python/ast.c
index 855acca29e7c..6560026109c8 100644
--- a/Python/ast.c
+++ b/Python/ast.c
@@ -94,6 +94,8 @@ expr_context_name(expr_context_ty ctx)
         return "Load";
     case Store:
         return "Store";
+    case NamedStore:
+        return "NamedStore";
     case Del:
         return "Del";
     case AugLoad:
@@ -975,14 +977,29 @@ set_context(struct compiling *c, expr_ty e, expr_context_ty ctx, const node *n)
 
     switch (e->kind) {
         case Attribute_kind:
+            if (ctx == NamedStore) {
+                expr_name = "attribute";
+                break;
+            }
+
             e->v.Attribute.ctx = ctx;
             if (ctx == Store && forbidden_name(c, e->v.Attribute.attr, n, 1))
                 return 0;
             break;
         case Subscript_kind:
+            if (ctx == NamedStore) {
+                expr_name = "subscript";
+                break;
+            }
+
             e->v.Subscript.ctx = ctx;
             break;
         case Starred_kind:
+            if (ctx == NamedStore) {
+                expr_name = "starred";
+                break;
+            }
+
             e->v.Starred.ctx = ctx;
             if (!set_context(c, e->v.Starred.value, ctx, n))
                 return 0;
@@ -995,10 +1012,20 @@ set_context(struct compiling *c, expr_ty e, expr_context_ty ctx, const node *n)
             e->v.Name.ctx = ctx;
             break;
         case List_kind:
+            if (ctx == NamedStore) {
+                expr_name = "list";
+                break;
+            }
+
             e->v.List.ctx = ctx;
             s = e->v.List.elts;
             break;
         case Tuple_kind:
+            if (ctx == NamedStore) {
+                expr_name = "tuple";
+                break;
+            }
+
             e->v.Tuple.ctx = ctx;
             s = e->v.Tuple.elts;
             break;
@@ -1060,17 +1087,27 @@ set_context(struct compiling *c, expr_ty e, expr_context_ty ctx, const node *n)
         case IfExp_kind:
             expr_name = "conditional expression";
             break;
+        case NamedExpr_kind:
+            expr_name = "named expression";
+            break;
         default:
             PyErr_Format(PyExc_SystemError,
-                         "unexpected expression in assignment %d (line %d)",
+                         "unexpected expression in %sassignment %d (line %d)",
+                         ctx == NamedStore ? "named ": "",
                          e->kind, e->lineno);
             return 0;
     }
     /* Check for error string set by switch */
     if (expr_name) {
-        return ast_error(c, n, "cannot %s %s",
+        if (ctx == NamedStore) {
+            return ast_error(c, n, "cannot use named assignment with %s",
+                         expr_name);
+        }
+        else {
+            return ast_error(c, n, "cannot %s %s",
                          ctx == Store ? "assign to" : "delete",
                          expr_name);
+        }
     }
 
     /* If the LHS is a list or tuple, we need to set the assignment
@@ -1198,7 +1235,7 @@ seq_for_testlist(struct compiling *c, const node *n)
 
     for (i = 0; i < NCH(n); i += 2) {
         const node *ch = CHILD(n, i);
-        assert(TYPE(ch) == test || TYPE(ch) == test_nocond || TYPE(ch) == star_expr);
+        assert(TYPE(ch) == test || TYPE(ch) == test_nocond || TYPE(ch) == star_expr || TYPE(ch) == namedexpr_test);
 
         expression = ast_for_expr(c, ch);
         if (!expression)
@@ -1691,6 +1728,35 @@ ast_for_decorated(struct compiling *c, const node *n)
     return thing;
 }
 
+static expr_ty
+ast_for_namedexpr(struct compiling *c, const node *n)
+{
+    /* if_stmt: 'if' namedexpr_test ':' suite ('elif' namedexpr_test ':' suite)*
+         ['else' ':' suite]
+       namedexpr_test: test [':=' test]
+       argument: ( test [comp_for] |
+            test ':=' test |
+            test '=' test |
+            '**' test |
+            '*' test )
+    */
+    expr_ty target, value;
+
+    target = ast_for_expr(c, CHILD(n, 0));
+    if (!target)
+        return NULL;
+
+    value = ast_for_expr(c, CHILD(n, 2));
+    if (!value)
+        return NULL;
+
+    if (!set_context(c, target, NamedStore, n))
+        return NULL;
+
+    return NamedExpr(target, value, LINENO(n), n->n_col_offset, n->n_end_lineno,
+                     n->n_end_col_offset, c->c_arena);
+}
+
 static expr_ty
 ast_for_lambdef(struct compiling *c, const node *n)
 {
@@ -2568,6 +2634,7 @@ static expr_ty
 ast_for_expr(struct compiling *c, const node *n)
 {
     /* handle the full range of simple expressions
+       namedexpr_test: test [':=' test]
        test: or_test ['if' or_test 'else' test] | lambdef
        test_nocond: or_test | lambdef_nocond
        or_test: and_test ('or' and_test)*
@@ -2591,6 +2658,10 @@ ast_for_expr(struct compiling *c, const node *n)
 
  loop:
     switch (TYPE(n)) {
+        case namedexpr_test:
+            if (NCH(n) == 3)
+                return ast_for_namedexpr(c, n);
+            /* Fallthrough */
         case test:
         case test_nocond:
             if (TYPE(CHILD(n, 0)) == lambdef ||
@@ -2770,6 +2841,9 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func,
             }
             else if (TYPE(CHILD(ch, 0)) == STAR)
                 nargs++;
+            else if (TYPE(CHILD(ch, 1)) == COLONEQUAL) {
+                nargs++;
+            }
             else
                 /* TYPE(CHILD(ch, 0)) == DOUBLESTAR or keyword argument */
                 nkeywords++;
@@ -2850,6 +2924,26 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func,
                     return NULL;
                 asdl_seq_SET(args, nargs++, e);
             }
+            else if (TYPE(CHILD(ch, 1)) == COLONEQUAL) {
+                /* treat colon equal as positional argument */
+                if (nkeywords) {
+                    if (ndoublestars) {
+                        ast_error(c, chch,
+                                "positional argument follows "
+                                "keyword argument unpacking");
+                    }
+                    else {
+                        ast_error(c, chch,
+                                "positional argument follows "
+                                "keyword argument");
+                    }
+                    return NULL;
+                }
+                e = ast_for_namedexpr(c, ch);
+                if (!e)
+                    return NULL;
+                asdl_seq_SET(args, nargs++, e);
+            }
             else {
                 /* a keyword argument */
                 keyword_ty kw;
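A hedged illustration of the intended call behavior: a ':=' argument is counted
and compiled as a positional argument, and the name is also bound in the calling
scope; placing it after a keyword argument hits the new "positional argument
follows keyword argument" error above.

    >>> def f(x):
    ...     return x
    ...
    >>> f(y := 3)
    3
    >>> y
    3
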
diff --git a/Python/compile.c b/Python/compile.c
index 9713bfc9e9b7..eb2c3028b633 100644
--- a/Python/compile.c
+++ b/Python/compile.c
@@ -3428,7 +3428,10 @@ compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx)
         case Load:
             op = (c->u->u_ste->ste_type == ClassBlock) ? LOAD_CLASSDEREF : LOAD_DEREF;
             break;
-        case Store: op = STORE_DEREF; break;
+        case Store:
+        case NamedStore:
+            op = STORE_DEREF;
+            break;
         case AugLoad:
         case AugStore:
             break;
@@ -3443,7 +3446,10 @@ compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx)
     case OP_FAST:
         switch (ctx) {
         case Load: op = LOAD_FAST; break;
-        case Store: op = STORE_FAST; break;
+        case Store:
+        case NamedStore:
+            op = STORE_FAST;
+            break;
         case Del: op = DELETE_FAST; break;
         case AugLoad:
         case AugStore:
@@ -3459,7 +3465,10 @@ compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx)
     case OP_GLOBAL:
         switch (ctx) {
         case Load: op = LOAD_GLOBAL; break;
-        case Store: op = STORE_GLOBAL; break;
+        case Store:
+        case NamedStore:
+            op = STORE_GLOBAL;
+            break;
         case Del: op = DELETE_GLOBAL; break;
         case AugLoad:
         case AugStore:
@@ -3474,7 +3483,10 @@ compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx)
     case OP_NAME:
         switch (ctx) {
         case Load: op = LOAD_NAME; break;
-        case Store: op = STORE_NAME; break;
+        case Store:
+        case NamedStore:
+            op = STORE_NAME;
+            break;
         case Del: op = DELETE_NAME; break;
         case AugLoad:
         case AugStore:
@@ -3592,7 +3604,7 @@ static int
 compiler_list(struct compiler *c, expr_ty e)
 {
     asdl_seq *elts = e->v.List.elts;
-    if (e->v.List.ctx == Store) {
+    if (e->v.List.ctx == Store || e->v.List.ctx == NamedStore) {
         return assignment_helper(c, elts);
     }
     else if (e->v.List.ctx == Load) {
@@ -3608,7 +3620,7 @@ static int
 compiler_tuple(struct compiler *c, expr_ty e)
 {
     asdl_seq *elts = e->v.Tuple.elts;
-    if (e->v.Tuple.ctx == Store) {
+    if (e->v.Tuple.ctx == Store || e->v.Tuple.ctx == NamedStore) {
         return assignment_helper(c, elts);
     }
     else if (e->v.Tuple.ctx == Load) {
@@ -4569,6 +4581,11 @@ static int
 compiler_visit_expr1(struct compiler *c, expr_ty e)
 {
     switch (e->kind) {
+    case NamedExpr_kind:
+        VISIT(c, expr, e->v.NamedExpr.value);
+        ADDOP(c, DUP_TOP);
+        VISIT(c, expr, e->v.NamedExpr.target);
+        break;
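A minimal sketch of what this case emits, assuming the NamedStore handling added
to compiler_nameop above (STORE_NAME at module level, STORE_FAST in a function,
STORE_DEREF for closure cells, STORE_GLOBAL under a global declaration):

    # What the compiler emits for "(x := 10)" at module level, roughly:
    #   LOAD_CONST  10    - push the value
    #   DUP_TOP           - duplicate it; the copy is the expression's result
    #   STORE_NAME  x     - consume one copy to bind the target
    import dis
    dis.dis("(x := 10)")
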
     case BoolOp_kind:
         return compiler_boolop(c, e);
     case BinOp_kind:
@@ -5003,6 +5020,7 @@ compiler_handle_subscr(struct compiler *c, const char *kind,
         case AugStore:/* fall through to Store */
         case Store:   op = STORE_SUBSCR; break;
         case Del:     op = DELETE_SUBSCR; break;
+        case NamedStore:
         case Param:
             PyErr_Format(PyExc_SystemError,
                          "invalid %s kind %d in subscript\n",
diff --git a/Python/graminit.c b/Python/graminit.c
index 0a681f7b797f..91092f1e0b9e 100644
--- a/Python/graminit.c
+++ b/Python/graminit.c
@@ -891,7 +891,7 @@ static arc arcs_41_0[1] = {
     {97, 1},
 };
 static arc arcs_41_1[1] = {
-    {26, 2},
+    {98, 2},
 };
 static arc arcs_41_2[1] = {
     {27, 3},
@@ -900,8 +900,8 @@ static arc arcs_41_3[1] = {
     {28, 4},
 };
 static arc arcs_41_4[3] = {
-    {98, 1},
-    {99, 5},
+    {99, 1},
+    {100, 5},
     {0, 4},
 };
 static arc arcs_41_5[1] = {
@@ -924,7 +924,7 @@ static state states_41[8] = {
     {1, arcs_41_7},
 };
 static arc arcs_42_0[1] = {
-    {100, 1},
+    {101, 1},
 };
 static arc arcs_42_1[1] = {
     {26, 2},
@@ -936,7 +936,7 @@ static arc arcs_42_3[1] = {
     {28, 4},
 };
 static arc arcs_42_4[2] = {
-    {99, 5},
+    {100, 5},
     {0, 4},
 };
 static arc arcs_42_5[1] = {
@@ -959,13 +959,13 @@ static state states_42[8] = {
     {1, arcs_42_7},
 };
 static arc arcs_43_0[1] = {
-    {101, 1},
+    {102, 1},
 };
 static arc arcs_43_1[1] = {
     {66, 2},
 };
 static arc arcs_43_2[1] = {
-    {102, 3},
+    {103, 3},
 };
 static arc arcs_43_3[1] = {
     {9, 4},
@@ -977,7 +977,7 @@ static arc arcs_43_5[1] = {
     {28, 6},
 };
 static arc arcs_43_6[2] = {
-    {99, 7},
+    {100, 7},
     {0, 6},
 };
 static arc arcs_43_7[1] = {
@@ -1002,7 +1002,7 @@ static state states_43[10] = {
     {1, arcs_43_9},
 };
 static arc arcs_44_0[1] = {
-    {103, 1},
+    {104, 1},
 };
 static arc arcs_44_1[1] = {
     {27, 2},
@@ -1011,8 +1011,8 @@ static arc arcs_44_2[1] = {
     {28, 3},
 };
 static arc arcs_44_3[2] = {
-    {104, 4},
-    {105, 5},
+    {105, 4},
+    {106, 5},
 };
 static arc arcs_44_4[1] = {
     {27, 6},
@@ -1027,9 +1027,9 @@ static arc arcs_44_7[1] = {
     {28, 9},
 };
 static arc arcs_44_8[4] = {
-    {104, 4},
-    {99, 10},
-    {105, 5},
+    {105, 4},
+    {100, 10},
+    {106, 5},
     {0, 8},
 };
 static arc arcs_44_9[1] = {
@@ -1042,7 +1042,7 @@ static arc arcs_44_11[1] = {
     {28, 12},
 };
 static arc arcs_44_12[2] = {
-    {105, 5},
+    {106, 5},
     {0, 12},
 };
 static state states_44[13] = {
@@ -1061,10 +1061,10 @@ static state states_44[13] = {
     {2, arcs_44_12},
 };
 static arc arcs_45_0[1] = {
-    {106, 1},
+    {107, 1},
 };
 static arc arcs_45_1[1] = {
-    {107, 2},
+    {108, 2},
 };
 static arc arcs_45_2[2] = {
     {32, 1},
@@ -1091,7 +1091,7 @@ static arc arcs_46_1[2] = {
     {0, 1},
 };
 static arc arcs_46_2[1] = {
-    {108, 3},
+    {109, 3},
 };
 static arc arcs_46_3[1] = {
     {0, 3},
@@ -1103,7 +1103,7 @@ static state states_46[4] = {
     {1, arcs_46_3},
 };
 static arc arcs_47_0[1] = {
-    {109, 1},
+    {110, 1},
 };
 static arc arcs_47_1[2] = {
     {26, 2},
@@ -1134,14 +1134,14 @@ static arc arcs_48_1[1] = {
     {0, 1},
 };
 static arc arcs_48_2[1] = {
-    {110, 3},
+    {111, 3},
 };
 static arc arcs_48_3[1] = {
     {6, 4},
 };
 static arc arcs_48_4[2] = {
     {6, 4},
-    {111, 1},
+    {112, 1},
 };
 static state states_48[5] = {
     {2, arcs_48_0},
@@ -1150,70 +1150,66 @@ static state states_48[5] = {
     {1, arcs_48_3},
     {2, arcs_48_4},
 };
-static arc arcs_49_0[2] = {
-    {112, 1},
-    {113, 2},
+static arc arcs_49_0[1] = {
+    {26, 1},
 };
 static arc arcs_49_1[2] = {
-    {97, 3},
+    {113, 2},
     {0, 1},
 };
 static arc arcs_49_2[1] = {
-    {0, 2},
+    {26, 3},
 };
 static arc arcs_49_3[1] = {
-    {112, 4},
-};
-static arc arcs_49_4[1] = {
-    {99, 5},
-};
-static arc arcs_49_5[1] = {
-    {26, 2},
+    {0, 3},
 };
-static state states_49[6] = {
-    {2, arcs_49_0},
+static state states_49[4] = {
+    {1, arcs_49_0},
     {2, arcs_49_1},
     {1, arcs_49_2},
     {1, arcs_49_3},
-    {1, arcs_49_4},
-    {1, arcs_49_5},
 };
 static arc arcs_50_0[2] = {
-    {112, 1},
-    {115, 1},
+    {114, 1},
+    {115, 2},
 };
-static arc arcs_50_1[1] = {
+static arc arcs_50_1[2] = {
+    {97, 3},
     {0, 1},
 };
-static state states_50[2] = {
-    {2, arcs_50_0},
-    {1, arcs_50_1},
+static arc arcs_50_2[1] = {
+    {0, 2},
 };
-static arc arcs_51_0[1] = {
-    {116, 1},
+static arc arcs_50_3[1] = {
+    {114, 4},
 };
-static arc arcs_51_1[2] = {
-    {35, 2},
-    {27, 3},
+static arc arcs_50_4[1] = {
+    {100, 5},
 };
-static arc arcs_51_2[1] = {
-    {27, 3},
+static arc arcs_50_5[1] = {
+    {26, 2},
 };
-static arc arcs_51_3[1] = {
-    {26, 4},
+static state states_50[6] = {
+    {2, arcs_50_0},
+    {2, arcs_50_1},
+    {1, arcs_50_2},
+    {1, arcs_50_3},
+    {1, arcs_50_4},
+    {1, arcs_50_5},
+};
+static arc arcs_51_0[2] = {
+    {114, 1},
+    {117, 1},
 };
-static arc arcs_51_4[1] = {
-    {0, 4},
+static arc arcs_51_1[1] = {
+    {0, 1},
 };
-static state states_51[5] = {
-    {1, arcs_51_0},
-    {2, arcs_51_1},
-    {1, arcs_51_2},
-    {1, arcs_51_3},
-    {1, arcs_51_4},
+static state states_51[2] = {
+    {2, arcs_51_0},
+    {1, arcs_51_1},
 };
 static arc arcs_52_0[1] = {
-    {116, 1},
+    {118, 1},
 };
 static arc arcs_52_1[2] = {
     {35, 2},
@@ -1223,7 +1219,7 @@ static arc arcs_52_2[1] = {
     {27, 3},
 };
 static arc arcs_52_3[1] = {
-    {114, 4},
+    {26, 4},
 };
 static arc arcs_52_4[1] = {
     {0, 4},
@@ -1236,15 +1232,27 @@ static state states_52[5] = {
     {1, arcs_52_4},
 };
 static arc arcs_53_0[1] = {
-    {117, 1},
+    {118, 1},
 };
 static arc arcs_53_1[2] = {
-    {118, 0},
-    {0, 1},
+    {35, 2},
+    {27, 3},
+};
+static arc arcs_53_2[1] = {
+    {27, 3},
+};
+static arc arcs_53_3[1] = {
+    {116, 4},
 };
-static state states_53[2] = {
+static arc arcs_53_4[1] = {
+    {0, 4},
+};
+static state states_53[5] = {
     {1, arcs_53_0},
     {2, arcs_53_1},
+    {1, arcs_53_2},
+    {1, arcs_53_3},
+    {1, arcs_53_4},
 };
 static arc arcs_54_0[1] = {
     {119, 1},
@@ -1257,84 +1265,84 @@ static state states_54[2] = {
     {1, arcs_54_0},
     {2, arcs_54_1},
 };
-static arc arcs_55_0[2] = {
+static arc arcs_55_0[1] = {
     {121, 1},
-    {122, 2},
 };
-static arc arcs_55_1[1] = {
-    {119, 2},
+static arc arcs_55_1[2] = {
+    {122, 0},
+    {0, 1},
+};
+static state states_55[2] = {
+    {1, arcs_55_0},
+    {2, arcs_55_1},
 };
-static arc arcs_55_2[1] = {
+static arc arcs_56_0[2] = {
+    {123, 1},
+    {124, 2},
+};
+static arc arcs_56_1[1] = {
+    {121, 2},
+};
+static arc arcs_56_2[1] = {
     {0, 2},
 };
-static state states_55[3] = {
-    {2, arcs_55_0},
-    {1, arcs_55_1},
-    {1, arcs_55_2},
+static state states_56[3] = {
+    {2, arcs_56_0},
+    {1, arcs_56_1},
+    {1, arcs_56_2},
 };
-static arc arcs_56_0[1] = {
-    {108, 1},
+static arc arcs_57_0[1] = {
+    {109, 1},
 };
-static arc arcs_56_1[2] = {
-    {123, 0},
+static arc arcs_57_1[2] = {
+    {125, 0},
     {0, 1},
 };
-static state states_56[2] = {
-    {1, arcs_56_0},
-    {2, arcs_56_1},
+static state states_57[2] = {
+    {1, arcs_57_0},
+    {2, arcs_57_1},
 };
-static arc arcs_57_0[10] = {
-    {124, 1},
-    {125, 1},
+static arc arcs_58_0[10] = {
     {126, 1},
     {127, 1},
     {128, 1},
     {129, 1},
     {130, 1},
-    {102, 1},
-    {121, 2},
-    {131, 3},
+    {131, 1},
+    {132, 1},
+    {103, 1},
+    {123, 2},
+    {133, 3},
 };
-static arc arcs_57_1[1] = {
+static arc arcs_58_1[1] = {
     {0, 1},
 };
-static arc arcs_57_2[1] = {
-    {102, 1},
+static arc arcs_58_2[1] = {
+    {103, 1},
 };
-static arc arcs_57_3[2] = {
-    {121, 1},
+static arc arcs_58_3[2] = {
+    {123, 1},
     {0, 3},
 };
-static state states_57[4] = {
-    {10, arcs_57_0},
-    {1, arcs_57_1},
-    {1, arcs_57_2},
-    {2, arcs_57_3},
-};
-static arc arcs_58_0[1] = {
-    {33, 1},
-};
-static arc arcs_58_1[1] = {
-    {108, 2},
-};
-static arc arcs_58_2[1] = {
-    {0, 2},
-};
-static state states_58[3] = {
-    {1, arcs_58_0},
+static state states_58[4] = {
+    {10, arcs_58_0},
     {1, arcs_58_1},
     {1, arcs_58_2},
+    {2, arcs_58_3},
 };
 static arc arcs_59_0[1] = {
-    {132, 1},
+    {33, 1},
 };
-static arc arcs_59_1[2] = {
-    {133, 0},
-    {0, 1},
+static arc arcs_59_1[1] = {
+    {109, 2},
 };
-static state states_59[2] = {
+static arc arcs_59_2[1] = {
+    {0, 2},
+};
+static state states_59[3] = {
     {1, arcs_59_0},
-    {2, arcs_59_1},
+    {1, arcs_59_1},
+    {1, arcs_59_2},
 };
 static arc arcs_60_0[1] = {
     {134, 1},
@@ -1361,21 +1369,20 @@ static state states_61[2] = {
 static arc arcs_62_0[1] = {
     {138, 1},
 };
-static arc arcs_62_1[3] = {
+static arc arcs_62_1[2] = {
     {139, 0},
-    {140, 0},
     {0, 1},
 };
 static state states_62[2] = {
     {1, arcs_62_0},
-    {3, arcs_62_1},
+    {2, arcs_62_1},
 };
 static arc arcs_63_0[1] = {
-    {141, 1},
+    {140, 1},
 };
 static arc arcs_63_1[3] = {
+    {141, 0},
     {142, 0},
-    {143, 0},
     {0, 1},
 };
 static state states_63[2] = {
@@ -1383,543 +1390,556 @@ static state states_63[2] = {
     {3, arcs_63_1},
 };
 static arc arcs_64_0[1] = {
-    {144, 1},
+    {143, 1},
 };
-static arc arcs_64_1[6] = {
-    {33, 0},
-    {11, 0},
+static arc arcs_64_1[3] = {
+    {144, 0},
     {145, 0},
-    {146, 0},
-    {147, 0},
     {0, 1},
 };
 static state states_64[2] = {
     {1, arcs_64_0},
-    {6, arcs_64_1},
+    {3, arcs_64_1},
 };
-static arc arcs_65_0[4] = {
-    {142, 1},
-    {143, 1},
-    {148, 1},
-    {149, 2},
+static arc arcs_65_0[1] = {
+    {146, 1},
 };
-static arc arcs_65_1[1] = {
-    {144, 2},
+static arc arcs_65_1[6] = {
+    {33, 0},
+    {11, 0},
+    {147, 0},
+    {148, 0},
+    {149, 0},
+    {0, 1},
 };
-static arc arcs_65_2[1] = {
+static state states_65[2] = {
+    {1, arcs_65_0},
+    {6, arcs_65_1},
+};
+static arc arcs_66_0[4] = {
+    {144, 1},
+    {145, 1},
+    {150, 1},
+    {151, 2},
+};
+static arc arcs_66_1[1] = {
+    {146, 2},
+};
+static arc arcs_66_2[1] = {
     {0, 2},
 };
-static state states_65[3] = {
-    {4, arcs_65_0},
-    {1, arcs_65_1},
-    {1, arcs_65_2},
+static state states_66[3] = {
+    {4, arcs_66_0},
+    {1, arcs_66_1},
+    {1, arcs_66_2},
 };
-static arc arcs_66_0[1] = {
-    {150, 1},
+static arc arcs_67_0[1] = {
+    {152, 1},
 };
-static arc arcs_66_1[2] = {
+static arc arcs_67_1[2] = {
     {34, 2},
     {0, 1},
 };
-static arc arcs_66_2[1] = {
-    {144, 3},
+static arc arcs_67_2[1] = {
+    {146, 3},
 };
-static arc arcs_66_3[1] = {
+static arc arcs_67_3[1] = {
     {0, 3},
 };
-static state states_66[4] = {
-    {1, arcs_66_0},
-    {2, arcs_66_1},
-    {1, arcs_66_2},
-    {1, arcs_66_3},
+static state states_67[4] = {
+    {1, arcs_67_0},
+    {2, arcs_67_1},
+    {1, arcs_67_2},
+    {1, arcs_67_3},
 };
-static arc arcs_67_0[2] = {
-    {151, 1},
-    {152, 2},
+static arc arcs_68_0[2] = {
+    {153, 1},
+    {154, 2},
 };
-static arc arcs_67_1[1] = {
-    {152, 2},
+static arc arcs_68_1[1] = {
+    {154, 2},
 };
-static arc arcs_67_2[2] = {
-    {153, 2},
+static arc arcs_68_2[2] = {
+    {155, 2},
     {0, 2},
 };
-static state states_67[3] = {
-    {2, arcs_67_0},
-    {1, arcs_67_1},
-    {2, arcs_67_2},
+static state states_68[3] = {
+    {2, arcs_68_0},
+    {1, arcs_68_1},
+    {2, arcs_68_2},
 };
-static arc arcs_68_0[10] = {
+static arc arcs_69_0[10] = {
     {13, 1},
-    {155, 2},
-    {157, 3},
+    {157, 2},
+    {159, 3},
     {23, 4},
-    {160, 4},
-    {161, 5},
-    {83, 4},
     {162, 4},
-    {163, 4},
+    {163, 5},
+    {83, 4},
     {164, 4},
+    {165, 4},
+    {166, 4},
 };
-static arc arcs_68_1[3] = {
+static arc arcs_69_1[3] = {
     {50, 6},
-    {154, 6},
+    {156, 6},
     {15, 4},
 };
-static arc arcs_68_2[2] = {
-    {154, 7},
-    {156, 4},
+static arc arcs_69_2[2] = {
+    {156, 7},
+    {158, 4},
 };
-static arc arcs_68_3[2] = {
-    {158, 8},
-    {159, 4},
+static arc arcs_69_3[2] = {
+    {160, 8},
+    {161, 4},
 };
-static arc arcs_68_4[1] = {
+static arc arcs_69_4[1] = {
     {0, 4},
 };
-static arc arcs_68_5[2] = {
-    {161, 5},
+static arc arcs_69_5[2] = {
+    {163, 5},
     {0, 5},
 };
-static arc arcs_68_6[1] = {
+static arc arcs_69_6[1] = {
     {15, 4},
 };
-static arc arcs_68_7[1] = {
-    {156, 4},
+static arc arcs_69_7[1] = {
+    {158, 4},
 };
-static arc arcs_68_8[1] = {
-    {159, 4},
+static arc arcs_69_8[1] = {
+    {161, 4},
 };
-static state states_68[9] = {
-    {10, arcs_68_0},
-    {3, arcs_68_1},
-    {2, arcs_68_2},
-    {2, arcs_68_3},
-    {1, arcs_68_4},
-    {2, arcs_68_5},
-    {1, arcs_68_6},
-    {1, arcs_68_7},
-    {1, arcs_68_8},
-};
-static arc arcs_69_0[2] = {
-    {26, 1},
+static state states_69[9] = {
+    {10, arcs_69_0},
+    {3, arcs_69_1},
+    {2, arcs_69_2},
+    {2, arcs_69_3},
+    {1, arcs_69_4},
+    {2, arcs_69_5},
+    {1, arcs_69_6},
+    {1, arcs_69_7},
+    {1, arcs_69_8},
+};
+static arc arcs_70_0[2] = {
+    {98, 1},
     {51, 1},
 };
-static arc arcs_69_1[3] = {
-    {165, 2},
+static arc arcs_70_1[3] = {
+    {167, 2},
     {32, 3},
     {0, 1},
 };
-static arc arcs_69_2[1] = {
+static arc arcs_70_2[1] = {
     {0, 2},
 };
-static arc arcs_69_3[3] = {
-    {26, 4},
+static arc arcs_70_3[3] = {
+    {98, 4},
     {51, 4},
     {0, 3},
 };
-static arc arcs_69_4[2] = {
+static arc arcs_70_4[2] = {
     {32, 3},
     {0, 4},
 };
-static state states_69[5] = {
-    {2, arcs_69_0},
-    {3, arcs_69_1},
-    {1, arcs_69_2},
-    {3, arcs_69_3},
-    {2, arcs_69_4},
+static state states_70[5] = {
+    {2, arcs_70_0},
+    {3, arcs_70_1},
+    {1, arcs_70_2},
+    {3, arcs_70_3},
+    {2, arcs_70_4},
 };
-static arc arcs_70_0[3] = {
+static arc arcs_71_0[3] = {
     {13, 1},
-    {155, 2},
+    {157, 2},
     {82, 3},
 };
-static arc arcs_70_1[2] = {
+static arc arcs_71_1[2] = {
     {14, 4},
     {15, 5},
 };
-static arc arcs_70_2[1] = {
-    {166, 6},
+static arc arcs_71_2[1] = {
+    {168, 6},
 };
-static arc arcs_70_3[1] = {
+static arc arcs_71_3[1] = {
     {23, 5},
 };
-static arc arcs_70_4[1] = {
+static arc arcs_71_4[1] = {
     {15, 5},
 };
-static arc arcs_70_5[1] = {
+static arc arcs_71_5[1] = {
     {0, 5},
 };
-static arc arcs_70_6[1] = {
-    {156, 5},
+static arc arcs_71_6[1] = {
+    {158, 5},
 };
-static state states_70[7] = {
-    {3, arcs_70_0},
-    {2, arcs_70_1},
-    {1, arcs_70_2},
-    {1, arcs_70_3},
-    {1, arcs_70_4},
-    {1, arcs_70_5},
-    {1, arcs_70_6},
+static state states_71[7] = {
+    {3, arcs_71_0},
+    {2, arcs_71_1},
+    {1, arcs_71_2},
+    {1, arcs_71_3},
+    {1, arcs_71_4},
+    {1, arcs_71_5},
+    {1, arcs_71_6},
 };
-static arc arcs_71_0[1] = {
-    {167, 1},
+static arc arcs_72_0[1] = {
+    {169, 1},
 };
-static arc arcs_71_1[2] = {
+static arc arcs_72_1[2] = {
     {32, 2},
     {0, 1},
 };
-static arc arcs_71_2[2] = {
-    {167, 1},
+static arc arcs_72_2[2] = {
+    {169, 1},
     {0, 2},
 };
-static state states_71[3] = {
-    {1, arcs_71_0},
-    {2, arcs_71_1},
-    {2, arcs_71_2},
+static state states_72[3] = {
+    {1, arcs_72_0},
+    {2, arcs_72_1},
+    {2, arcs_72_2},
 };
-static arc arcs_72_0[2] = {
+static arc arcs_73_0[2] = {
     {26, 1},
     {27, 2},
 };
-static arc arcs_72_1[2] = {
+static arc arcs_73_1[2] = {
     {27, 2},
     {0, 1},
 };
-static arc arcs_72_2[3] = {
+static arc arcs_73_2[3] = {
     {26, 3},
-    {168, 4},
+    {170, 4},
     {0, 2},
 };
-static arc arcs_72_3[2] = {
-    {168, 4},
+static arc arcs_73_3[2] = {
+    {170, 4},
     {0, 3},
 };
-static arc arcs_72_4[1] = {
+static arc arcs_73_4[1] = {
     {0, 4},
 };
-static state states_72[5] = {
-    {2, arcs_72_0},
-    {2, arcs_72_1},
-    {3, arcs_72_2},
-    {2, arcs_72_3},
-    {1, arcs_72_4},
+static state states_73[5] = {
+    {2, arcs_73_0},
+    {2, arcs_73_1},
+    {3, arcs_73_2},
+    {2, arcs_73_3},
+    {1, arcs_73_4},
 };
-static arc arcs_73_0[1] = {
+static arc arcs_74_0[1] = {
     {27, 1},
 };
-static arc arcs_73_1[2] = {
+static arc arcs_74_1[2] = {
     {26, 2},
     {0, 1},
 };
-static arc arcs_73_2[1] = {
+static arc arcs_74_2[1] = {
     {0, 2},
 };
-static state states_73[3] = {
-    {1, arcs_73_0},
-    {2, arcs_73_1},
-    {1, arcs_73_2},
+static state states_74[3] = {
+    {1, arcs_74_0},
+    {2, arcs_74_1},
+    {1, arcs_74_2},
 };
-static arc arcs_74_0[2] = {
-    {108, 1},
+static arc arcs_75_0[2] = {
+    {109, 1},
     {51, 1},
 };
-static arc arcs_74_1[2] = {
+static arc arcs_75_1[2] = {
     {32, 2},
     {0, 1},
 };
-static arc arcs_74_2[3] = {
-    {108, 1},
+static arc arcs_75_2[3] = {
+    {109, 1},
     {51, 1},
     {0, 2},
 };
-static state states_74[3] = {
-    {2, arcs_74_0},
-    {2, arcs_74_1},
-    {3, arcs_74_2},
+static state states_75[3] = {
+    {2, arcs_75_0},
+    {2, arcs_75_1},
+    {3, arcs_75_2},
 };
-static arc arcs_75_0[1] = {
+static arc arcs_76_0[1] = {
     {26, 1},
 };
-static arc arcs_75_1[2] = {
+static arc arcs_76_1[2] = {
     {32, 2},
     {0, 1},
 };
-static arc arcs_75_2[2] = {
+static arc arcs_76_2[2] = {
     {26, 1},
     {0, 2},
 };
-static state states_75[3] = {
-    {1, arcs_75_0},
-    {2, arcs_75_1},
-    {2, arcs_75_2},
+static state states_76[3] = {
+    {1, arcs_76_0},
+    {2, arcs_76_1},
+    {2, arcs_76_2},
 };
-static arc arcs_76_0[3] = {
+static arc arcs_77_0[3] = {
     {26, 1},
     {34, 2},
     {51, 3},
 };
-static arc arcs_76_1[4] = {
+static arc arcs_77_1[4] = {
     {27, 4},
-    {165, 5},
+    {167, 5},
     {32, 6},
     {0, 1},
 };
-static arc arcs_76_2[1] = {
-    {108, 7},
+static arc arcs_77_2[1] = {
+    {109, 7},
 };
-static arc arcs_76_3[3] = {
-    {165, 5},
+static arc arcs_77_3[3] = {
+    {167, 5},
     {32, 6},
     {0, 3},
 };
-static arc arcs_76_4[1] = {
+static arc arcs_77_4[1] = {
     {26, 7},
 };
-static arc arcs_76_5[1] = {
+static arc arcs_77_5[1] = {
     {0, 5},
 };
-static arc arcs_76_6[3] = {
+static arc arcs_77_6[3] = {
     {26, 8},
     {51, 8},
     {0, 6},
 };
-static arc arcs_76_7[3] = {
-    {165, 5},
+static arc arcs_77_7[3] = {
+    {167, 5},
     {32, 9},
     {0, 7},
 };
-static arc arcs_76_8[2] = {
+static arc arcs_77_8[2] = {
     {32, 6},
     {0, 8},
 };
-static arc arcs_76_9[3] = {
+static arc arcs_77_9[3] = {
     {26, 10},
     {34, 11},
     {0, 9},
 };
-static arc arcs_76_10[1] = {
+static arc arcs_77_10[1] = {
     {27, 12},
 };
-static arc arcs_76_11[1] = {
-    {108, 13},
+static arc arcs_77_11[1] = {
+    {109, 13},
 };
-static arc arcs_76_12[1] = {
+static arc arcs_77_12[1] = {
     {26, 13},
 };
-static arc arcs_76_13[2] = {
+static arc arcs_77_13[2] = {
     {32, 9},
     {0, 13},
 };
-static state states_76[14] = {
-    {3, arcs_76_0},
-    {4, arcs_76_1},
-    {1, arcs_76_2},
-    {3, arcs_76_3},
-    {1, arcs_76_4},
-    {1, arcs_76_5},
-    {3, arcs_76_6},
-    {3, arcs_76_7},
-    {2, arcs_76_8},
-    {3, arcs_76_9},
-    {1, arcs_76_10},
-    {1, arcs_76_11},
-    {1, arcs_76_12},
-    {2, arcs_76_13},
-};
-static arc arcs_77_0[1] = {
-    {169, 1},
+static state states_77[14] = {
+    {3, arcs_77_0},
+    {4, arcs_77_1},
+    {1, arcs_77_2},
+    {3, arcs_77_3},
+    {1, arcs_77_4},
+    {1, arcs_77_5},
+    {3, arcs_77_6},
+    {3, arcs_77_7},
+    {2, arcs_77_8},
+    {3, arcs_77_9},
+    {1, arcs_77_10},
+    {1, arcs_77_11},
+    {1, arcs_77_12},
+    {2, arcs_77_13},
 };
-static arc arcs_77_1[1] = {
+static arc arcs_78_0[1] = {
+    {171, 1},
+};
+static arc arcs_78_1[1] = {
     {23, 2},
 };
-static arc arcs_77_2[2] = {
+static arc arcs_78_2[2] = {
     {13, 3},
     {27, 4},
 };
-static arc arcs_77_3[2] = {
+static arc arcs_78_3[2] = {
     {14, 5},
     {15, 6},
 };
-static arc arcs_77_4[1] = {
+static arc arcs_78_4[1] = {
     {28, 7},
 };
-static arc arcs_77_5[1] = {
+static arc arcs_78_5[1] = {
     {15, 6},
 };
-static arc arcs_77_6[1] = {
+static arc arcs_78_6[1] = {
     {27, 4},
 };
-static arc arcs_77_7[1] = {
+static arc arcs_78_7[1] = {
     {0, 7},
 };
-static state states_77[8] = {
-    {1, arcs_77_0},
-    {1, arcs_77_1},
-    {2, arcs_77_2},
-    {2, arcs_77_3},
-    {1, arcs_77_4},
-    {1, arcs_77_5},
-    {1, arcs_77_6},
-    {1, arcs_77_7},
+static state states_78[8] = {
+    {1, arcs_78_0},
+    {1, arcs_78_1},
+    {2, arcs_78_2},
+    {2, arcs_78_3},
+    {1, arcs_78_4},
+    {1, arcs_78_5},
+    {1, arcs_78_6},
+    {1, arcs_78_7},
 };
-static arc arcs_78_0[1] = {
-    {170, 1},
+static arc arcs_79_0[1] = {
+    {172, 1},
 };
-static arc arcs_78_1[2] = {
+static arc arcs_79_1[2] = {
     {32, 2},
     {0, 1},
 };
-static arc arcs_78_2[2] = {
-    {170, 1},
+static arc arcs_79_2[2] = {
+    {172, 1},
     {0, 2},
 };
-static state states_78[3] = {
-    {1, arcs_78_0},
-    {2, arcs_78_1},
-    {2, arcs_78_2},
+static state states_79[3] = {
+    {1, arcs_79_0},
+    {2, arcs_79_1},
+    {2, arcs_79_2},
 };
-static arc arcs_79_0[3] = {
+static arc arcs_80_0[3] = {
     {26, 1},
     {34, 2},
     {33, 2},
 };
-static arc arcs_79_1[3] = {
-    {165, 3},
+static arc arcs_80_1[4] = {
+    {167, 3},
+    {113, 2},
     {31, 2},
     {0, 1},
 };
-static arc arcs_79_2[1] = {
+static arc arcs_80_2[1] = {
     {26, 3},
 };
-static arc arcs_79_3[1] = {
+static arc arcs_80_3[1] = {
     {0, 3},
 };
-static state states_79[4] = {
-    {3, arcs_79_0},
-    {3, arcs_79_1},
-    {1, arcs_79_2},
-    {1, arcs_79_3},
+static state states_80[4] = {
+    {3, arcs_80_0},
+    {4, arcs_80_1},
+    {1, arcs_80_2},
+    {1, arcs_80_3},
 };
-static arc arcs_80_0[2] = {
-    {165, 1},
-    {172, 1},
+static arc arcs_81_0[2] = {
+    {167, 1},
+    {174, 1},
 };
-static arc arcs_80_1[1] = {
+static arc arcs_81_1[1] = {
     {0, 1},
 };
-static state states_80[2] = {
-    {2, arcs_80_0},
-    {1, arcs_80_1},
+static state states_81[2] = {
+    {2, arcs_81_0},
+    {1, arcs_81_1},
 };
-static arc arcs_81_0[1] = {
-    {101, 1},
+static arc arcs_82_0[1] = {
+    {102, 1},
 };
-static arc arcs_81_1[1] = {
+static arc arcs_82_1[1] = {
     {66, 2},
 };
-static arc arcs_81_2[1] = {
-    {102, 3},
+static arc arcs_82_2[1] = {
+    {103, 3},
 };
-static arc arcs_81_3[1] = {
-    {112, 4},
+static arc arcs_82_3[1] = {
+    {114, 4},
 };
-static arc arcs_81_4[2] = {
-    {171, 5},
+static arc arcs_82_4[2] = {
+    {173, 5},
     {0, 4},
 };
-static arc arcs_81_5[1] = {
+static arc arcs_82_5[1] = {
     {0, 5},
 };
-static state states_81[6] = {
-    {1, arcs_81_0},
-    {1, arcs_81_1},
-    {1, arcs_81_2},
-    {1, arcs_81_3},
-    {2, arcs_81_4},
-    {1, arcs_81_5},
+static state states_82[6] = {
+    {1, arcs_82_0},
+    {1, arcs_82_1},
+    {1, arcs_82_2},
+    {1, arcs_82_3},
+    {2, arcs_82_4},
+    {1, arcs_82_5},
 };
-static arc arcs_82_0[2] = {
+static arc arcs_83_0[2] = {
     {21, 1},
-    {173, 2},
+    {175, 2},
 };
-static arc arcs_82_1[1] = {
-    {173, 2},
+static arc arcs_83_1[1] = {
+    {175, 2},
 };
-static arc arcs_82_2[1] = {
+static arc arcs_83_2[1] = {
     {0, 2},
 };
-static state states_82[3] = {
-    {2, arcs_82_0},
-    {1, arcs_82_1},
-    {1, arcs_82_2},
+static state states_83[3] = {
+    {2, arcs_83_0},
+    {1, arcs_83_1},
+    {1, arcs_83_2},
 };
-static arc arcs_83_0[1] = {
+static arc arcs_84_0[1] = {
     {97, 1},
 };
-static arc arcs_83_1[1] = {
-    {114, 2},
+static arc arcs_84_1[1] = {
+    {116, 2},
 };
-static arc arcs_83_2[2] = {
-    {171, 3},
+static arc arcs_84_2[2] = {
+    {173, 3},
     {0, 2},
 };
-static arc arcs_83_3[1] = {
+static arc arcs_84_3[1] = {
     {0, 3},
 };
-static state states_83[4] = {
-    {1, arcs_83_0},
-    {1, arcs_83_1},
-    {2, arcs_83_2},
-    {1, arcs_83_3},
+static state states_84[4] = {
+    {1, arcs_84_0},
+    {1, arcs_84_1},
+    {2, arcs_84_2},
+    {1, arcs_84_3},
 };
-static arc arcs_84_0[1] = {
+static arc arcs_85_0[1] = {
     {23, 1},
 };
-static arc arcs_84_1[1] = {
+static arc arcs_85_1[1] = {
     {0, 1},
 };
-static state states_84[2] = {
-    {1, arcs_84_0},
-    {1, arcs_84_1},
+static state states_85[2] = {
+    {1, arcs_85_0},
+    {1, arcs_85_1},
 };
-static arc arcs_85_0[1] = {
-    {175, 1},
+static arc arcs_86_0[1] = {
+    {177, 1},
 };
-static arc arcs_85_1[2] = {
-    {176, 2},
+static arc arcs_86_1[2] = {
+    {178, 2},
     {0, 1},
 };
-static arc arcs_85_2[1] = {
+static arc arcs_86_2[1] = {
     {0, 2},
 };
-static state states_85[3] = {
-    {1, arcs_85_0},
-    {2, arcs_85_1},
-    {1, arcs_85_2},
+static state states_86[3] = {
+    {1, arcs_86_0},
+    {2, arcs_86_1},
+    {1, arcs_86_2},
 };
-static arc arcs_86_0[2] = {
+static arc arcs_87_0[2] = {
     {77, 1},
     {47, 2},
 };
-static arc arcs_86_1[1] = {
+static arc arcs_87_1[1] = {
     {26, 2},
 };
-static arc arcs_86_2[1] = {
+static arc arcs_87_2[1] = {
     {0, 2},
 };
-static state states_86[3] = {
-    {2, arcs_86_0},
-    {1, arcs_86_1},
-    {1, arcs_86_2},
+static state states_87[3] = {
+    {2, arcs_87_0},
+    {1, arcs_87_1},
+    {1, arcs_87_2},
 };
-static dfa dfas[87] = {
+static dfa dfas[88] = {
     {256, "single_input", 0, 3, states_0,
-     "\004\050\340\000\002\000\000\000\012\076\011\007\262\004\020\002\000\300\220\050\037\202\000"},
+     "\004\050\340\000\002\000\000\000\012\076\011\007\142\011\100\010\000\000\103\242\174\010\002"},
     {257, "file_input", 0, 2, states_1,
-     "\204\050\340\000\002\000\000\000\012\076\011\007\262\004\020\002\000\300\220\050\037\202\000"},
+     "\204\050\340\000\002\000\000\000\012\076\011\007\142\011\100\010\000\000\103\242\174\010\002"},
     {258, "eval_input", 0, 3, states_2,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
     {259, "decorator", 0, 7, states_3,
      "\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"},
     {260, "decorators", 0, 2, states_4,
@@ -1941,17 +1961,17 @@ static dfa dfas[87] = {
     {268, "vfpdef", 0, 2, states_12,
      "\000\000\200\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"},
     {269, "stmt", 0, 2, states_13,
-     "\000\050\340\000\002\000\000\000\012\076\011\007\262\004\020\002\000\300\220\050\037\202\000"},
+     "\000\050\340\000\002\000\000\000\012\076\011\007\142\011\100\010\000\000\103\242\174\010\002"},
     {270, "simple_stmt", 0, 4, states_14,
-     "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\020\002\000\300\220\050\037\200\000"},
+     "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\100\010\000\000\103\242\174\000\002"},
     {271, "small_stmt", 0, 2, states_15,
-     "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\020\002\000\300\220\050\037\200\000"},
+     "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\100\010\000\000\103\242\174\000\002"},
     {272, "expr_stmt", 0, 6, states_16,
-     "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
+     "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
     {273, "annassign", 0, 5, states_17,
      "\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"},
     {274, "testlist_star_expr", 0, 3, states_18,
-     "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
+     "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
     {275, "augassign", 0, 2, states_19,
      "\000\000\000\000\000\000\360\377\001\000\000\000\000\000\000\000\000\000\000\000\000\000\000"},
     {276, "del_stmt", 0, 3, states_20,
@@ -1959,7 +1979,7 @@ static dfa dfas[87] = {
     {277, "pass_stmt", 0, 2, states_21,
      "\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000"},
     {278, "flow_stmt", 0, 2, states_22,
-     "\000\000\000\000\000\000\000\000\000\036\000\000\000\000\000\000\000\000\000\000\000\200\000"},
+     "\000\000\000\000\000\000\000\000\000\036\000\000\000\000\000\000\000\000\000\000\000\000\002"},
     {279, "break_stmt", 0, 2, states_23,
      "\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000\000\000\000"},
     {280, "continue_stmt", 0, 2, states_24,
@@ -1967,7 +1987,7 @@ static dfa dfas[87] = {
     {281, "return_stmt", 0, 3, states_25,
      "\000\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000"},
     {282, "yield_stmt", 0, 2, states_26,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\200\000"},
+     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\002"},
     {283, "raise_stmt", 0, 5, states_27,
      "\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\000\000\000\000\000"},
     {284, "import_stmt", 0, 2, states_28,
@@ -1993,103 +2013,105 @@ static dfa dfas[87] = {
     {294, "assert_stmt", 0, 5, states_38,
      "\000\000\000\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000\000\000"},
     {295, "compound_stmt", 0, 2, states_39,
-     "\000\010\140\000\000\000\000\000\000\000\000\000\262\004\000\000\000\000\000\000\000\002\000"},
+     "\000\010\140\000\000\000\000\000\000\000\000\000\142\011\000\000\000\000\000\000\000\010\000"},
     {296, "async_stmt", 0, 3, states_40,
      "\000\000\040\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"},
     {297, "if_stmt", 0, 8, states_41,
      "\000\000\000\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000"},
     {298, "while_stmt", 0, 8, states_42,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\000\000"},
-    {299, "for_stmt", 0, 10, states_43,
      "\000\000\000\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000\000"},
+    {299, "for_stmt", 0, 10, states_43,
+     "\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000"},
     {300, "try_stmt", 0, 13, states_44,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\200\000\000\000\000\000\000\000\000\000\000"},
+     "\000\000\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000\000"},
     {301, "with_stmt", 0, 5, states_45,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000"},
+     "\000\000\000\000\000\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000"},
     {302, "with_item", 0, 4, states_46,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
     {303, "except_clause", 0, 5, states_47,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000"},
+     "\000\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000"},
     {304, "suite", 0, 5, states_48,
-     "\004\040\200\000\002\000\000\000\012\076\011\007\000\000\020\002\000\300\220\050\037\200\000"},
-    {305, "test", 0, 6, states_49,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
-    {306, "test_nocond", 0, 2, states_50,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
-    {307, "lambdef", 0, 5, states_51,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000"},
-    {308, "lambdef_nocond", 0, 5, states_52,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000"},
-    {309, "or_test", 0, 2, states_53,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\002\000\300\220\050\037\000\000"},
-    {310, "and_test", 0, 2, states_54,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\002\000\300\220\050\037\000\000"},
-    {311, "not_test", 0, 3, states_55,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\002\000\300\220\050\037\000\000"},
-    {312, "comparison", 0, 2, states_56,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"},
-    {313, "comp_op", 0, 4, states_57,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\362\017\000\000\000\000\000\000"},
-    {314, "star_expr", 0, 3, states_58,
+     "\004\040\200\000\002\000\000\000\012\076\011\007\000\000\100\010\000\000\103\242\174\000\002"},
+    {305, "namedexpr_test", 0, 4, states_49,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {306, "test", 0, 6, states_50,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {307, "test_nocond", 0, 2, states_51,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {308, "lambdef", 0, 5, states_52,
+     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000"},
+    {309, "lambdef_nocond", 0, 5, states_53,
+     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000"},
+    {310, "or_test", 0, 2, states_54,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\010\000\000\103\242\174\000\000"},
+    {311, "and_test", 0, 2, states_55,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\010\000\000\103\242\174\000\000"},
+    {312, "not_test", 0, 3, states_56,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\010\000\000\103\242\174\000\000"},
+    {313, "comparison", 0, 2, states_57,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"},
+    {314, "comp_op", 0, 4, states_58,
+     "\000\000\000\000\000\000\000\000\000\000\000\000\200\000\000\310\077\000\000\000\000\000\000"},
+    {315, "star_expr", 0, 3, states_59,
      "\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"},
-    {315, "expr", 0, 2, states_59,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"},
-    {316, "xor_expr", 0, 2, states_60,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"},
-    {317, "and_expr", 0, 2, states_61,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"},
-    {318, "shift_expr", 0, 2, states_62,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"},
-    {319, "arith_expr", 0, 2, states_63,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"},
-    {320, "term", 0, 2, states_64,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"},
-    {321, "factor", 0, 3, states_65,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"},
-    {322, "power", 0, 4, states_66,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\200\050\037\000\000"},
-    {323, "atom_expr", 0, 3, states_67,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\200\050\037\000\000"},
-    {324, "atom", 0, 9, states_68,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\050\037\000\000"},
-    {325, "testlist_comp", 0, 5, states_69,
-     "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
-    {326, "trailer", 0, 7, states_70,
-     "\000\040\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\010\000\000\000"},
-    {327, "subscriptlist", 0, 3, states_71,
-     "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
-    {328, "subscript", 0, 5, states_72,
-     "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
-    {329, "sliceop", 0, 3, states_73,
+    {316, "expr", 0, 2, states_60,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"},
+    {317, "xor_expr", 0, 2, states_61,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"},
+    {318, "and_expr", 0, 2, states_62,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"},
+    {319, "shift_expr", 0, 2, states_63,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"},
+    {320, "arith_expr", 0, 2, states_64,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"},
+    {321, "term", 0, 2, states_65,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"},
+    {322, "factor", 0, 3, states_66,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"},
+    {323, "power", 0, 4, states_67,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\242\174\000\000"},
+    {324, "atom_expr", 0, 3, states_68,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\242\174\000\000"},
+    {325, "atom", 0, 9, states_69,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\240\174\000\000"},
+    {326, "testlist_comp", 0, 5, states_70,
+     "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {327, "trailer", 0, 7, states_71,
+     "\000\040\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\040\000\000\000"},
+    {328, "subscriptlist", 0, 3, states_72,
+     "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {329, "subscript", 0, 5, states_73,
+     "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {330, "sliceop", 0, 3, states_74,
      "\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"},
-    {330, "exprlist", 0, 3, states_74,
-     "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"},
-    {331, "testlist", 0, 3, states_75,
-     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
-    {332, "dictorsetmaker", 0, 14, states_76,
-     "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
-    {333, "classdef", 0, 8, states_77,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\002\000"},
-    {334, "arglist", 0, 3, states_78,
-     "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
-    {335, "argument", 0, 4, states_79,
-     "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"},
-    {336, "comp_iter", 0, 2, states_80,
-     "\000\000\040\000\000\000\000\000\000\000\000\000\042\000\000\000\000\000\000\000\000\000\000"},
-    {337, "sync_comp_for", 0, 6, states_81,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000\000"},
-    {338, "comp_for", 0, 3, states_82,
-     "\000\000\040\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000\000"},
-    {339, "comp_if", 0, 4, states_83,
+    {331, "exprlist", 0, 3, states_75,
+     "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"},
+    {332, "testlist", 0, 3, states_76,
+     "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {333, "dictorsetmaker", 0, 14, states_77,
+     "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {334, "classdef", 0, 8, states_78,
+     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\010\000"},
+    {335, "arglist", 0, 3, states_79,
+     "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {336, "argument", 0, 4, states_80,
+     "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"},
+    {337, "comp_iter", 0, 2, states_81,
+     "\000\000\040\000\000\000\000\000\000\000\000\000\102\000\000\000\000\000\000\000\000\000\000"},
+    {338, "sync_comp_for", 0, 6, states_82,
+     "\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000"},
+    {339, "comp_for", 0, 3, states_83,
+     "\000\000\040\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000"},
+    {340, "comp_if", 0, 4, states_84,
      "\000\000\000\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000"},
-    {340, "encoding_decl", 0, 2, states_84,
+    {341, "encoding_decl", 0, 2, states_85,
      "\000\000\200\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"},
-    {341, "yield_expr", 0, 3, states_85,
-     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\200\000"},
-    {342, "yield_arg", 0, 3, states_86,
-     "\000\040\200\000\002\000\000\000\000\040\010\000\000\000\020\002\000\300\220\050\037\000\000"},
+    {342, "yield_expr", 0, 3, states_86,
+     "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\002"},
+    {343, "yield_arg", 0, 3, states_87,
+     "\000\040\200\000\002\000\000\000\000\040\010\000\000\000\100\010\000\000\103\242\174\000\000"},
 };
-static label labels[177] = {
+static label labels[179] = {
     {0, "EMPTY"},
     {256, 0},
     {4, 0},
@@ -2099,16 +2121,16 @@ static label labels[177] = {
     {269, 0},
     {0, 0},
     {258, 0},
-    {331, 0},
+    {332, 0},
     {259, 0},
     {49, 0},
     {291, 0},
     {7, 0},
-    {334, 0},
+    {335, 0},
     {8, 0},
     {260, 0},
     {261, 0},
-    {333, 0},
+    {334, 0},
     {263, 0},
     {262, 0},
     {1, "async"},
@@ -2116,7 +2138,7 @@ static label labels[177] = {
     {1, 0},
     {264, 0},
     {51, 0},
-    {305, 0},
+    {306, 0},
     {11, 0},
     {304, 0},
     {265, 0},
@@ -2140,8 +2162,8 @@ static label labels[177] = {
     {274, 0},
     {273, 0},
     {275, 0},
-    {341, 0},
-    {314, 0},
+    {342, 0},
+    {315, 0},
     {36, 0},
     {37, 0},
     {38, 0},
@@ -2156,7 +2178,7 @@ static label labels[177] = {
     {46, 0},
     {48, 0},
     {1, "del"},
-    {330, 0},
+    {331, 0},
     {1, "pass"},
     {279, 0},
     {280, 0},
@@ -2188,6 +2210,7 @@ static label labels[177] = {
     {301, 0},
     {296, 0},
     {1, "if"},
+    {305, 0},
     {1, "elif"},
     {1, "else"},
     {1, "while"},
@@ -2198,22 +2221,23 @@ static label labels[177] = {
     {1, "finally"},
     {1, "with"},
     {302, 0},
-    {315, 0},
+    {316, 0},
     {1, "except"},
     {5, 0},
     {6, 0},
-    {309, 0},
-    {307, 0},
-    {306, 0},
+    {53, 0},
+    {310, 0},
     {308, 0},
+    {307, 0},
+    {309, 0},
     {1, "lambda"},
-    {310, 0},
-    {1, "or"},
     {311, 0},
+    {1, "or"},
+    {312, 0},
     {1, "and"},
     {1, "not"},
-    {312, 0},
     {313, 0},
+    {314, 0},
     {20, 0},
     {21, 0},
     {27, 0},
@@ -2222,55 +2246,55 @@ static label labels[177] = {
     {28, 0},
     {28, 0},
     {1, "is"},
-    {316, 0},
-    {18, 0},
     {317, 0},
-    {32, 0},
+    {18, 0},
     {318, 0},
-    {19, 0},
+    {32, 0},
     {319, 0},
+    {19, 0},
+    {320, 0},
     {33, 0},
     {34, 0},
-    {320, 0},
+    {321, 0},
     {14, 0},
     {15, 0},
-    {321, 0},
+    {322, 0},
     {17, 0},
     {24, 0},
     {47, 0},
     {31, 0},
-    {322, 0},
     {323, 0},
-    {1, "await"},
     {324, 0},
-    {326, 0},
+    {1, "await"},
     {325, 0},
+    {327, 0},
+    {326, 0},
     {9, 0},
     {10, 0},
     {25, 0},
-    {332, 0},
+    {333, 0},
     {26, 0},
     {2, 0},
     {3, 0},
     {1, "None"},
     {1, "True"},
     {1, "False"},
-    {338, 0},
-    {327, 0},
+    {339, 0},
     {328, 0},
     {329, 0},
+    {330, 0},
     {1, "class"},
-    {335, 0},
     {336, 0},
-    {339, 0},
     {337, 0},
     {340, 0},
+    {338, 0},
+    {341, 0},
     {1, "yield"},
-    {342, 0},
+    {343, 0},
 };
 grammar _PyParser_Grammar = {
-    87,
+    88,
     dfas,
-    {177, labels},
+    {179, labels},
     256
 };
diff --git a/Python/symtable.c b/Python/symtable.c
index 677b6043438e..879e19ab79e0 100644
--- a/Python/symtable.c
+++ b/Python/symtable.c
@@ -31,6 +31,9 @@
 
 #define IMPORT_STAR_WARNING "import * only allowed at module level"
 
+#define NAMED_EXPR_COMP_IN_CLASS \
+"named expression within a comprehension cannot be used in a class body"
+
 static PySTEntryObject *
 ste_new(struct symtable *st, identifier name, _Py_block_ty block,
         void *key, int lineno, int col_offset)
@@ -75,6 +78,7 @@ ste_new(struct symtable *st, identifier name, _Py_block_ty block,
     ste->ste_child_free = 0;
     ste->ste_generator = 0;
     ste->ste_coroutine = 0;
+    ste->ste_comprehension = 0;
     ste->ste_returns_value = 0;
     ste->ste_needs_class_closure = 0;
 
@@ -972,7 +976,7 @@ symtable_lookup(struct symtable *st, PyObject *name)
 }
 
 static int
-symtable_add_def(struct symtable *st, PyObject *name, int flag)
+symtable_add_def_helper(struct symtable *st, PyObject *name, int flag, struct _symtable_entry *ste)
 {
     PyObject *o;
     PyObject *dict;
@@ -982,15 +986,15 @@ symtable_add_def(struct symtable *st, PyObject *name, int flag)
 
     if (!mangled)
         return 0;
-    dict = st->st_cur->ste_symbols;
+    dict = ste->ste_symbols;
     if ((o = PyDict_GetItem(dict, mangled))) {
         val = PyLong_AS_LONG(o);
         if ((flag & DEF_PARAM) && (val & DEF_PARAM)) {
             /* Is it better to use 'mangled' or 'name' here? */
             PyErr_Format(PyExc_SyntaxError, DUPLICATE_ARGUMENT, name);
             PyErr_SyntaxLocationObject(st->st_filename,
-                                       st->st_cur->ste_lineno,
-                                       st->st_cur->ste_col_offset + 1);
+                                       ste->ste_lineno,
+                                       ste->ste_col_offset + 1);
             goto error;
         }
         val |= flag;
@@ -1006,7 +1010,7 @@ symtable_add_def(struct symtable *st, PyObject *name, int flag)
     Py_DECREF(o);
 
     if (flag & DEF_PARAM) {
-        if (PyList_Append(st->st_cur->ste_varnames, mangled) < 0)
+        if (PyList_Append(ste->ste_varnames, mangled) < 0)
             goto error;
     } else      if (flag & DEF_GLOBAL) {
         /* XXX need to update DEF_GLOBAL for other flags too;
@@ -1032,6 +1036,11 @@ symtable_add_def(struct symtable *st, PyObject *name, int flag)
     return 0;
 }
 
+static int
+symtable_add_def(struct symtable *st, PyObject *name, int flag) {
+    return symtable_add_def_helper(st, name, flag, st->st_cur);
+}
+
 /* VISIT, VISIT_SEQ and VIST_SEQ_TAIL take an ASDL type as their second argument.
    They use the ASDL name to synthesize the name of the C type and the visit
    function.
@@ -1082,7 +1091,7 @@ symtable_add_def(struct symtable *st, PyObject *name, int flag)
 }
 
 static int
-symtable_record_directive(struct symtable *st, identifier name, stmt_ty s)
+symtable_record_directive(struct symtable *st, identifier name, int lineno, int col_offset)
 {
     PyObject *data, *mangled;
     int res;
@@ -1094,7 +1103,7 @@ symtable_record_directive(struct symtable *st, identifier name, stmt_ty s)
     mangled = _Py_Mangle(st->st_private, name);
     if (!mangled)
         return 0;
-    data = Py_BuildValue("(Nii)", mangled, s->lineno, s->col_offset);
+    data = Py_BuildValue("(Nii)", mangled, lineno, col_offset);
     if (!data)
         return 0;
     res = PyList_Append(st->st_cur->ste_directives, data);
@@ -1280,7 +1289,7 @@ symtable_visit_stmt(struct symtable *st, stmt_ty s)
             }
             if (!symtable_add_def(st, name, DEF_GLOBAL))
                 VISIT_QUIT(st, 0);
-            if (!symtable_record_directive(st, name, s))
+            if (!symtable_record_directive(st, name, s->lineno, s->col_offset))
                 VISIT_QUIT(st, 0);
         }
         break;
@@ -1312,7 +1321,7 @@ symtable_visit_stmt(struct symtable *st, stmt_ty s)
             }
             if (!symtable_add_def(st, name, DEF_NONLOCAL))
                 VISIT_QUIT(st, 0);
-            if (!symtable_record_directive(st, name, s))
+            if (!symtable_record_directive(st, name, s->lineno, s->col_offset))
                 VISIT_QUIT(st, 0);
         }
         break;
@@ -1367,6 +1376,60 @@ symtable_visit_stmt(struct symtable *st, stmt_ty s)
     VISIT_QUIT(st, 1);
 }
 
+static int
+symtable_extend_namedexpr_scope(struct symtable *st, expr_ty e)
+{
+    assert(st->st_stack);
+
+    Py_ssize_t i, size;
+    struct _symtable_entry *ste;
+    size = PyList_GET_SIZE(st->st_stack);
+    assert(size);
+
+    /* Iterate over the stack in reverse and add to the nearest adequate scope */
+    for (i = size - 1; i >= 0; i--) {
+        ste = (struct _symtable_entry *) PyList_GET_ITEM(st->st_stack, i);
+
+        /* If our current entry is a comprehension, skip it */
+        if (ste->ste_comprehension) {
+            continue;
+        }
+
+        /* If we find a FunctionBlock entry, add as NONLOCAL/LOCAL */
+        if (ste->ste_type == FunctionBlock) {
+            if (!symtable_add_def(st, e->v.Name.id, DEF_NONLOCAL))
+                VISIT_QUIT(st, 0);
+            if (!symtable_record_directive(st, e->v.Name.id, e->lineno, e->col_offset))
+                VISIT_QUIT(st, 0);
+
+            return symtable_add_def_helper(st, e->v.Name.id, DEF_LOCAL, ste);
+        }
+        /* If we find a ModuleBlock entry, add as GLOBAL */
+        if (ste->ste_type == ModuleBlock) {
+            if (!symtable_add_def(st, e->v.Name.id, DEF_GLOBAL))
+                VISIT_QUIT(st, 0);
+            if (!symtable_record_directive(st, e->v.Name.id, e->lineno, e->col_offset))
+                VISIT_QUIT(st, 0);
+
+            return symtable_add_def_helper(st, e->v.Name.id, DEF_GLOBAL, ste);
+        }
+        /* Disallow usage in ClassBlock */
+        if (ste->ste_type == ClassBlock) {
+            PyErr_Format(PyExc_TargetScopeError, NAMED_EXPR_COMP_IN_CLASS, e->v.Name.id);
+            PyErr_SyntaxLocationObject(st->st_filename,
+                                        e->lineno,
+                                        e->col_offset);
+            VISIT_QUIT(st, 0);
+        }
+    }
+
+    /* We should always find a FunctionBlock, ModuleBlock or ClassBlock entry
+       above, so we should never reach this point.
+    */
+    assert(0);
+    return 0;
+}
+
 static int
 symtable_visit_expr(struct symtable *st, expr_ty e)
 {
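A hedged sketch of the scoping rules this routine implements (per PEP 572: a
named-expression target inside a comprehension binds in the nearest enclosing
function or module scope; inside a class body it is rejected, at this stage with
TargetScopeError). Output abridged and illustrative of the intended behavior:

    >>> def f():
    ...     [last := x for x in range(3)]
    ...     return last          # 'last' escapes the comprehension into f's scope
    ...
    >>> f()
    2
    >>> class C:
    ...     [y := i for i in range(3)]
    ...
    TargetScopeError: named expression within a comprehension cannot be used in a class body
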
@@ -1376,6 +1439,10 @@ symtable_visit_expr(struct symtable *st, expr_ty e)
         VISIT_QUIT(st, 0);
     }
     switch (e->kind) {
+    case NamedExpr_kind:
+        VISIT(st, expr, e->v.NamedExpr.value);
+        VISIT(st, expr, e->v.NamedExpr.target);
+        break;
     case BoolOp_kind:
         VISIT_SEQ(st, expr, e->v.BoolOp.values);
         break;
@@ -1476,6 +1543,11 @@ symtable_visit_expr(struct symtable *st, expr_ty e)
         VISIT(st, expr, e->v.Starred.value);
         break;
     case Name_kind:
+        /* Special-case: named expr */
+        if (e->v.Name.ctx == NamedStore && st->st_cur->ste_comprehension) {
+            if(!symtable_extend_namedexpr_scope(st, e))
+                VISIT_QUIT(st, 0);
+        }
         if (!symtable_add_def(st, e->v.Name.id,
                               e->v.Name.ctx == Load ? USE : DEF_LOCAL))
             VISIT_QUIT(st, 0);
@@ -1713,6 +1785,8 @@ symtable_handle_comprehension(struct symtable *st, expr_ty e,
     if (outermost->is_async) {
         st->st_cur->ste_coroutine = 1;
     }
+    st->st_cur->ste_comprehension = 1;
+
     /* Outermost iter is received as an argument */
     if (!symtable_implicit_arg(st, 0)) {
         symtable_exit_block(st, (void *)e);
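The new ste_comprehension flag is what symtable_extend_namedexpr_scope checks in
order to skip past comprehension scopes. At module level the target is therefore
expected to land in the global namespace (illustrative of the intended behavior):

    >>> total = 0
    >>> [total := total + x for x in [1, 2, 3]]
    [1, 3, 6]
    >>> total
    6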


