From solipsis at pitrou.net Tue Jan 1 04:10:06 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 01 Jan 2019 09:10:06 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=7 Message-ID: <20190101091006.1.BAC9705B591A3BE8@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_asyncio leaked [3, 0, 0] memory blocks, sum=3 test_functools leaked [0, 3, 1] memory blocks, sum=4 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogL2YCta', '--timeout', '7200'] From webhook-mailer at python.org Tue Jan 1 20:57:47 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Wed, 02 Jan 2019 01:57:47 -0000 Subject: [Python-checkins] closes bpo-35630: Use code tag for 'python3' in 'README.rst' (GH-11394) Message-ID: https://github.com/python/cpython/commit/7e3fb40b923cb09ecc67816d3191197868593737 commit: 7e3fb40b923cb09ecc67816d3191197868593737 branch: master author: Suriyaa ??? committer: Benjamin Peterson date: 2019-01-01T17:57:42-08:00 summary: closes bpo-35630: Use code tag for 'python3' in 'README.rst' (GH-11394) files: M README.rst diff --git a/README.rst b/README.rst index 61281042e3bd..184ddb874885 100644 --- a/README.rst +++ b/README.rst @@ -63,7 +63,7 @@ On Unix, Linux, BSD, macOS, and Cygwin:: make test sudo make install -This will install Python as python3. +This will install Python as ``python3``. You can pass many options to the configure script; run ``./configure --help`` to find out more. On macOS and Cygwin, the executable is called ``python.exe``; From webhook-mailer at python.org Tue Jan 1 21:01:59 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Wed, 02 Jan 2019 02:01:59 -0000 Subject: [Python-checkins] closes bpo-35623: Fix integer overflow when sorting large lists (GH-11380) Message-ID: https://github.com/python/cpython/commit/f8b534477a2a51d85ea1663530f685f805f2b247 commit: f8b534477a2a51d85ea1663530f685f805f2b247 branch: master author: sth committer: Benjamin Peterson date: 2019-01-01T18:01:54-08:00 summary: closes bpo-35623: Fix integer overflow when sorting large lists (GH-11380) There is already a `Py_ssize_t i` defined at function scope that is used for similar loops. By removing the local `int i` declaration that `i` is used, which has the appropriate type. files: A Misc/NEWS.d/next/Core and Builtins/2018-12-31-02-37-20.bpo-35623.24AQhY.rst M Objects/listobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-12-31-02-37-20.bpo-35623.24AQhY.rst b/Misc/NEWS.d/next/Core and Builtins/2018-12-31-02-37-20.bpo-35623.24AQhY.rst new file mode 100644 index 000000000000..6e3df4dc5d4e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-12-31-02-37-20.bpo-35623.24AQhY.rst @@ -0,0 +1 @@ +Fix a crash when sorting very long lists. Patch by Stephan Hohe. diff --git a/Objects/listobject.c b/Objects/listobject.c index 17c37ba97560..e11a3fdd1358 100644 --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -2283,7 +2283,6 @@ list_sort_impl(PyListObject *self, PyObject *keyfunc, int reverse) int ints_are_bounded = 1; /* Prove that assumption by checking every key. 
*/ - int i; for (i=0; i < saved_ob_size; i++) { if (keys_are_in_tuples && From webhook-mailer at python.org Tue Jan 1 21:03:58 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 02 Jan 2019 02:03:58 -0000 Subject: [Python-checkins] closes bpo-35630: Use code tag for 'python3' in 'README.rst' (GH-11394) Message-ID: https://github.com/python/cpython/commit/513fab2c67365e1693216dd59e4df0a5f6bfeb26 commit: 513fab2c67365e1693216dd59e4df0a5f6bfeb26 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-01T18:03:53-08:00 summary: closes bpo-35630: Use code tag for 'python3' in 'README.rst' (GH-11394) (cherry picked from commit 7e3fb40b923cb09ecc67816d3191197868593737) Co-authored-by: Suriyaa ??? files: M README.rst diff --git a/README.rst b/README.rst index 216cfa334663..ddd3537493e8 100644 --- a/README.rst +++ b/README.rst @@ -60,7 +60,7 @@ On Unix, Linux, BSD, macOS, and Cygwin:: make test sudo make install -This will install Python as python3. +This will install Python as ``python3``. You can pass many options to the configure script; run ``./configure --help`` to find out more. On macOS and Cygwin, the executable is called ``python.exe``; From webhook-mailer at python.org Tue Jan 1 21:04:28 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Wed, 02 Jan 2019 02:04:28 -0000 Subject: [Python-checkins] closes bpo-35630: Use code tag for 'python3' in 'README.rst' (GH-11400) Message-ID: https://github.com/python/cpython/commit/de66b8d498e47ed9d70607ac9b9f72468241da77 commit: de66b8d498e47ed9d70607ac9b9f72468241da77 branch: 3.6 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Benjamin Peterson date: 2019-01-01T18:04:25-08:00 summary: closes bpo-35630: Use code tag for 'python3' in 'README.rst' (GH-11400) (cherry picked from commit 7e3fb40b923cb09ecc67816d3191197868593737) Co-authored-by: Suriyaa ??? files: M README.rst diff --git a/README.rst b/README.rst index 8c260626d74e..99558b8c551d 100644 --- a/README.rst +++ b/README.rst @@ -59,7 +59,7 @@ On Unix, Linux, BSD, macOS, and Cygwin:: make test sudo make install -This will install Python as python3. +This will install Python as ``python3``. You can pass many options to the configure script; run ``./configure --help`` to find out more. On macOS and Cygwin, the executable is called ``python.exe``; From webhook-mailer at python.org Tue Jan 1 21:25:26 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 02 Jan 2019 02:25:26 -0000 Subject: [Python-checkins] closes bpo-35623: Fix integer overflow when sorting large lists (GH-11380) Message-ID: https://github.com/python/cpython/commit/a5955b0895aa011b0beff1ceb6539b2ff8888425 commit: a5955b0895aa011b0beff1ceb6539b2ff8888425 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-01T18:25:23-08:00 summary: closes bpo-35623: Fix integer overflow when sorting large lists (GH-11380) There is already a `Py_ssize_t i` defined at function scope that is used for similar loops. By removing the local `int i` declaration that `i` is used, which has the appropriate type. 
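As a quick illustration of the wraparound the patch avoids (simulated from Python with ctypes; this snippet is not part of the commit):

    import ctypes

    # A C "int" is normally 32 bits, so a loop counter of that type silently
    # wraps once a list holds more than 2**31 - 1 items; the function-scope
    # Py_ssize_t counter is pointer-sized and does not.  ctypes only mimics
    # the C behaviour here -- the real loop lives in Objects/listobject.c.
    print(ctypes.c_int(2**31 - 1).value)    # 2147483647, still representable
    print(ctypes.c_int(2**31).value)        # -2147483648, the counter wrapped
    print(ctypes.c_ssize_t(2**31).value)    # 2147483648 on a 64-bit build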
(cherry picked from commit f8b534477a2a51d85ea1663530f685f805f2b247) Co-authored-by: sth files: A Misc/NEWS.d/next/Core and Builtins/2018-12-31-02-37-20.bpo-35623.24AQhY.rst M Objects/listobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-12-31-02-37-20.bpo-35623.24AQhY.rst b/Misc/NEWS.d/next/Core and Builtins/2018-12-31-02-37-20.bpo-35623.24AQhY.rst new file mode 100644 index 000000000000..6e3df4dc5d4e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-12-31-02-37-20.bpo-35623.24AQhY.rst @@ -0,0 +1 @@ +Fix a crash when sorting very long lists. Patch by Stephan Hohe. diff --git a/Objects/listobject.c b/Objects/listobject.c index c8ffeff09368..de73b8bf80fd 100644 --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -2232,7 +2232,6 @@ list_sort_impl(PyListObject *self, PyObject *keyfunc, int reverse) int ints_are_bounded = 1; /* Prove that assumption by checking every key. */ - int i; for (i=0; i < saved_ob_size; i++) { if (keys_are_in_tuples && From solipsis at pitrou.net Wed Jan 2 04:10:18 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 02 Jan 2019 09:10:18 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=3 Message-ID: <20190102091018.1.F1467B82EA65A579@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [0, 7, -7] memory blocks, sum=0 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_spawn leaked [0, -2, 1] memory blocks, sum=-1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/refloggszeUT', '--timeout', '7200'] From webhook-mailer at python.org Wed Jan 2 07:16:12 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Wed, 02 Jan 2019 12:16:12 -0000 Subject: [Python-checkins] bpo-35636: Remove redundant check in unicode_hash(). (GH-11402) Message-ID: https://github.com/python/cpython/commit/a1d14253066f7dd60cfb465c6511fa565f312b42 commit: a1d14253066f7dd60cfb465c6511fa565f312b42 branch: master author: animalize committer: Serhiy Storchaka date: 2019-01-02T14:16:06+02:00 summary: bpo-35636: Remove redundant check in unicode_hash(). (GH-11402) _Py_HashBytes() does the check for empty string. files: M Objects/unicodeobject.c diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index 06338fac2b28..f1dcfe9ab72a 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -11656,7 +11656,6 @@ unicode_getitem(PyObject *self, Py_ssize_t index) static Py_hash_t unicode_hash(PyObject *self) { - Py_ssize_t len; Py_uhash_t x; /* Unsigned for defined overflow behavior. 
*/ #ifdef Py_DEBUG @@ -11666,15 +11665,7 @@ unicode_hash(PyObject *self) return _PyUnicode_HASH(self); if (PyUnicode_READY(self) == -1) return -1; - len = PyUnicode_GET_LENGTH(self); - /* - We make the hash of the empty string be 0, rather than using - (prefix ^ suffix), since this slightly obfuscates the hash secret - */ - if (len == 0) { - _PyUnicode_HASH(self) = 0; - return 0; - } + x = _Py_HashBytes(PyUnicode_DATA(self), PyUnicode_GET_LENGTH(self) * PyUnicode_KIND(self)); _PyUnicode_HASH(self) = x; From webhook-mailer at python.org Wed Jan 2 07:22:09 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Wed, 02 Jan 2019 12:22:09 -0000 Subject: [Python-checkins] bpo-35588: Speed up mod, divmod and floordiv operations for Fraction type (#11322) Message-ID: https://github.com/python/cpython/commit/3a374e0c5abe805667b71ffaaa7614781101ff4c commit: 3a374e0c5abe805667b71ffaaa7614781101ff4c branch: master author: Stefan Behnel committer: Serhiy Storchaka date: 2019-01-02T14:22:06+02:00 summary: bpo-35588: Speed up mod, divmod and floordiv operations for Fraction type (#11322) * bpo-35588: Implement mod and divmod operations for Fraction type by spelling out the numerator/denominator calculation, instead of instantiating and normalising Fractions along the way. This speeds up '%' and divmod() by 2-3x. * bpo-35588: Also reimplement Fraction.__floordiv__() using integer operations to make it ~4x faster. * Improve code formatting. Co-Authored-By: scoder * bpo-35588: Fix return type of divmod(): the result of the integer division should be an integer. * bpo-35588: Further specialise __mod__() and inline the original helper function _flat_divmod() since it's no longer reused. * bpo-35588: Add some tests with large numerators and/or denominators. * bpo-35588: Use builtin "divmod()" function for implementing __divmod__() in order to simplify the implementation, even though performance results are mixed. * Rremove accidentally added empty line. * bpo-35588: Try to provide more informative output on test failures. * bpo-35588: Improve wording in News entry. Co-Authored-By: scoder * Remove stray space. 
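The integer identities that the rewritten helpers rely on can be checked with the same 7/3 and 3/2 operands used in the new tests. The sketch below is for reference only and is not code from the patch:

    from fractions import Fraction

    a, b = Fraction(7, 3), Fraction(3, 2)
    na, da = a.numerator, a.denominator          # 7, 3
    nb, db = b.numerator, b.denominator          # 3, 2

    # a // b == floor((na/da) / (nb/db)) == (na*db) // (da*nb), plain int math
    q = (na * db) // (da * nb)                   # 14 // 9 == 1
    # a % b == a - q*b; over the common denominator da*db this is one int "%"
    # followed by a single Fraction construction at the very end
    r = Fraction((na * db) % (nb * da), da * db) # Fraction(5, 6)

    assert q == a // b and r == a % b            # agrees with the operators
    print(q, r)                                  # 1 5/6

Doing the work on plain integers and building only one Fraction at the end, instead of instantiating and normalising intermediate Fractions, is where the speedup reported in the NEWS entry comes from.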
files: A Misc/NEWS.d/next/Library/2018-12-26-10-55-59.bpo-35588.PSR6Ez.rst M Lib/fractions.py M Lib/test/test_fractions.py diff --git a/Lib/fractions.py b/Lib/fractions.py index e0a024a03b14..4bbfc434f7d1 100644 --- a/Lib/fractions.py +++ b/Lib/fractions.py @@ -429,14 +429,22 @@ def _div(a, b): def _floordiv(a, b): """a // b""" - return math.floor(a / b) + return (a.numerator * b.denominator) // (a.denominator * b.numerator) __floordiv__, __rfloordiv__ = _operator_fallbacks(_floordiv, operator.floordiv) + def _divmod(a, b): + """(a // b, a % b)""" + da, db = a.denominator, b.denominator + div, n_mod = divmod(a.numerator * db, da * b.numerator) + return div, Fraction(n_mod, da * db) + + __divmod__, __rdivmod__ = _operator_fallbacks(_divmod, divmod) + def _mod(a, b): """a % b""" - div = a // b - return a - b * div + da, db = a.denominator, b.denominator + return Fraction((a.numerator * db) % (b.numerator * da), da * db) __mod__, __rmod__ = _operator_fallbacks(_mod, operator.mod) diff --git a/Lib/test/test_fractions.py b/Lib/test/test_fractions.py index 452f18126de9..277916220051 100644 --- a/Lib/test/test_fractions.py +++ b/Lib/test/test_fractions.py @@ -117,6 +117,11 @@ def assertTypedEquals(self, expected, actual): self.assertEqual(type(expected), type(actual)) self.assertEqual(expected, actual) + def assertTypedTupleEquals(self, expected, actual): + """Asserts that both the types and values in the tuples are the same.""" + self.assertTupleEqual(expected, actual) + self.assertListEqual(list(map(type, expected)), list(map(type, actual))) + def assertRaisesMessage(self, exc_type, message, callable, *args, **kwargs): """Asserts that callable(*args, **kwargs) raises exc_type(message).""" @@ -349,7 +354,10 @@ def testArithmetic(self): self.assertEqual(F(1, 4), F(1, 10) / F(2, 5)) self.assertTypedEquals(2, F(9, 10) // F(2, 5)) self.assertTypedEquals(10**23, F(10**23, 1) // F(1)) + self.assertEqual(F(5, 6), F(7, 3) % F(3, 2)) self.assertEqual(F(2, 3), F(-7, 3) % F(3, 2)) + self.assertEqual((F(1), F(5, 6)), divmod(F(7, 3), F(3, 2))) + self.assertEqual((F(-2), F(2, 3)), divmod(F(-7, 3), F(3, 2))) self.assertEqual(F(8, 27), F(2, 3) ** F(3)) self.assertEqual(F(27, 8), F(2, 3) ** F(-3)) self.assertTypedEquals(2.0, F(4) ** F(1, 2)) @@ -371,6 +379,40 @@ def testArithmetic(self): self.assertEqual(p.numerator, 4) self.assertEqual(p.denominator, 1) + def testLargeArithmetic(self): + self.assertTypedEquals( + F(10101010100808080808080808101010101010000000000000000, + 1010101010101010101010101011111111101010101010101010101010101), + F(10**35+1, 10**27+1) % F(10**27+1, 10**35-1) + ) + self.assertTypedEquals( + F(7, 1901475900342344102245054808064), + F(-2**100, 3) % F(5, 2**100) + ) + self.assertTypedTupleEquals( + (9999999999999999, + F(10101010100808080808080808101010101010000000000000000, + 1010101010101010101010101011111111101010101010101010101010101)), + divmod(F(10**35+1, 10**27+1), F(10**27+1, 10**35-1)) + ) + self.assertTypedEquals( + -2 ** 200 // 15, + F(-2**100, 3) // F(5, 2**100) + ) + self.assertTypedEquals( + 1, + F(5, 2**100) // F(3, 2**100) + ) + self.assertTypedEquals( + (1, F(2, 2**100)), + divmod(F(5, 2**100), F(3, 2**100)) + ) + self.assertTypedTupleEquals( + (-2 ** 200 // 15, + F(7, 1901475900342344102245054808064)), + divmod(F(-2**100, 3), F(5, 2**100)) + ) + def testMixedArithmetic(self): self.assertTypedEquals(F(11, 10), F(1, 10) + 1) self.assertTypedEquals(1.1, F(1, 10) + 1.0) @@ -415,7 +457,14 @@ def testMixedArithmetic(self): self.assertTypedEquals(float('inf'), F(-1, 10) % 
float('inf')) self.assertTypedEquals(-0.1, F(-1, 10) % float('-inf')) - # No need for divmod since we don't override it. + self.assertTypedTupleEquals((0, F(1, 10)), divmod(F(1, 10), 1)) + self.assertTypedTupleEquals(divmod(0.1, 1.0), divmod(F(1, 10), 1.0)) + self.assertTypedTupleEquals((10, F(0)), divmod(1, F(1, 10))) + self.assertTypedTupleEquals(divmod(1.0, 0.1), divmod(1.0, F(1, 10))) + self.assertTypedTupleEquals(divmod(0.1, float('inf')), divmod(F(1, 10), float('inf'))) + self.assertTypedTupleEquals(divmod(0.1, float('-inf')), divmod(F(1, 10), float('-inf'))) + self.assertTypedTupleEquals(divmod(-0.1, float('inf')), divmod(F(-1, 10), float('inf'))) + self.assertTypedTupleEquals(divmod(-0.1, float('-inf')), divmod(F(-1, 10), float('-inf'))) # ** has more interesting conversion rules. self.assertTypedEquals(F(100, 1), F(1, 10) ** -2) diff --git a/Misc/NEWS.d/next/Library/2018-12-26-10-55-59.bpo-35588.PSR6Ez.rst b/Misc/NEWS.d/next/Library/2018-12-26-10-55-59.bpo-35588.PSR6Ez.rst new file mode 100644 index 000000000000..270f556e76b2 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-12-26-10-55-59.bpo-35588.PSR6Ez.rst @@ -0,0 +1,2 @@ +The floor division and modulo operations and the :func:`divmod` function on :class:`fractions.Fraction` types are 2--4x faster. +Patch by Stefan Behnel. From webhook-mailer at python.org Wed Jan 2 07:49:29 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Wed, 02 Jan 2019 12:49:29 -0000 Subject: [Python-checkins] Revert "bpo-35603: Escape table header of make_table output that can cause potential XSS. (GH-11341)" (GH-11356) Message-ID: https://github.com/python/cpython/commit/830ddc74c495ac1a5c03172a31006074967571a3 commit: 830ddc74c495ac1a5c03172a31006074967571a3 branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-02T14:49:25+02:00 summary: Revert "bpo-35603: Escape table header of make_table output that can cause potential XSS. (GH-11341)" (GH-11356) This reverts commit 78de01198b047347abc5e458851bb12c48429e24. files: D Misc/NEWS.d/next/Library/2018-12-28-14-53-22.bpo-35603.rVCZAE.rst M Lib/difflib.py M Lib/test/test_difflib.py diff --git a/Lib/difflib.py b/Lib/difflib.py index 4571817b9823..887c3c26cae4 100644 --- a/Lib/difflib.py +++ b/Lib/difflib.py @@ -2036,10 +2036,6 @@ def make_table(self,fromlines,tolines,fromdesc='',todesc='',context=False, s.append( fmt % (next_id[i],next_href[i],fromlist[i], next_href[i],tolist[i])) if fromdesc or todesc: - fromdesc = fromdesc.replace("&", "&").replace(">", ">") \ - .replace("<", "<") - todesc = todesc.replace("&", "&").replace(">", ">") \ - .replace("<", "<") header_row = '%s%s%s%s' % ( '
', '%s' % fromdesc, diff --git a/Lib/test/test_difflib.py b/Lib/test/test_difflib.py index 63ebdb0dc83b..745ccbd6659e 100644 --- a/Lib/test/test_difflib.py +++ b/Lib/test/test_difflib.py @@ -238,15 +238,6 @@ def test_html_diff(self): with open(findfile('test_difflib_expect.html')) as fp: self.assertEqual(actual, fp.read()) - def test_make_table_escape_table_header(self): - html_diff = difflib.HtmlDiff() - output = html_diff.make_table(patch914575_from1.splitlines(), - patch914575_to1.splitlines(), - fromdesc='', - todesc='') - self.assertIn('<from>', output) - self.assertIn('<to>', output) - def test_recursion_limit(self): # Check if the problem described in patch #1413711 exists. limit = sys.getrecursionlimit() diff --git a/Misc/NEWS.d/next/Library/2018-12-28-14-53-22.bpo-35603.rVCZAE.rst b/Misc/NEWS.d/next/Library/2018-12-28-14-53-22.bpo-35603.rVCZAE.rst deleted file mode 100644 index 03150c3aa494..000000000000 --- a/Misc/NEWS.d/next/Library/2018-12-28-14-53-22.bpo-35603.rVCZAE.rst +++ /dev/null @@ -1,2 +0,0 @@ -Escape table header output of :meth:`difflib.HtmlDiff.make_table`. -Patch by Karthikeyan Singaravelan. From webhook-mailer at python.org Wed Jan 2 10:46:58 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Wed, 02 Jan 2019 15:46:58 -0000 Subject: [Python-checkins] Bump copyright years to 2019. (GH-11404) Message-ID: https://github.com/python/cpython/commit/9a69ae8a78785105ded02b083b2e5cd2dd939307 commit: 9a69ae8a78785105ded02b083b2e5cd2dd939307 branch: master author: Benjamin Peterson committer: GitHub date: 2019-01-02T07:46:53-08:00 summary: Bump copyright years to 2019. (GH-11404) files: M Doc/copyright.rst M Doc/license.rst M LICENSE M Mac/IDLE/IDLE.app/Contents/Info.plist M Mac/PythonLauncher/Info.plist.in M Mac/Resources/app/Info.plist.in M Mac/Resources/framework/Info.plist.in M Python/getcopyright.c M README.rst diff --git a/Doc/copyright.rst b/Doc/copyright.rst index 540ff5ef0593..393a1f03751f 100644 --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Copyright Python and this documentation is: -Copyright ? 2001-2018 Python Software Foundation. All rights reserved. +Copyright ? 2001-2019 Python Software Foundation. All rights reserved. Copyright ? 2000 BeOpen.com. All rights reserved. diff --git a/Doc/license.rst b/Doc/license.rst index 5f78240002f8..a315b6f8134d 100644 --- a/Doc/license.rst +++ b/Doc/license.rst @@ -87,7 +87,7 @@ PSF LICENSE AGREEMENT FOR PYTHON |release| analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ? 2001-2018 Python Software Foundation; All Rights + copyright, i.e., "Copyright ? 2001-2019 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. 
diff --git a/LICENSE b/LICENSE index 1afbedba92b3..9dc010d80348 100644 --- a/LICENSE +++ b/LICENSE @@ -73,8 +73,8 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation; All -Rights Reserved" are retained in Python alone or in any derivative version +2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation; +All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee. 3. In the event Licensee prepares a derivative work that is based on diff --git a/Mac/IDLE/IDLE.app/Contents/Info.plist b/Mac/IDLE/IDLE.app/Contents/Info.plist index 826d3793f75a..04a0a08c8363 100644 --- a/Mac/IDLE/IDLE.app/Contents/Info.plist +++ b/Mac/IDLE/IDLE.app/Contents/Info.plist @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %version%, ? 2001-2018 Python Software Foundation + %version%, ? 2001-2019 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index 5fa346ed4d3d..9fb4e0affd9c 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable Python Launcher CFBundleGetInfoString - %VERSION%, ? 2001-2018 Python Software Foundation + %VERSION%, ? 2001-2019 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in index abe9ae23e341..b7581984dd67 100644 --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2001-2018 Python Software Foundation. + %version%, (c) 2001-2019 Python Software Foundation. CFBundleName Python CFBundlePackageType diff --git a/Mac/Resources/framework/Info.plist.in b/Mac/Resources/framework/Info.plist.in index c1ea9f688920..0dc2e17156f1 100644 --- a/Mac/Resources/framework/Info.plist.in +++ b/Mac/Resources/framework/Info.plist.in @@ -17,9 +17,9 @@ CFBundlePackageType FMWK CFBundleShortVersionString - %VERSION%, (c) 2001-2018 Python Software Foundation. + %VERSION%, (c) 2001-2019 Python Software Foundation. CFBundleLongVersionString - %VERSION%, (c) 2001-2018 Python Software Foundation. + %VERSION%, (c) 2001-2019 Python Software Foundation. CFBundleSignature ???? CFBundleVersion diff --git a/Python/getcopyright.c b/Python/getcopyright.c index 51772ecef317..27a1731f46de 100644 --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static const char cprt[] = "\ -Copyright (c) 2001-2018 Python Software Foundation.\n\ +Copyright (c) 2001-2019 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README.rst b/README.rst index 184ddb874885..6a70f6c8eef8 100644 --- a/README.rst +++ b/README.rst @@ -22,7 +22,7 @@ This is Python version 3.8.0 alpha 0 :target: https://python.zulipchat.com -Copyright (c) 2001-2018 Python Software Foundation. All rights reserved. +Copyright (c) 2001-2019 Python Software Foundation. All rights reserved. See the end of this file for further copyright and license information. 
@@ -249,7 +249,7 @@ See :pep:`569` for Python 3.8 release details. Copyright and License Information --------------------------------- -Copyright (c) 2001-2018 Python Software Foundation. All rights reserved. +Copyright (c) 2001-2019 Python Software Foundation. All rights reserved. Copyright (c) 2000 BeOpen.com. All rights reserved. From webhook-mailer at python.org Wed Jan 2 11:15:57 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Wed, 02 Jan 2019 16:15:57 -0000 Subject: [Python-checkins] [3.7] Bump copyright years to 2019. (GH-11406) Message-ID: https://github.com/python/cpython/commit/d634abd1234654c9969b60e4688acb902eea69f9 commit: d634abd1234654c9969b60e4688acb902eea69f9 branch: 3.7 author: Benjamin Peterson committer: GitHub date: 2019-01-02T08:15:53-08:00 summary: [3.7] Bump copyright years to 2019. (GH-11406) (cherry picked from commit 9a69ae8a78785105ded02b083b2e5cd2dd939307) files: M Doc/copyright.rst M Doc/license.rst M LICENSE M Mac/IDLE/IDLE.app/Contents/Info.plist M Mac/PythonLauncher/Info.plist.in M Mac/Resources/app/Info.plist.in M Mac/Resources/framework/Info.plist.in M Python/getcopyright.c M README.rst diff --git a/Doc/copyright.rst b/Doc/copyright.rst index 540ff5ef0593..393a1f03751f 100644 --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Copyright Python and this documentation is: -Copyright ? 2001-2018 Python Software Foundation. All rights reserved. +Copyright ? 2001-2019 Python Software Foundation. All rights reserved. Copyright ? 2000 BeOpen.com. All rights reserved. diff --git a/Doc/license.rst b/Doc/license.rst index 5f78240002f8..a315b6f8134d 100644 --- a/Doc/license.rst +++ b/Doc/license.rst @@ -87,7 +87,7 @@ PSF LICENSE AGREEMENT FOR PYTHON |release| analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ? 2001-2018 Python Software Foundation; All Rights + copyright, i.e., "Copyright ? 2001-2019 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. diff --git a/LICENSE b/LICENSE index 1afbedba92b3..9dc010d80348 100644 --- a/LICENSE +++ b/LICENSE @@ -73,8 +73,8 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation; All -Rights Reserved" are retained in Python alone or in any derivative version +2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation; +All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee. 3. In the event Licensee prepares a derivative work that is based on diff --git a/Mac/IDLE/IDLE.app/Contents/Info.plist b/Mac/IDLE/IDLE.app/Contents/Info.plist index 826d3793f75a..04a0a08c8363 100644 --- a/Mac/IDLE/IDLE.app/Contents/Info.plist +++ b/Mac/IDLE/IDLE.app/Contents/Info.plist @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %version%, ? 2001-2018 Python Software Foundation + %version%, ? 
2001-2019 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index 5fa346ed4d3d..9fb4e0affd9c 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable Python Launcher CFBundleGetInfoString - %VERSION%, ? 2001-2018 Python Software Foundation + %VERSION%, ? 2001-2019 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in index abe9ae23e341..b7581984dd67 100644 --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2001-2018 Python Software Foundation. + %version%, (c) 2001-2019 Python Software Foundation. CFBundleName Python CFBundlePackageType diff --git a/Mac/Resources/framework/Info.plist.in b/Mac/Resources/framework/Info.plist.in index c1ea9f688920..0dc2e17156f1 100644 --- a/Mac/Resources/framework/Info.plist.in +++ b/Mac/Resources/framework/Info.plist.in @@ -17,9 +17,9 @@ CFBundlePackageType FMWK CFBundleShortVersionString - %VERSION%, (c) 2001-2018 Python Software Foundation. + %VERSION%, (c) 2001-2019 Python Software Foundation. CFBundleLongVersionString - %VERSION%, (c) 2001-2018 Python Software Foundation. + %VERSION%, (c) 2001-2019 Python Software Foundation. CFBundleSignature ???? CFBundleVersion diff --git a/Python/getcopyright.c b/Python/getcopyright.c index 51772ecef317..27a1731f46de 100644 --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static const char cprt[] = "\ -Copyright (c) 2001-2018 Python Software Foundation.\n\ +Copyright (c) 2001-2019 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README.rst b/README.rst index ddd3537493e8..0a375f872421 100644 --- a/README.rst +++ b/README.rst @@ -18,8 +18,8 @@ This is Python version 3.7.2+ :target: https://codecov.io/gh/python/cpython Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation. All rights -reserved. +2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation. All +rights reserved. See the end of this file for further copyright and license information. @@ -247,8 +247,8 @@ Copyright and License Information --------------------------------- Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation. All rights -reserved. +2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation. All +rights reserved. Copyright (c) 2000 BeOpen.com. All rights reserved. From webhook-mailer at python.org Wed Jan 2 11:23:54 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Wed, 02 Jan 2019 16:23:54 -0000 Subject: [Python-checkins] [3.6] Bump copyright years to 2019. (GH-11407) Message-ID: https://github.com/python/cpython/commit/c2340619a772849eb7c4b038ca57ca093c738ed8 commit: c2340619a772849eb7c4b038ca57ca093c738ed8 branch: 3.6 author: Benjamin Peterson committer: GitHub date: 2019-01-02T08:23:51-08:00 summary: [3.6] Bump copyright years to 2019. 
(GH-11407) (cherry picked from commit 9a69ae8a78785105ded02b083b2e5cd2dd939307) files: M Doc/copyright.rst M Doc/license.rst M LICENSE M Mac/IDLE/IDLE.app/Contents/Info.plist M Mac/PythonLauncher/Info.plist.in M Mac/Resources/app/Info.plist.in M Mac/Resources/framework/Info.plist.in M Python/getcopyright.c M README.rst diff --git a/Doc/copyright.rst b/Doc/copyright.rst index 540ff5ef0593..393a1f03751f 100644 --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Copyright Python and this documentation is: -Copyright ? 2001-2018 Python Software Foundation. All rights reserved. +Copyright ? 2001-2019 Python Software Foundation. All rights reserved. Copyright ? 2000 BeOpen.com. All rights reserved. diff --git a/Doc/license.rst b/Doc/license.rst index aa4b1450e91a..a7df1b992044 100644 --- a/Doc/license.rst +++ b/Doc/license.rst @@ -87,7 +87,7 @@ PSF LICENSE AGREEMENT FOR PYTHON |release| analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ? 2001-2018 Python Software Foundation; All Rights + copyright, i.e., "Copyright ? 2001-2019 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. diff --git a/LICENSE b/LICENSE index 1afbedba92b3..9dc010d80348 100644 --- a/LICENSE +++ b/LICENSE @@ -73,8 +73,8 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation; All -Rights Reserved" are retained in Python alone or in any derivative version +2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation; +All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee. 3. In the event Licensee prepares a derivative work that is based on diff --git a/Mac/IDLE/IDLE.app/Contents/Info.plist b/Mac/IDLE/IDLE.app/Contents/Info.plist index 826d3793f75a..04a0a08c8363 100644 --- a/Mac/IDLE/IDLE.app/Contents/Info.plist +++ b/Mac/IDLE/IDLE.app/Contents/Info.plist @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %version%, ? 2001-2018 Python Software Foundation + %version%, ? 2001-2019 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index 5fa346ed4d3d..9fb4e0affd9c 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable Python Launcher CFBundleGetInfoString - %VERSION%, ? 2001-2018 Python Software Foundation + %VERSION%, ? 2001-2019 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in index abe9ae23e341..b7581984dd67 100644 --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2001-2018 Python Software Foundation. + %version%, (c) 2001-2019 Python Software Foundation. 
CFBundleName Python CFBundlePackageType diff --git a/Mac/Resources/framework/Info.plist.in b/Mac/Resources/framework/Info.plist.in index c1ea9f688920..0dc2e17156f1 100644 --- a/Mac/Resources/framework/Info.plist.in +++ b/Mac/Resources/framework/Info.plist.in @@ -17,9 +17,9 @@ CFBundlePackageType FMWK CFBundleShortVersionString - %VERSION%, (c) 2001-2018 Python Software Foundation. + %VERSION%, (c) 2001-2019 Python Software Foundation. CFBundleLongVersionString - %VERSION%, (c) 2001-2018 Python Software Foundation. + %VERSION%, (c) 2001-2019 Python Software Foundation. CFBundleSignature ???? CFBundleVersion diff --git a/Python/getcopyright.c b/Python/getcopyright.c index 51772ecef317..27a1731f46de 100644 --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static const char cprt[] = "\ -Copyright (c) 2001-2018 Python Software Foundation.\n\ +Copyright (c) 2001-2019 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README.rst b/README.rst index 99558b8c551d..e1d13b07ee9f 100644 --- a/README.rst +++ b/README.rst @@ -18,8 +18,8 @@ This is Python version 3.6.8+ :target: https://codecov.io/gh/python/cpython Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation. All rights -reserved. +2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation. All +rights reserved. See the end of this file for further copyright and license information. @@ -245,8 +245,8 @@ Copyright and License Information --------------------------------- Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation. All rights -reserved. +2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation. All +rights reserved. Copyright (c) 2000 BeOpen.com. All rights reserved. From webhook-mailer at python.org Wed Jan 2 11:43:36 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Wed, 02 Jan 2019 16:43:36 -0000 Subject: [Python-checkins] [2.7] Bump copyright years to 2019. (GH-11408) Message-ID: https://github.com/python/cpython/commit/5a89c71580529549e71567abf557c812eb470b2b commit: 5a89c71580529549e71567abf557c812eb470b2b branch: 2.7 author: Benjamin Peterson committer: GitHub date: 2019-01-02T08:43:32-08:00 summary: [2.7] Bump copyright years to 2019. (GH-11408) (cherry picked from commit 9a69ae8a78785105ded02b083b2e5cd2dd939307) files: A README.rst M Doc/copyright.rst M Doc/license.rst M LICENSE M Mac/PythonLauncher/Info.plist.in M Mac/Resources/app/Info.plist.in M Mac/Resources/framework/Info.plist.in M Python/getcopyright.c M README diff --git a/Doc/copyright.rst b/Doc/copyright.rst index 540ff5ef0593..393a1f03751f 100644 --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Copyright Python and this documentation is: -Copyright ? 2001-2018 Python Software Foundation. All rights reserved. +Copyright ? 2001-2019 Python Software Foundation. All rights reserved. Copyright ? 2000 BeOpen.com. All rights reserved. 
diff --git a/Doc/license.rst b/Doc/license.rst index f33495ab1ef9..eded4a935ddf 100644 --- a/Doc/license.rst +++ b/Doc/license.rst @@ -87,7 +87,7 @@ PSF LICENSE AGREEMENT FOR PYTHON |release| analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ? 2001-2018 Python Software Foundation; All Rights + copyright, i.e., "Copyright ? 2001-2019 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. diff --git a/LICENSE b/LICENSE index 1afbedba92b3..9dc010d80348 100644 --- a/LICENSE +++ b/LICENSE @@ -73,8 +73,8 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation; All -Rights Reserved" are retained in Python alone or in any derivative version +2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation; +All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee. 3. In the event Licensee prepares a derivative work that is based on diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index b84fffeec64a..77dbb0fe84e4 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable PythonLauncher CFBundleGetInfoString - %VERSION%, ? 2001-2018 Python Software Foundation + %VERSION%, ? 2001-2019 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in index abe9ae23e341..b7581984dd67 100644 --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2001-2018 Python Software Foundation. + %version%, (c) 2001-2019 Python Software Foundation. CFBundleName Python CFBundlePackageType diff --git a/Mac/Resources/framework/Info.plist.in b/Mac/Resources/framework/Info.plist.in index c1ea9f688920..0dc2e17156f1 100644 --- a/Mac/Resources/framework/Info.plist.in +++ b/Mac/Resources/framework/Info.plist.in @@ -17,9 +17,9 @@ CFBundlePackageType FMWK CFBundleShortVersionString - %VERSION%, (c) 2001-2018 Python Software Foundation. + %VERSION%, (c) 2001-2019 Python Software Foundation. CFBundleLongVersionString - %VERSION%, (c) 2001-2018 Python Software Foundation. + %VERSION%, (c) 2001-2019 Python Software Foundation. CFBundleSignature ???? 
CFBundleVersion diff --git a/Python/getcopyright.c b/Python/getcopyright.c index 1b69012fbc17..0ef16d092381 100644 --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static char cprt[] = "\ -Copyright (c) 2001-2018 Python Software Foundation.\n\ +Copyright (c) 2001-2019 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README b/README index 8a8883631f30..078273a52f13 100644 --- a/README +++ b/README @@ -2,8 +2,8 @@ This is Python version 2.7.15 ============================= Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation. All rights -reserved. +2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation. All +rights reserved. Copyright (c) 2000 BeOpen.com. All rights reserved. diff --git a/README.rst b/README.rst new file mode 100644 index 000000000000..6a70f6c8eef8 --- /dev/null +++ b/README.rst @@ -0,0 +1,268 @@ +This is Python version 3.8.0 alpha 0 +==================================== + +.. image:: https://travis-ci.org/python/cpython.svg?branch=master + :alt: CPython build status on Travis CI + :target: https://travis-ci.org/python/cpython + +.. image:: https://ci.appveyor.com/api/projects/status/4mew1a93xdkbf5ua/branch/master?svg=true + :alt: CPython build status on Appveyor + :target: https://ci.appveyor.com/project/python/cpython/branch/master + +.. image:: https://dev.azure.com/python/cpython/_apis/build/status/Azure%20Pipelines%20CI?branchName=master + :alt: CPython build status on Azure DevOps + :target: https://dev.azure.com/python/cpython/_build/latest?definitionId=4&branchName=master + +.. image:: https://codecov.io/gh/python/cpython/branch/master/graph/badge.svg + :alt: CPython code coverage on Codecov + :target: https://codecov.io/gh/python/cpython + +.. image:: https://img.shields.io/badge/zulip-join_chat-brightgreen.svg + :alt: Python Zulip chat + :target: https://python.zulipchat.com + + +Copyright (c) 2001-2019 Python Software Foundation. All rights reserved. + +See the end of this file for further copyright and license information. + +.. contents:: + +General Information +------------------- + +- Website: https://www.python.org +- Source code: https://github.com/python/cpython +- Issue tracker: https://bugs.python.org +- Documentation: https://docs.python.org +- Developer's Guide: https://devguide.python.org/ + +Contributing to CPython +----------------------- + +For more complete instructions on contributing to CPython development, +see the `Developer Guide`_. + +.. _Developer Guide: https://devguide.python.org/ + +Using Python +------------ + +Installable Python kits, and information about using Python, are available at +`python.org`_. + +.. _python.org: https://www.python.org/ + +Build Instructions +------------------ + +On Unix, Linux, BSD, macOS, and Cygwin:: + + ./configure + make + make test + sudo make install + +This will install Python as ``python3``. + +You can pass many options to the configure script; run ``./configure --help`` +to find out more. On macOS and Cygwin, the executable is called ``python.exe``; +elsewhere it's just ``python``. + +If you are running on macOS with the latest updates installed, make sure to install +openSSL or some other SSL software along with Homebrew or another package manager. +If issues persist, see https://devguide.python.org/setup/#macos-and-os-x for more +information. 
+ +On macOS, if you have configured Python with ``--enable-framework``, you +should use ``make frameworkinstall`` to do the installation. Note that this +installs the Python executable in a place that is not normally on your PATH, +you may want to set up a symlink in ``/usr/local/bin``. + +On Windows, see `PCbuild/readme.txt +`_. + +If you wish, you can create a subdirectory and invoke configure from there. +For example:: + + mkdir debug + cd debug + ../configure --with-pydebug + make + make test + +(This will fail if you *also* built at the top-level directory. You should do +a ``make clean`` at the toplevel first.) + +To get an optimized build of Python, ``configure --enable-optimizations`` +before you run ``make``. This sets the default make targets up to enable +Profile Guided Optimization (PGO) and may be used to auto-enable Link Time +Optimization (LTO) on some platforms. For more details, see the sections +below. + + +Profile Guided Optimization +^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +PGO takes advantage of recent versions of the GCC or Clang compilers. If used, +either via ``configure --enable-optimizations`` or by manually running +``make profile-opt`` regardless of configure flags, the optimized build +process will perform the following steps: + +The entire Python directory is cleaned of temporary files that may have +resulted from a previous compilation. + +An instrumented version of the interpreter is built, using suitable compiler +flags for each flavour. Note that this is just an intermediary step. The +binary resulting from this step is not good for real life workloads as it has +profiling instructions embedded inside. + +After the instrumented interpreter is built, the Makefile will run a training +workload. This is necessary in order to profile the interpreter execution. +Note also that any output, both stdout and stderr, that may appear at this step +is suppressed. + +The final step is to build the actual interpreter, using the information +collected from the instrumented one. The end result will be a Python binary +that is optimized; suitable for distribution or production installation. + + +Link Time Optimization +^^^^^^^^^^^^^^^^^^^^^^ + +Enabled via configure's ``--with-lto`` flag. LTO takes advantage of the +ability of recent compiler toolchains to optimize across the otherwise +arbitrary ``.o`` file boundary when building final executables or shared +libraries for additional performance gains. + + +What's New +---------- + +We have a comprehensive overview of the changes in the `What's New in Python +3.8 `_ document. For a more +detailed change log, read `Misc/NEWS +`_, but a full +accounting of changes can only be gleaned from the `commit history +`_. + +If you want to install multiple versions of Python see the section below +entitled "Installing multiple versions". + + +Documentation +------------- + +`Documentation for Python 3.8 `_ is online, +updated daily. + +It can also be downloaded in many formats for faster access. The documentation +is downloadable in HTML, PDF, and reStructuredText formats; the latter version +is primarily for documentation authors, translators, and people with special +formatting requirements. + +For information about building Python's documentation, refer to `Doc/README.rst +`_. + + +Converting From Python 2.x to 3.x +--------------------------------- + +Significant backward incompatible changes were made for the release of Python +3.0, which may cause programs written for Python 2 to fail when run with Python +3. 
For more information about porting your code from Python 2 to Python 3, see +the `Porting HOWTO `_. + + +Testing +------- + +To test the interpreter, type ``make test`` in the top-level directory. The +test set produces some output. You can generally ignore the messages about +skipped tests due to optional features which can't be imported. If a message +is printed about a failed test or a traceback or core dump is produced, +something is wrong. + +By default, tests are prevented from overusing resources like disk space and +memory. To enable these tests, run ``make testall``. + +If any tests fail, you can re-run the failing test(s) in verbose mode. For +example, if ``test_os`` and ``test_gdb`` failed, you can run:: + + make test TESTOPTS="-v test_os test_gdb" + +If the failure persists and appears to be a problem with Python rather than +your environment, you can `file a bug report `_ and +include relevant output from that command to show the issue. + +See `Running & Writing Tests `_ +for more on running tests. + +Installing multiple versions +---------------------------- + +On Unix and Mac systems if you intend to install multiple versions of Python +using the same installation prefix (``--prefix`` argument to the configure +script) you must take care that your primary python executable is not +overwritten by the installation of a different version. All files and +directories installed using ``make altinstall`` contain the major and minor +version and can thus live side-by-side. ``make install`` also creates +``${prefix}/bin/python3`` which refers to ``${prefix}/bin/pythonX.Y``. If you +intend to install multiple versions using the same prefix you must decide which +version (if any) is your "primary" version. Install that version using ``make +install``. Install all other versions using ``make altinstall``. + +For example, if you want to install Python 2.7, 3.6, and 3.8 with 3.8 being the +primary version, you would execute ``make install`` in your 3.8 build directory +and ``make altinstall`` in the others. + + +Issue Tracker and Mailing List +------------------------------ + +Bug reports are welcome! You can use the `issue tracker +`_ to report bugs, and/or submit pull requests `on +GitHub `_. + +You can also follow development discussion on the `python-dev mailing list +`_. + + +Proposals for enhancement +------------------------- + +If you have a proposal to change Python, you may want to send an email to the +comp.lang.python or `python-ideas`_ mailing lists for initial feedback. A +Python Enhancement Proposal (PEP) may be submitted if your idea gains ground. +All current PEPs, as well as guidelines for submitting a new PEP, are listed at +`python.org/dev/peps/ `_. + +.. _python-ideas: https://mail.python.org/mailman/listinfo/python-ideas/ + + +Release Schedule +---------------- + +See :pep:`569` for Python 3.8 release details. + + +Copyright and License Information +--------------------------------- + +Copyright (c) 2001-2019 Python Software Foundation. All rights reserved. + +Copyright (c) 2000 BeOpen.com. All rights reserved. + +Copyright (c) 1995-2001 Corporation for National Research Initiatives. All +rights reserved. + +Copyright (c) 1991-1995 Stichting Mathematisch Centrum. All rights reserved. + +See the file "LICENSE" for information on the history of this software, terms & +conditions for usage, and a DISCLAIMER OF ALL WARRANTIES. + +This Python distribution contains *no* GNU General Public License (GPL) code, +so it may be used in proprietary projects. 
There are interfaces to some GNU +code but these are entirely optional. + +All trademarks referenced herein are property of their respective holders. From webhook-mailer at python.org Wed Jan 2 12:14:33 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Wed, 02 Jan 2019 17:14:33 -0000 Subject: [Python-checkins] Remove README.rst inadvertandly "backported" from 3.x in 5a89c71580529549e71567abf557c812eb470b2b. (GH-11409) Message-ID: https://github.com/python/cpython/commit/112e4afd582515fcdcc0cde5012a4866e5cfda12 commit: 112e4afd582515fcdcc0cde5012a4866e5cfda12 branch: 2.7 author: Benjamin Peterson committer: GitHub date: 2019-01-02T09:14:30-08:00 summary: Remove README.rst inadvertandly "backported" from 3.x in 5a89c71580529549e71567abf557c812eb470b2b. (GH-11409) files: D README.rst diff --git a/README.rst b/README.rst deleted file mode 100644 index 6a70f6c8eef8..000000000000 --- a/README.rst +++ /dev/null @@ -1,268 +0,0 @@ -This is Python version 3.8.0 alpha 0 -==================================== - -.. image:: https://travis-ci.org/python/cpython.svg?branch=master - :alt: CPython build status on Travis CI - :target: https://travis-ci.org/python/cpython - -.. image:: https://ci.appveyor.com/api/projects/status/4mew1a93xdkbf5ua/branch/master?svg=true - :alt: CPython build status on Appveyor - :target: https://ci.appveyor.com/project/python/cpython/branch/master - -.. image:: https://dev.azure.com/python/cpython/_apis/build/status/Azure%20Pipelines%20CI?branchName=master - :alt: CPython build status on Azure DevOps - :target: https://dev.azure.com/python/cpython/_build/latest?definitionId=4&branchName=master - -.. image:: https://codecov.io/gh/python/cpython/branch/master/graph/badge.svg - :alt: CPython code coverage on Codecov - :target: https://codecov.io/gh/python/cpython - -.. image:: https://img.shields.io/badge/zulip-join_chat-brightgreen.svg - :alt: Python Zulip chat - :target: https://python.zulipchat.com - - -Copyright (c) 2001-2019 Python Software Foundation. All rights reserved. - -See the end of this file for further copyright and license information. - -.. contents:: - -General Information -------------------- - -- Website: https://www.python.org -- Source code: https://github.com/python/cpython -- Issue tracker: https://bugs.python.org -- Documentation: https://docs.python.org -- Developer's Guide: https://devguide.python.org/ - -Contributing to CPython ------------------------ - -For more complete instructions on contributing to CPython development, -see the `Developer Guide`_. - -.. _Developer Guide: https://devguide.python.org/ - -Using Python ------------- - -Installable Python kits, and information about using Python, are available at -`python.org`_. - -.. _python.org: https://www.python.org/ - -Build Instructions ------------------- - -On Unix, Linux, BSD, macOS, and Cygwin:: - - ./configure - make - make test - sudo make install - -This will install Python as ``python3``. - -You can pass many options to the configure script; run ``./configure --help`` -to find out more. On macOS and Cygwin, the executable is called ``python.exe``; -elsewhere it's just ``python``. - -If you are running on macOS with the latest updates installed, make sure to install -openSSL or some other SSL software along with Homebrew or another package manager. -If issues persist, see https://devguide.python.org/setup/#macos-and-os-x for more -information. 
- -On macOS, if you have configured Python with ``--enable-framework``, you -should use ``make frameworkinstall`` to do the installation. Note that this -installs the Python executable in a place that is not normally on your PATH, -you may want to set up a symlink in ``/usr/local/bin``. - -On Windows, see `PCbuild/readme.txt -`_. - -If you wish, you can create a subdirectory and invoke configure from there. -For example:: - - mkdir debug - cd debug - ../configure --with-pydebug - make - make test - -(This will fail if you *also* built at the top-level directory. You should do -a ``make clean`` at the toplevel first.) - -To get an optimized build of Python, ``configure --enable-optimizations`` -before you run ``make``. This sets the default make targets up to enable -Profile Guided Optimization (PGO) and may be used to auto-enable Link Time -Optimization (LTO) on some platforms. For more details, see the sections -below. - - -Profile Guided Optimization -^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -PGO takes advantage of recent versions of the GCC or Clang compilers. If used, -either via ``configure --enable-optimizations`` or by manually running -``make profile-opt`` regardless of configure flags, the optimized build -process will perform the following steps: - -The entire Python directory is cleaned of temporary files that may have -resulted from a previous compilation. - -An instrumented version of the interpreter is built, using suitable compiler -flags for each flavour. Note that this is just an intermediary step. The -binary resulting from this step is not good for real life workloads as it has -profiling instructions embedded inside. - -After the instrumented interpreter is built, the Makefile will run a training -workload. This is necessary in order to profile the interpreter execution. -Note also that any output, both stdout and stderr, that may appear at this step -is suppressed. - -The final step is to build the actual interpreter, using the information -collected from the instrumented one. The end result will be a Python binary -that is optimized; suitable for distribution or production installation. - - -Link Time Optimization -^^^^^^^^^^^^^^^^^^^^^^ - -Enabled via configure's ``--with-lto`` flag. LTO takes advantage of the -ability of recent compiler toolchains to optimize across the otherwise -arbitrary ``.o`` file boundary when building final executables or shared -libraries for additional performance gains. - - -What's New ----------- - -We have a comprehensive overview of the changes in the `What's New in Python -3.8 `_ document. For a more -detailed change log, read `Misc/NEWS -`_, but a full -accounting of changes can only be gleaned from the `commit history -`_. - -If you want to install multiple versions of Python see the section below -entitled "Installing multiple versions". - - -Documentation -------------- - -`Documentation for Python 3.8 `_ is online, -updated daily. - -It can also be downloaded in many formats for faster access. The documentation -is downloadable in HTML, PDF, and reStructuredText formats; the latter version -is primarily for documentation authors, translators, and people with special -formatting requirements. - -For information about building Python's documentation, refer to `Doc/README.rst -`_. - - -Converting From Python 2.x to 3.x ---------------------------------- - -Significant backward incompatible changes were made for the release of Python -3.0, which may cause programs written for Python 2 to fail when run with Python -3. 
For more information about porting your code from Python 2 to Python 3, see -the `Porting HOWTO `_. - - -Testing -------- - -To test the interpreter, type ``make test`` in the top-level directory. The -test set produces some output. You can generally ignore the messages about -skipped tests due to optional features which can't be imported. If a message -is printed about a failed test or a traceback or core dump is produced, -something is wrong. - -By default, tests are prevented from overusing resources like disk space and -memory. To enable these tests, run ``make testall``. - -If any tests fail, you can re-run the failing test(s) in verbose mode. For -example, if ``test_os`` and ``test_gdb`` failed, you can run:: - - make test TESTOPTS="-v test_os test_gdb" - -If the failure persists and appears to be a problem with Python rather than -your environment, you can `file a bug report `_ and -include relevant output from that command to show the issue. - -See `Running & Writing Tests `_ -for more on running tests. - -Installing multiple versions ----------------------------- - -On Unix and Mac systems if you intend to install multiple versions of Python -using the same installation prefix (``--prefix`` argument to the configure -script) you must take care that your primary python executable is not -overwritten by the installation of a different version. All files and -directories installed using ``make altinstall`` contain the major and minor -version and can thus live side-by-side. ``make install`` also creates -``${prefix}/bin/python3`` which refers to ``${prefix}/bin/pythonX.Y``. If you -intend to install multiple versions using the same prefix you must decide which -version (if any) is your "primary" version. Install that version using ``make -install``. Install all other versions using ``make altinstall``. - -For example, if you want to install Python 2.7, 3.6, and 3.8 with 3.8 being the -primary version, you would execute ``make install`` in your 3.8 build directory -and ``make altinstall`` in the others. - - -Issue Tracker and Mailing List ------------------------------- - -Bug reports are welcome! You can use the `issue tracker -`_ to report bugs, and/or submit pull requests `on -GitHub `_. - -You can also follow development discussion on the `python-dev mailing list -`_. - - -Proposals for enhancement -------------------------- - -If you have a proposal to change Python, you may want to send an email to the -comp.lang.python or `python-ideas`_ mailing lists for initial feedback. A -Python Enhancement Proposal (PEP) may be submitted if your idea gains ground. -All current PEPs, as well as guidelines for submitting a new PEP, are listed at -`python.org/dev/peps/ `_. - -.. _python-ideas: https://mail.python.org/mailman/listinfo/python-ideas/ - - -Release Schedule ----------------- - -See :pep:`569` for Python 3.8 release details. - - -Copyright and License Information ---------------------------------- - -Copyright (c) 2001-2019 Python Software Foundation. All rights reserved. - -Copyright (c) 2000 BeOpen.com. All rights reserved. - -Copyright (c) 1995-2001 Corporation for National Research Initiatives. All -rights reserved. - -Copyright (c) 1991-1995 Stichting Mathematisch Centrum. All rights reserved. - -See the file "LICENSE" for information on the history of this software, terms & -conditions for usage, and a DISCLAIMER OF ALL WARRANTIES. - -This Python distribution contains *no* GNU General Public License (GPL) code, -so it may be used in proprietary projects. 
There are interfaces to some GNU -code but these are entirely optional. - -All trademarks referenced herein are property of their respective holders. From webhook-mailer at python.org Wed Jan 2 14:27:03 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Wed, 02 Jan 2019 19:27:03 -0000 Subject: [Python-checkins] closes bpo-35643: Fix a SyntaxWarning: invalid escape sequence in Modules/_sha3/cleanup.py (GH-11411) Message-ID: https://github.com/python/cpython/commit/d466c43e55cd32af84e353f0e9a48b09b7534f61 commit: d466c43e55cd32af84e353f0e9a48b09b7534f61 branch: master author: Micka?l Schoentgen committer: Benjamin Peterson date: 2019-01-02T11:26:57-08:00 summary: closes bpo-35643: Fix a SyntaxWarning: invalid escape sequence in Modules/_sha3/cleanup.py (GH-11411) files: A Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst M Modules/_sha3/cleanup.py diff --git a/Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst b/Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst new file mode 100644 index 000000000000..0b47bb61fc05 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst @@ -0,0 +1,2 @@ +Fixed a SyntaxWarning: invalid escape sequence in Modules/_sha3/cleanup.py. +Patch by Micka?l Schoentgen. diff --git a/Modules/_sha3/cleanup.py b/Modules/_sha3/cleanup.py index 17c56b34b261..4f53681b49e6 100755 --- a/Modules/_sha3/cleanup.py +++ b/Modules/_sha3/cleanup.py @@ -8,7 +8,7 @@ import re CPP1 = re.compile("^//(.*)") -CPP2 = re.compile("\ //(.*)") +CPP2 = re.compile(r"\ //(.*)") STATICS = ("void ", "int ", "HashReturn ", "const UINT64 ", "UINT16 ", " int prefix##") From webhook-mailer at python.org Wed Jan 2 14:59:01 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 02 Jan 2019 19:59:01 -0000 Subject: [Python-checkins] closes bpo-35643: Fix a SyntaxWarning: invalid escape sequence in Modules/_sha3/cleanup.py (GH-11411) Message-ID: https://github.com/python/cpython/commit/6d04bc9a2eb02efdb49f14f2c9664fe687f9a170 commit: 6d04bc9a2eb02efdb49f14f2c9664fe687f9a170 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-02T11:58:58-08:00 summary: closes bpo-35643: Fix a SyntaxWarning: invalid escape sequence in Modules/_sha3/cleanup.py (GH-11411) (cherry picked from commit d466c43e55cd32af84e353f0e9a48b09b7534f61) Co-authored-by: Micka?l Schoentgen files: A Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst M Modules/_sha3/cleanup.py diff --git a/Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst b/Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst new file mode 100644 index 000000000000..0b47bb61fc05 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst @@ -0,0 +1,2 @@ +Fixed a SyntaxWarning: invalid escape sequence in Modules/_sha3/cleanup.py. +Patch by Micka?l Schoentgen. 
diff --git a/Modules/_sha3/cleanup.py b/Modules/_sha3/cleanup.py index 17c56b34b261..4f53681b49e6 100755 --- a/Modules/_sha3/cleanup.py +++ b/Modules/_sha3/cleanup.py @@ -8,7 +8,7 @@ import re CPP1 = re.compile("^//(.*)") -CPP2 = re.compile("\ //(.*)") +CPP2 = re.compile(r"\ //(.*)") STATICS = ("void ", "int ", "HashReturn ", "const UINT64 ", "UINT16 ", " int prefix##") From webhook-mailer at python.org Wed Jan 2 16:05:28 2019 From: webhook-mailer at python.org (Tal Einat) Date: Wed, 02 Jan 2019 21:05:28 -0000 Subject: [Python-checkins] bpo-35525: Correct the argument name for NNTP.starttls() (GH-11310) Message-ID: https://github.com/python/cpython/commit/e9a044ec16989bd4b39763c0588c17200a925350 commit: e9a044ec16989bd4b39763c0588c17200a925350 branch: master author: Harmandeep Singh committer: Tal Einat date: 2019-01-02T23:05:19+02:00 summary: bpo-35525: Correct the argument name for NNTP.starttls() (GH-11310) files: M Doc/library/nntplib.rst diff --git a/Doc/library/nntplib.rst b/Doc/library/nntplib.rst index d8ef8a692a95..56188c7ef538 100644 --- a/Doc/library/nntplib.rst +++ b/Doc/library/nntplib.rst @@ -232,10 +232,10 @@ tuples or objects that the method normally returns will be empty. .. versionadded:: 3.2 -.. method:: NNTP.starttls(ssl_context=None) +.. method:: NNTP.starttls(context=None) Send a ``STARTTLS`` command. This will enable encryption on the NNTP - connection. The *ssl_context* argument is optional and should be a + connection. The *context* argument is optional and should be a :class:`ssl.SSLContext` object. Please read :ref:`ssl-security` for best practices. From webhook-mailer at python.org Wed Jan 2 16:11:03 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 02 Jan 2019 21:11:03 -0000 Subject: [Python-checkins] bpo-35525: Correct the argument name for NNTP.starttls() (GH-11310) Message-ID: https://github.com/python/cpython/commit/d7cb2034bbaa26170cdc66eb54626e3ce1b8678a commit: d7cb2034bbaa26170cdc66eb54626e3ce1b8678a branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-02T13:11:00-08:00 summary: bpo-35525: Correct the argument name for NNTP.starttls() (GH-11310) (cherry picked from commit e9a044ec16989bd4b39763c0588c17200a925350) Co-authored-by: Harmandeep Singh files: M Doc/library/nntplib.rst diff --git a/Doc/library/nntplib.rst b/Doc/library/nntplib.rst index d8ef8a692a95..56188c7ef538 100644 --- a/Doc/library/nntplib.rst +++ b/Doc/library/nntplib.rst @@ -232,10 +232,10 @@ tuples or objects that the method normally returns will be empty. .. versionadded:: 3.2 -.. method:: NNTP.starttls(ssl_context=None) +.. method:: NNTP.starttls(context=None) Send a ``STARTTLS`` command. This will enable encryption on the NNTP - connection. The *ssl_context* argument is optional and should be a + connection. The *context* argument is optional and should be a :class:`ssl.SSLContext` object. Please read :ref:`ssl-security` for best practices. 
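The doc fix above only renames the documented parameter; the method itself already accepts ``context``. A minimal sketch of calling it with the corrected keyword, assuming a reachable news server that supports STARTTLS (the host name below is illustrative, not taken from the patch)::

    import ssl
    import nntplib

    context = ssl.create_default_context()
    server = nntplib.NNTP('news.example.com')
    try:
        # Keyword matches the real signature: 'context', not 'ssl_context'.
        server.starttls(context=context)
        print(server.getwelcome())
    finally:
        server.quit()
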
From webhook-mailer at python.org Wed Jan 2 22:04:11 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Thu, 03 Jan 2019 03:04:11 -0000 Subject: [Python-checkins] bpo-33987: IDLE - use ttk Frame for ttk widgets (GH-11395) Message-ID: https://github.com/python/cpython/commit/aff0adabf3ace62073076f4ce875ff568f2d3180 commit: aff0adabf3ace62073076f4ce875ff568f2d3180 branch: master author: Terry Jan Reedy committer: GitHub date: 2019-01-02T22:04:06-05:00 summary: bpo-33987: IDLE - use ttk Frame for ttk widgets (GH-11395) files: A Misc/NEWS.d/next/IDLE/2018-12-31-17-04-18.bpo-33987.fD92up.rst M Lib/idlelib/autocomplete_w.py M Lib/idlelib/config_key.py M Lib/idlelib/configdialog.py M Lib/idlelib/debugger.py M Lib/idlelib/grep.py M Lib/idlelib/idle_test/test_searchbase.py M Lib/idlelib/query.py M Lib/idlelib/replace.py M Lib/idlelib/scrolledlist.py M Lib/idlelib/search.py M Lib/idlelib/searchbase.py M Lib/idlelib/tree.py diff --git a/Lib/idlelib/autocomplete_w.py b/Lib/idlelib/autocomplete_w.py index 9e0d336523d4..7994bc0db170 100644 --- a/Lib/idlelib/autocomplete_w.py +++ b/Lib/idlelib/autocomplete_w.py @@ -4,7 +4,7 @@ import platform from tkinter import * -from tkinter.ttk import Scrollbar +from tkinter.ttk import Frame, Scrollbar from idlelib.autocomplete import COMPLETE_FILES, COMPLETE_ATTRIBUTES from idlelib.multicall import MC_SHIFT diff --git a/Lib/idlelib/config_key.py b/Lib/idlelib/config_key.py index 21e84a94c9ff..4478323fcc2c 100644 --- a/Lib/idlelib/config_key.py +++ b/Lib/idlelib/config_key.py @@ -2,7 +2,7 @@ Dialog for building Tkinter accelerator key bindings """ from tkinter import Toplevel, Listbox, Text, StringVar, TclError -from tkinter.ttk import Button, Checkbutton, Entry, Frame, Label, Scrollbar +from tkinter.ttk import Frame, Button, Checkbutton, Entry, Label, Scrollbar from tkinter import messagebox import string import sys diff --git a/Lib/idlelib/configdialog.py b/Lib/idlelib/configdialog.py index 229dc8987433..5fdaf82de4de 100644 --- a/Lib/idlelib/configdialog.py +++ b/Lib/idlelib/configdialog.py @@ -14,7 +14,7 @@ TOP, BOTTOM, RIGHT, LEFT, SOLID, GROOVE, NONE, BOTH, X, Y, W, E, EW, NS, NSEW, NW, HORIZONTAL, VERTICAL, ANCHOR, ACTIVE, END) -from tkinter.ttk import (Button, Checkbutton, Entry, Frame, Label, LabelFrame, +from tkinter.ttk import (Frame, LabelFrame, Button, Checkbutton, Entry, Label, OptionMenu, Notebook, Radiobutton, Scrollbar, Style) import tkinter.colorchooser as tkColorChooser import tkinter.font as tkFont diff --git a/Lib/idlelib/debugger.py b/Lib/idlelib/debugger.py index 09f912c9af33..ccd03e46e161 100644 --- a/Lib/idlelib/debugger.py +++ b/Lib/idlelib/debugger.py @@ -2,7 +2,7 @@ import os from tkinter import * -from tkinter.ttk import Scrollbar +from tkinter.ttk import Frame, Scrollbar from idlelib import macosx from idlelib.scrolledlist import ScrolledList diff --git a/Lib/idlelib/grep.py b/Lib/idlelib/grep.py index 8cc293c380de..873233ec1543 100644 --- a/Lib/idlelib/grep.py +++ b/Lib/idlelib/grep.py @@ -8,7 +8,7 @@ import sys from tkinter import StringVar, BooleanVar -from tkinter.ttk import Checkbutton +from tkinter.ttk import Checkbutton # Frame imported in ...Base from idlelib.searchbase import SearchDialogBase from idlelib import searchengine @@ -173,15 +173,18 @@ def findfiles(self, dir, base, rec): def _grep_dialog(parent): # htest # from tkinter import Toplevel, Text, SEL, END - from tkinter.ttk import Button + from tkinter.ttk import Frame, Button from idlelib.pyshell import PyShellFileList + top = Toplevel(parent) top.title("Test 
GrepDialog") x, y = map(int, parent.geometry().split('+')[1:]) top.geometry(f"+{x}+{y + 175}") flist = PyShellFileList(top) - text = Text(top, height=5) + frame = Frame(top) + frame.pack() + text = Text(frame, height=5) text.pack() def show_grep_dialog(): @@ -189,7 +192,7 @@ def show_grep_dialog(): grep(text, flist=flist) text.tag_remove(SEL, "1.0", END) - button = Button(top, text="Show GrepDialog", command=show_grep_dialog) + button = Button(frame, text="Show GrepDialog", command=show_grep_dialog) button.pack() if __name__ == "__main__": diff --git a/Lib/idlelib/idle_test/test_searchbase.py b/Lib/idlelib/idle_test/test_searchbase.py index 46c3ad111d17..09a7fff51de1 100644 --- a/Lib/idlelib/idle_test/test_searchbase.py +++ b/Lib/idlelib/idle_test/test_searchbase.py @@ -4,7 +4,8 @@ import unittest from test.support import requires -from tkinter import Tk, Frame ##, BooleanVar, StringVar +from tkinter import Tk +from tkinter.ttk import Frame from idlelib import searchengine as se from idlelib import searchbase as sdb from idlelib.idle_test.mock_idle import Func @@ -97,11 +98,12 @@ def test_make_frame(self): self.dialog.top = self.root frame, label = self.dialog.make_frame() self.assertEqual(label, '') - self.assertIsInstance(frame, Frame) + self.assertEqual(str(type(frame)), "") + # self.assertIsInstance(frame, Frame) fails when test is run by + # test_idle not run from IDLE editor. See issue 33987 PR. frame, label = self.dialog.make_frame('testlabel') self.assertEqual(label['text'], 'testlabel') - self.assertIsInstance(frame, Frame) def btn_test_setup(self, meth): self.dialog.top = self.root diff --git a/Lib/idlelib/query.py b/Lib/idlelib/query.py index c2628cceb739..f0b72553db87 100644 --- a/Lib/idlelib/query.py +++ b/Lib/idlelib/query.py @@ -1,6 +1,5 @@ """ Dialogs that query users and verify the answer before accepting. -Use ttk widgets, limiting use to tcl/tk 8.5+, as in IDLE 3.6+. Query is the generic base class for a popup dialog. The user must either enter a valid answer or close the dialog. 
diff --git a/Lib/idlelib/replace.py b/Lib/idlelib/replace.py index 83cf98756bdf..4a834eb7901e 100644 --- a/Lib/idlelib/replace.py +++ b/Lib/idlelib/replace.py @@ -205,12 +205,12 @@ def close(self, event=None): def _replace_dialog(parent): # htest # from tkinter import Toplevel, Text, END, SEL - from tkinter.ttk import Button + from tkinter.ttk import Frame, Button - box = Toplevel(parent) - box.title("Test ReplaceDialog") + top = Toplevel(parent) + top.title("Test ReplaceDialog") x, y = map(int, parent.geometry().split('+')[1:]) - box.geometry("+%d+%d" % (x, y + 175)) + top.geometry("+%d+%d" % (x, y + 175)) # mock undo delegator methods def undo_block_start(): @@ -219,7 +219,9 @@ def undo_block_start(): def undo_block_stop(): pass - text = Text(box, inactiveselectbackground='gray') + frame = Frame(top) + frame.pack() + text = Text(frame, inactiveselectbackground='gray') text.undo_block_start = undo_block_start text.undo_block_stop = undo_block_stop text.pack() @@ -231,7 +233,7 @@ def show_replace(): replace(text) text.tag_remove(SEL, "1.0", END) - button = Button(box, text="Replace", command=show_replace) + button = Button(frame, text="Replace", command=show_replace) button.pack() if __name__ == '__main__': diff --git a/Lib/idlelib/scrolledlist.py b/Lib/idlelib/scrolledlist.py index 10229b636292..71fd18ab19ec 100644 --- a/Lib/idlelib/scrolledlist.py +++ b/Lib/idlelib/scrolledlist.py @@ -1,5 +1,5 @@ from tkinter import * -from tkinter.ttk import Scrollbar +from tkinter.ttk import Frame, Scrollbar from idlelib import macosx diff --git a/Lib/idlelib/search.py b/Lib/idlelib/search.py index 6223661c8e08..6e5a0c7973c7 100644 --- a/Lib/idlelib/search.py +++ b/Lib/idlelib/search.py @@ -75,13 +75,16 @@ def find_selection(self, text): def _search_dialog(parent): # htest # "Display search test box." 
from tkinter import Toplevel, Text - from tkinter.ttk import Button + from tkinter.ttk import Frame, Button - box = Toplevel(parent) - box.title("Test SearchDialog") + top = Toplevel(parent) + top.title("Test SearchDialog") x, y = map(int, parent.geometry().split('+')[1:]) - box.geometry("+%d+%d" % (x, y + 175)) - text = Text(box, inactiveselectbackground='gray') + top.geometry("+%d+%d" % (x, y + 175)) + + frame = Frame(top) + frame.pack() + text = Text(frame, inactiveselectbackground='gray') text.pack() text.insert("insert","This is a sample string.\n"*5) @@ -90,7 +93,7 @@ def show_find(): _setup(text).open(text) text.tag_remove('sel', '1.0', 'end') - button = Button(box, text="Search (selection ignored)", command=show_find) + button = Button(frame, text="Search (selection ignored)", command=show_find) button.pack() if __name__ == '__main__': diff --git a/Lib/idlelib/searchbase.py b/Lib/idlelib/searchbase.py index 348db660e3e9..f0e3d6f14ba4 100644 --- a/Lib/idlelib/searchbase.py +++ b/Lib/idlelib/searchbase.py @@ -1,7 +1,7 @@ '''Define SearchDialogBase used by Search, Replace, and Grep dialogs.''' -from tkinter import Toplevel, Frame -from tkinter.ttk import Entry, Label, Button, Checkbutton, Radiobutton +from tkinter import Toplevel +from tkinter.ttk import Frame, Entry, Label, Button, Checkbutton, Radiobutton class SearchDialogBase: diff --git a/Lib/idlelib/tree.py b/Lib/idlelib/tree.py index 05f864657fb8..21426cbb33e0 100644 --- a/Lib/idlelib/tree.py +++ b/Lib/idlelib/tree.py @@ -17,7 +17,7 @@ import os from tkinter import * -from tkinter.ttk import Scrollbar +from tkinter.ttk import Frame, Scrollbar from idlelib.config import idleConf from idlelib import zoomheight diff --git a/Misc/NEWS.d/next/IDLE/2018-12-31-17-04-18.bpo-33987.fD92up.rst b/Misc/NEWS.d/next/IDLE/2018-12-31-17-04-18.bpo-33987.fD92up.rst new file mode 100644 index 000000000000..289a65cf532c --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2018-12-31-17-04-18.bpo-33987.fD92up.rst @@ -0,0 +1 @@ +Use ttk Frame for ttk widgets. 
From webhook-mailer at python.org Wed Jan 2 22:22:15 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 03 Jan 2019 03:22:15 -0000 Subject: [Python-checkins] bpo-33987: IDLE - use ttk Frame for ttk widgets (GH-11395) Message-ID: https://github.com/python/cpython/commit/b364caa39999658e843151602356e527851d6c68 commit: b364caa39999658e843151602356e527851d6c68 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-02T19:22:10-08:00 summary: bpo-33987: IDLE - use ttk Frame for ttk widgets (GH-11395) (cherry picked from commit aff0adabf3ace62073076f4ce875ff568f2d3180) Co-authored-by: Terry Jan Reedy files: A Misc/NEWS.d/next/IDLE/2018-12-31-17-04-18.bpo-33987.fD92up.rst M Lib/idlelib/autocomplete_w.py M Lib/idlelib/config_key.py M Lib/idlelib/configdialog.py M Lib/idlelib/debugger.py M Lib/idlelib/grep.py M Lib/idlelib/idle_test/test_searchbase.py M Lib/idlelib/query.py M Lib/idlelib/replace.py M Lib/idlelib/scrolledlist.py M Lib/idlelib/search.py M Lib/idlelib/searchbase.py M Lib/idlelib/tree.py diff --git a/Lib/idlelib/autocomplete_w.py b/Lib/idlelib/autocomplete_w.py index 9e0d336523d4..7994bc0db170 100644 --- a/Lib/idlelib/autocomplete_w.py +++ b/Lib/idlelib/autocomplete_w.py @@ -4,7 +4,7 @@ import platform from tkinter import * -from tkinter.ttk import Scrollbar +from tkinter.ttk import Frame, Scrollbar from idlelib.autocomplete import COMPLETE_FILES, COMPLETE_ATTRIBUTES from idlelib.multicall import MC_SHIFT diff --git a/Lib/idlelib/config_key.py b/Lib/idlelib/config_key.py index 21e84a94c9ff..4478323fcc2c 100644 --- a/Lib/idlelib/config_key.py +++ b/Lib/idlelib/config_key.py @@ -2,7 +2,7 @@ Dialog for building Tkinter accelerator key bindings """ from tkinter import Toplevel, Listbox, Text, StringVar, TclError -from tkinter.ttk import Button, Checkbutton, Entry, Frame, Label, Scrollbar +from tkinter.ttk import Frame, Button, Checkbutton, Entry, Label, Scrollbar from tkinter import messagebox import string import sys diff --git a/Lib/idlelib/configdialog.py b/Lib/idlelib/configdialog.py index 229dc8987433..5fdaf82de4de 100644 --- a/Lib/idlelib/configdialog.py +++ b/Lib/idlelib/configdialog.py @@ -14,7 +14,7 @@ TOP, BOTTOM, RIGHT, LEFT, SOLID, GROOVE, NONE, BOTH, X, Y, W, E, EW, NS, NSEW, NW, HORIZONTAL, VERTICAL, ANCHOR, ACTIVE, END) -from tkinter.ttk import (Button, Checkbutton, Entry, Frame, Label, LabelFrame, +from tkinter.ttk import (Frame, LabelFrame, Button, Checkbutton, Entry, Label, OptionMenu, Notebook, Radiobutton, Scrollbar, Style) import tkinter.colorchooser as tkColorChooser import tkinter.font as tkFont diff --git a/Lib/idlelib/debugger.py b/Lib/idlelib/debugger.py index 09f912c9af33..ccd03e46e161 100644 --- a/Lib/idlelib/debugger.py +++ b/Lib/idlelib/debugger.py @@ -2,7 +2,7 @@ import os from tkinter import * -from tkinter.ttk import Scrollbar +from tkinter.ttk import Frame, Scrollbar from idlelib import macosx from idlelib.scrolledlist import ScrolledList diff --git a/Lib/idlelib/grep.py b/Lib/idlelib/grep.py index 8cc293c380de..873233ec1543 100644 --- a/Lib/idlelib/grep.py +++ b/Lib/idlelib/grep.py @@ -8,7 +8,7 @@ import sys from tkinter import StringVar, BooleanVar -from tkinter.ttk import Checkbutton +from tkinter.ttk import Checkbutton # Frame imported in ...Base from idlelib.searchbase import SearchDialogBase from idlelib import searchengine @@ -173,15 +173,18 @@ def findfiles(self, dir, base, rec): def _grep_dialog(parent): # htest # from tkinter import Toplevel, Text, SEL, 
END - from tkinter.ttk import Button + from tkinter.ttk import Frame, Button from idlelib.pyshell import PyShellFileList + top = Toplevel(parent) top.title("Test GrepDialog") x, y = map(int, parent.geometry().split('+')[1:]) top.geometry(f"+{x}+{y + 175}") flist = PyShellFileList(top) - text = Text(top, height=5) + frame = Frame(top) + frame.pack() + text = Text(frame, height=5) text.pack() def show_grep_dialog(): @@ -189,7 +192,7 @@ def show_grep_dialog(): grep(text, flist=flist) text.tag_remove(SEL, "1.0", END) - button = Button(top, text="Show GrepDialog", command=show_grep_dialog) + button = Button(frame, text="Show GrepDialog", command=show_grep_dialog) button.pack() if __name__ == "__main__": diff --git a/Lib/idlelib/idle_test/test_searchbase.py b/Lib/idlelib/idle_test/test_searchbase.py index 46c3ad111d17..09a7fff51de1 100644 --- a/Lib/idlelib/idle_test/test_searchbase.py +++ b/Lib/idlelib/idle_test/test_searchbase.py @@ -4,7 +4,8 @@ import unittest from test.support import requires -from tkinter import Tk, Frame ##, BooleanVar, StringVar +from tkinter import Tk +from tkinter.ttk import Frame from idlelib import searchengine as se from idlelib import searchbase as sdb from idlelib.idle_test.mock_idle import Func @@ -97,11 +98,12 @@ def test_make_frame(self): self.dialog.top = self.root frame, label = self.dialog.make_frame() self.assertEqual(label, '') - self.assertIsInstance(frame, Frame) + self.assertEqual(str(type(frame)), "") + # self.assertIsInstance(frame, Frame) fails when test is run by + # test_idle not run from IDLE editor. See issue 33987 PR. frame, label = self.dialog.make_frame('testlabel') self.assertEqual(label['text'], 'testlabel') - self.assertIsInstance(frame, Frame) def btn_test_setup(self, meth): self.dialog.top = self.root diff --git a/Lib/idlelib/query.py b/Lib/idlelib/query.py index c2628cceb739..f0b72553db87 100644 --- a/Lib/idlelib/query.py +++ b/Lib/idlelib/query.py @@ -1,6 +1,5 @@ """ Dialogs that query users and verify the answer before accepting. -Use ttk widgets, limiting use to tcl/tk 8.5+, as in IDLE 3.6+. Query is the generic base class for a popup dialog. The user must either enter a valid answer or close the dialog. 
diff --git a/Lib/idlelib/replace.py b/Lib/idlelib/replace.py index 83cf98756bdf..4a834eb7901e 100644 --- a/Lib/idlelib/replace.py +++ b/Lib/idlelib/replace.py @@ -205,12 +205,12 @@ def close(self, event=None): def _replace_dialog(parent): # htest # from tkinter import Toplevel, Text, END, SEL - from tkinter.ttk import Button + from tkinter.ttk import Frame, Button - box = Toplevel(parent) - box.title("Test ReplaceDialog") + top = Toplevel(parent) + top.title("Test ReplaceDialog") x, y = map(int, parent.geometry().split('+')[1:]) - box.geometry("+%d+%d" % (x, y + 175)) + top.geometry("+%d+%d" % (x, y + 175)) # mock undo delegator methods def undo_block_start(): @@ -219,7 +219,9 @@ def undo_block_start(): def undo_block_stop(): pass - text = Text(box, inactiveselectbackground='gray') + frame = Frame(top) + frame.pack() + text = Text(frame, inactiveselectbackground='gray') text.undo_block_start = undo_block_start text.undo_block_stop = undo_block_stop text.pack() @@ -231,7 +233,7 @@ def show_replace(): replace(text) text.tag_remove(SEL, "1.0", END) - button = Button(box, text="Replace", command=show_replace) + button = Button(frame, text="Replace", command=show_replace) button.pack() if __name__ == '__main__': diff --git a/Lib/idlelib/scrolledlist.py b/Lib/idlelib/scrolledlist.py index 10229b636292..71fd18ab19ec 100644 --- a/Lib/idlelib/scrolledlist.py +++ b/Lib/idlelib/scrolledlist.py @@ -1,5 +1,5 @@ from tkinter import * -from tkinter.ttk import Scrollbar +from tkinter.ttk import Frame, Scrollbar from idlelib import macosx diff --git a/Lib/idlelib/search.py b/Lib/idlelib/search.py index 6223661c8e08..6e5a0c7973c7 100644 --- a/Lib/idlelib/search.py +++ b/Lib/idlelib/search.py @@ -75,13 +75,16 @@ def find_selection(self, text): def _search_dialog(parent): # htest # "Display search test box." 
from tkinter import Toplevel, Text - from tkinter.ttk import Button + from tkinter.ttk import Frame, Button - box = Toplevel(parent) - box.title("Test SearchDialog") + top = Toplevel(parent) + top.title("Test SearchDialog") x, y = map(int, parent.geometry().split('+')[1:]) - box.geometry("+%d+%d" % (x, y + 175)) - text = Text(box, inactiveselectbackground='gray') + top.geometry("+%d+%d" % (x, y + 175)) + + frame = Frame(top) + frame.pack() + text = Text(frame, inactiveselectbackground='gray') text.pack() text.insert("insert","This is a sample string.\n"*5) @@ -90,7 +93,7 @@ def show_find(): _setup(text).open(text) text.tag_remove('sel', '1.0', 'end') - button = Button(box, text="Search (selection ignored)", command=show_find) + button = Button(frame, text="Search (selection ignored)", command=show_find) button.pack() if __name__ == '__main__': diff --git a/Lib/idlelib/searchbase.py b/Lib/idlelib/searchbase.py index 348db660e3e9..f0e3d6f14ba4 100644 --- a/Lib/idlelib/searchbase.py +++ b/Lib/idlelib/searchbase.py @@ -1,7 +1,7 @@ '''Define SearchDialogBase used by Search, Replace, and Grep dialogs.''' -from tkinter import Toplevel, Frame -from tkinter.ttk import Entry, Label, Button, Checkbutton, Radiobutton +from tkinter import Toplevel +from tkinter.ttk import Frame, Entry, Label, Button, Checkbutton, Radiobutton class SearchDialogBase: diff --git a/Lib/idlelib/tree.py b/Lib/idlelib/tree.py index 05f864657fb8..21426cbb33e0 100644 --- a/Lib/idlelib/tree.py +++ b/Lib/idlelib/tree.py @@ -17,7 +17,7 @@ import os from tkinter import * -from tkinter.ttk import Scrollbar +from tkinter.ttk import Frame, Scrollbar from idlelib.config import idleConf from idlelib import zoomheight diff --git a/Misc/NEWS.d/next/IDLE/2018-12-31-17-04-18.bpo-33987.fD92up.rst b/Misc/NEWS.d/next/IDLE/2018-12-31-17-04-18.bpo-33987.fD92up.rst new file mode 100644 index 000000000000..289a65cf532c --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2018-12-31-17-04-18.bpo-33987.fD92up.rst @@ -0,0 +1 @@ +Use ttk Frame for ttk widgets. From webhook-mailer at python.org Thu Jan 3 02:48:10 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Thu, 03 Jan 2019 07:48:10 -0000 Subject: [Python-checkins] bpo-35641: IDLE - format calltip properly when no docstring (GH-11415) Message-ID: https://github.com/python/cpython/commit/ab54b9a130c88f708077c2ef6c4963b632c132b3 commit: ab54b9a130c88f708077c2ef6c4963b632c132b3 branch: master author: Emmanuel Arias committer: Terry Jan Reedy date: 2019-01-03T02:47:58-05:00 summary: bpo-35641: IDLE - format calltip properly when no docstring (GH-11415) files: A Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst M Lib/idlelib/calltip.py M Lib/idlelib/idle_test/test_calltip.py M Misc/ACKS diff --git a/Lib/idlelib/calltip.py b/Lib/idlelib/calltip.py index 758569a45fdf..2a9a131ed96b 100644 --- a/Lib/idlelib/calltip.py +++ b/Lib/idlelib/calltip.py @@ -167,7 +167,7 @@ def get_argspec(ob): if len(line) > _MAX_COLS: line = line[: _MAX_COLS - 3] + '...' 
lines.append(line) - argspec = '\n'.join(lines) + argspec = '\n'.join(lines) if not argspec: argspec = _default_callable_argspec return argspec diff --git a/Lib/idlelib/idle_test/test_calltip.py b/Lib/idlelib/idle_test/test_calltip.py index 0698d4f8b99e..833351bd7996 100644 --- a/Lib/idlelib/idle_test/test_calltip.py +++ b/Lib/idlelib/idle_test/test_calltip.py @@ -99,6 +99,35 @@ def test_signature_wrap(self): drop_whitespace=True, break_on_hyphens=True, tabsize=8, *, max_lines=None, placeholder=' [...]')''') + def test_properly_formated(self): + def foo(s='a'*100): + pass + + def bar(s='a'*100): + """Hello Guido""" + pass + + def baz(s='a'*100, z='b'*100): + pass + + indent = calltip._INDENT + + str_foo = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\ + "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\ + "aaaaaaaaaa')" + str_bar = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\ + "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\ + "aaaaaaaaaa')\nHello Guido" + str_baz = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\ + "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\ + "aaaaaaaaaa', z='bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb"\ + "bbbbbbbbbbbbbbbbb\n" + indent + "bbbbbbbbbbbbbbbbbbbbbb"\ + "bbbbbbbbbbbbbbbbbbbbbb')" + + self.assertEqual(calltip.get_argspec(foo), str_foo) + self.assertEqual(calltip.get_argspec(bar), str_bar) + self.assertEqual(calltip.get_argspec(baz), str_baz) + def test_docline_truncation(self): def f(): pass f.__doc__ = 'a'*300 diff --git a/Misc/ACKS b/Misc/ACKS index 63df5df4cbef..49b28153d3f7 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -60,6 +60,7 @@ Heidi Annexstad Ramchandra Apte ?ric Araujo Alexandru Ardelean +Emmanuel Arias Alicia Arlen Jeffrey Armstrong Jason Asbahr diff --git a/Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst b/Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst new file mode 100644 index 000000000000..5abba690be48 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst @@ -0,0 +1 @@ +Proper format `calltip` when the function has no docstring. 
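The functional change in ``calltip.py`` above is a one-level dedent: ``argspec = '\n'.join(lines)`` moves out of the docstring-handling block, so the wrapped signature lines are joined back into one string even when the object has no docstring. The surrounding code is not shown in full in the diff, so the following is only a reduced sketch of that shape (the helper name and parameters are illustrative, not IDLE's actual API)::

    import textwrap

    def format_tip(signature, doc=None, width=70, indent=' ' * 4):
        # Wrap a long signature into several indented lines.
        lines = textwrap.wrap(signature, width, subsequent_indent=indent)
        if doc:
            for line in doc.split('\n'):
                lines.append(line.strip())
            # Buggy shape: joining only inside this block meant a callable
            # without a docstring kept the unwrapped one-line signature.
        # Fixed shape: join unconditionally, after any docstring handling.
        return '\n'.join(lines)

    print(format_tip("(s='" + "a" * 100 + "')"))       # wrapped signature only
    print(format_tip("(s='a')", doc="Hello Guido"))    # signature plus docstring
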
From solipsis at pitrou.net Thu Jan 3 04:08:08 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 03 Jan 2019 09:08:08 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=5 Message-ID: <20190103090808.1.4C763C497F2EEC3A@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [7, 0, -7] memory blocks, sum=0 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_spawn leaked [-1, 2, 0] memory blocks, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogr_a4Lk', '--timeout', '7200'] From webhook-mailer at python.org Thu Jan 3 04:44:50 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 03 Jan 2019 09:44:50 -0000 Subject: [Python-checkins] bpo-35641: IDLE - format calltip properly when no docstring (GH-11415) Message-ID: https://github.com/python/cpython/commit/3c83cb7eed4f0e8b9f1cbf39263a2053a2483cb0 commit: 3c83cb7eed4f0e8b9f1cbf39263a2053a2483cb0 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-03T01:44:47-08:00 summary: bpo-35641: IDLE - format calltip properly when no docstring (GH-11415) (cherry picked from commit ab54b9a130c88f708077c2ef6c4963b632c132b3) Co-authored-by: Emmanuel Arias files: A Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst M Lib/idlelib/calltip.py M Lib/idlelib/idle_test/test_calltip.py M Misc/ACKS diff --git a/Lib/idlelib/calltip.py b/Lib/idlelib/calltip.py index 758569a45fdf..2a9a131ed96b 100644 --- a/Lib/idlelib/calltip.py +++ b/Lib/idlelib/calltip.py @@ -167,7 +167,7 @@ def get_argspec(ob): if len(line) > _MAX_COLS: line = line[: _MAX_COLS - 3] + '...' 
lines.append(line) - argspec = '\n'.join(lines) + argspec = '\n'.join(lines) if not argspec: argspec = _default_callable_argspec return argspec diff --git a/Lib/idlelib/idle_test/test_calltip.py b/Lib/idlelib/idle_test/test_calltip.py index 0698d4f8b99e..833351bd7996 100644 --- a/Lib/idlelib/idle_test/test_calltip.py +++ b/Lib/idlelib/idle_test/test_calltip.py @@ -99,6 +99,35 @@ def test_signature_wrap(self): drop_whitespace=True, break_on_hyphens=True, tabsize=8, *, max_lines=None, placeholder=' [...]')''') + def test_properly_formated(self): + def foo(s='a'*100): + pass + + def bar(s='a'*100): + """Hello Guido""" + pass + + def baz(s='a'*100, z='b'*100): + pass + + indent = calltip._INDENT + + str_foo = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\ + "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\ + "aaaaaaaaaa')" + str_bar = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\ + "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\ + "aaaaaaaaaa')\nHello Guido" + str_baz = "(s='aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"\ + "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n" + indent + "aaaaaaaaa"\ + "aaaaaaaaaa', z='bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb"\ + "bbbbbbbbbbbbbbbbb\n" + indent + "bbbbbbbbbbbbbbbbbbbbbb"\ + "bbbbbbbbbbbbbbbbbbbbbb')" + + self.assertEqual(calltip.get_argspec(foo), str_foo) + self.assertEqual(calltip.get_argspec(bar), str_bar) + self.assertEqual(calltip.get_argspec(baz), str_baz) + def test_docline_truncation(self): def f(): pass f.__doc__ = 'a'*300 diff --git a/Misc/ACKS b/Misc/ACKS index f38408491288..027e01307225 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -60,6 +60,7 @@ Heidi Annexstad Ramchandra Apte ?ric Araujo Alexandru Ardelean +Emmanuel Arias Alicia Arlen Jeffrey Armstrong Jason Asbahr diff --git a/Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst b/Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst new file mode 100644 index 000000000000..5abba690be48 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst @@ -0,0 +1 @@ +Proper format `calltip` when the function has no docstring. From webhook-mailer at python.org Thu Jan 3 14:53:59 2019 From: webhook-mailer at python.org (Gregory P. Smith) Date: Thu, 03 Jan 2019 19:53:59 -0000 Subject: [Python-checkins] bpo-31450: Remove documentation mentioning that subprocess's child_traceback is available with the parent process (GH-11422) Message-ID: https://github.com/python/cpython/commit/47a2fced84605a32b79aa3ebc543533ad1a976a1 commit: 47a2fced84605a32b79aa3ebc543533ad1a976a1 branch: master author: Harmandeep Singh committer: Gregory P. Smith date: 2019-01-03T11:53:56-08:00 summary: bpo-31450: Remove documentation mentioning that subprocess's child_traceback is available with the parent process (GH-11422) files: M Doc/library/subprocess.rst diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst index 25f267cf8305..e7844587e908 100644 --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -572,9 +572,7 @@ Exceptions ^^^^^^^^^^ Exceptions raised in the child process, before the new program has started to -execute, will be re-raised in the parent. Additionally, the exception object -will have one extra attribute called :attr:`child_traceback`, which is a string -containing traceback information from the child's point of view. +execute, will be re-raised in the parent. The most common exception raised is :exc:`OSError`. This occurs, for example, when trying to execute a non-existent file. 
Applications should prepare for From webhook-mailer at python.org Thu Jan 3 15:01:48 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 03 Jan 2019 20:01:48 -0000 Subject: [Python-checkins] bpo-31450: Remove documentation mentioning that subprocess's child_traceback is available with the parent process (GH-11422) Message-ID: https://github.com/python/cpython/commit/47c035f3efa77a439967776b5b6ba11d010ce466 commit: 47c035f3efa77a439967776b5b6ba11d010ce466 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-03T12:01:44-08:00 summary: bpo-31450: Remove documentation mentioning that subprocess's child_traceback is available with the parent process (GH-11422) (cherry picked from commit 47a2fced84605a32b79aa3ebc543533ad1a976a1) Co-authored-by: Harmandeep Singh files: M Doc/library/subprocess.rst diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst index 4a10c7a7d9c7..ff91b5d4c758 100644 --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -572,9 +572,7 @@ Exceptions ^^^^^^^^^^ Exceptions raised in the child process, before the new program has started to -execute, will be re-raised in the parent. Additionally, the exception object -will have one extra attribute called :attr:`child_traceback`, which is a string -containing traceback information from the child's point of view. +execute, will be re-raised in the parent. The most common exception raised is :exc:`OSError`. This occurs, for example, when trying to execute a non-existent file. Applications should prepare for From solipsis at pitrou.net Fri Jan 4 04:11:04 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 04 Jan 2019 09:11:04 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=0 Message-ID: <20190104091104.1.C36583C3F7E5480F@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [-7, 8, -7] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [2, 0, 0] memory blocks, sum=2 test_multiprocessing_spawn leaked [-2, 1, 1] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflog1oAa0h', '--timeout', '7200'] From webhook-mailer at python.org Fri Jan 4 09:14:37 2019 From: webhook-mailer at python.org (Ivan Levkivskyi) Date: Fri, 04 Jan 2019 14:14:37 -0000 Subject: [Python-checkins] bpo-35631: Improve typing docs wrt abstract/concrete collection types (GH-11396) Message-ID: https://github.com/python/cpython/commit/31ec52a9afedd77e36a3ddc31c4c45664b8ac410 commit: 31ec52a9afedd77e36a3ddc31c4c45664b8ac410 branch: master author: Ville Skytt? committer: Ivan Levkivskyi date: 2019-01-04T14:14:32Z summary: bpo-35631: Improve typing docs wrt abstract/concrete collection types (GH-11396) https://bugs.python.org/issue35631 files: M Doc/library/typing.rst diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index aa5e4d287ae7..4a03fe7ecf6b 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -50,20 +50,20 @@ A type alias is defined by assigning the type to the alias. In this example, Type aliases are useful for simplifying complex type signatures. 
For example:: - from typing import Dict, Tuple, List + from typing import Dict, Tuple, Sequence ConnectionOptions = Dict[str, str] Address = Tuple[str, int] Server = Tuple[Address, ConnectionOptions] - def broadcast_message(message: str, servers: List[Server]) -> None: + def broadcast_message(message: str, servers: Sequence[Server]) -> None: ... # The static type checker will treat the previous type signature as # being exactly equivalent to this one. def broadcast_message( message: str, - servers: List[Tuple[Tuple[str, int], Dict[str, str]]]) -> None: + servers: Sequence[Tuple[Tuple[str, int], Dict[str, str]]]) -> None: ... Note that ``None`` as a type hint is a special case and is replaced by @@ -568,6 +568,10 @@ The module defines the following classes, functions and decorators: .. class:: Mapping(Sized, Collection[KT], Generic[VT_co]) A generic version of :class:`collections.abc.Mapping`. + This type can be used as follows:: + + def get_position_in_index(word_list: Mapping[str, int], word: str) -> int: + return word_list[word] .. class:: MutableMapping(Mapping[KT, VT]) @@ -601,8 +605,8 @@ The module defines the following classes, functions and decorators: Generic version of :class:`list`. Useful for annotating return types. To annotate arguments it is preferred - to use abstract collection types such as :class:`Mapping`, :class:`Sequence`, - or :class:`AbstractSet`. + to use an abstract collection type such as :class:`Sequence` or + :class:`Iterable`. This type may be used as follows:: @@ -617,6 +621,8 @@ The module defines the following classes, functions and decorators: .. class:: Set(set, MutableSet[T]) A generic version of :class:`builtins.set `. + Useful for annotating return types. To annotate arguments it is preferred + to use an abstract collection type such as :class:`AbstractSet`. .. class:: FrozenSet(frozenset, AbstractSet[T_co]) @@ -678,10 +684,13 @@ The module defines the following classes, functions and decorators: .. class:: Dict(dict, MutableMapping[KT, VT]) A generic version of :class:`dict`. - The usage of this type is as follows:: + Useful for annotating return types. To annotate arguments it is preferred + to use an abstract collection type such as :class:`Mapping`. - def get_position_in_index(word_list: Dict[str, int], word: str) -> int: - return word_list[word] + This type can be used as follows:: + + def count_words(text: str) -> Dict[str, int]: + ... .. class:: DefaultDict(collections.defaultdict, MutableMapping[KT, VT]) From webhook-mailer at python.org Fri Jan 4 09:20:22 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 04 Jan 2019 14:20:22 -0000 Subject: [Python-checkins] bpo-35631: Improve typing docs wrt abstract/concrete collection types (GH-11396) Message-ID: https://github.com/python/cpython/commit/902196d867a34cc154fa9c861c883e69232251c6 commit: 902196d867a34cc154fa9c861c883e69232251c6 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-04T06:20:19-08:00 summary: bpo-35631: Improve typing docs wrt abstract/concrete collection types (GH-11396) https://bugs.python.org/issue35631 (cherry picked from commit 31ec52a9afedd77e36a3ddc31c4c45664b8ac410) Co-authored-by: Ville Skytt? files: M Doc/library/typing.rst diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index 0f37fc60896f..1e6dcccc1ffe 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -50,20 +50,20 @@ A type alias is defined by assigning the type to the alias. 
In this example, Type aliases are useful for simplifying complex type signatures. For example:: - from typing import Dict, Tuple, List + from typing import Dict, Tuple, Sequence ConnectionOptions = Dict[str, str] Address = Tuple[str, int] Server = Tuple[Address, ConnectionOptions] - def broadcast_message(message: str, servers: List[Server]) -> None: + def broadcast_message(message: str, servers: Sequence[Server]) -> None: ... # The static type checker will treat the previous type signature as # being exactly equivalent to this one. def broadcast_message( message: str, - servers: List[Tuple[Tuple[str, int], Dict[str, str]]]) -> None: + servers: Sequence[Tuple[Tuple[str, int], Dict[str, str]]]) -> None: ... Note that ``None`` as a type hint is a special case and is replaced by @@ -568,6 +568,10 @@ The module defines the following classes, functions and decorators: .. class:: Mapping(Sized, Collection[KT], Generic[VT_co]) A generic version of :class:`collections.abc.Mapping`. + This type can be used as follows:: + + def get_position_in_index(word_list: Mapping[str, int], word: str) -> int: + return word_list[word] .. class:: MutableMapping(Mapping[KT, VT]) @@ -601,8 +605,8 @@ The module defines the following classes, functions and decorators: Generic version of :class:`list`. Useful for annotating return types. To annotate arguments it is preferred - to use abstract collection types such as :class:`Mapping`, :class:`Sequence`, - or :class:`AbstractSet`. + to use an abstract collection type such as :class:`Sequence` or + :class:`Iterable`. This type may be used as follows:: @@ -617,6 +621,8 @@ The module defines the following classes, functions and decorators: .. class:: Set(set, MutableSet[T]) A generic version of :class:`builtins.set `. + Useful for annotating return types. To annotate arguments it is preferred + to use an abstract collection type such as :class:`AbstractSet`. .. class:: FrozenSet(frozenset, AbstractSet[T_co]) @@ -678,10 +684,13 @@ The module defines the following classes, functions and decorators: .. class:: Dict(dict, MutableMapping[KT, VT]) A generic version of :class:`dict`. - The usage of this type is as follows:: + Useful for annotating return types. To annotate arguments it is preferred + to use an abstract collection type such as :class:`Mapping`. - def get_position_in_index(word_list: Dict[str, int], word: str) -> int: - return word_list[word] + This type can be used as follows:: + + def count_words(text: str) -> Dict[str, int]: + ... .. 
class:: DefaultDict(collections.defaultdict, MutableMapping[KT, VT]) From solipsis at pitrou.net Sat Jan 5 04:13:47 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 05 Jan 2019 09:13:47 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=6 Message-ID: <20190105091347.1.5D3EABD7A6DD76FD@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [7, -7, 1] memory blocks, sum=1 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_forkserver leaked [0, 2, -1] memory blocks, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflog6Hp7Yj', '--timeout', '7200'] From webhook-mailer at python.org Sat Jan 5 15:45:04 2019 From: webhook-mailer at python.org (Andrew Svetlov) Date: Sat, 05 Jan 2019 20:45:04 -0000 Subject: [Python-checkins] bpo-23057: Use 'raise' to emulate ctrl-c in proactor tests (#11274) Message-ID: https://github.com/python/cpython/commit/67ba547cf001d6b975cf6900aaf2bd5508dc6a87 commit: 67ba547cf001d6b975cf6900aaf2bd5508dc6a87 branch: master author: Vladimir Matveev committer: Andrew Svetlov date: 2019-01-05T22:44:59+02:00 summary: bpo-23057: Use 'raise' to emulate ctrl-c in proactor tests (#11274) files: D Lib/test/test_asyncio/test_ctrl_c_in_proactor_loop_helper.py M Lib/asyncio/windows_events.py M Lib/test/test_asyncio/test_windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index 33ffaf971770..0f3e9f425f68 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -315,7 +315,12 @@ def run_forever(self): super().run_forever() finally: if self._self_reading_future is not None: + ov = self._self_reading_future._ov self._self_reading_future.cancel() + # self_reading_future was just cancelled so it will never be signalled + # Unregister it otherwise IocpProactor.close will wait for it forever + if ov is not None: + self._proactor._unregister(ov) self._self_reading_future = None async def create_pipe_connection(self, protocol_factory, address): diff --git a/Lib/test/test_asyncio/test_ctrl_c_in_proactor_loop_helper.py b/Lib/test/test_asyncio/test_ctrl_c_in_proactor_loop_helper.py deleted file mode 100644 index 9aeb58aae212..000000000000 --- a/Lib/test/test_asyncio/test_ctrl_c_in_proactor_loop_helper.py +++ /dev/null @@ -1,63 +0,0 @@ -import sys - - -def do_in_child_process(): - import asyncio - - asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy()) - l = asyncio.get_event_loop() - - def step(n): - try: - print(n) - sys.stdout.flush() - l.run_forever() - sys.exit(100) - except KeyboardInterrupt: - # ok - pass - except: - # error - use default exit code - sys.exit(200) - - step(1) - step(2) - sys.exit(255) - - -def do_in_main_process(): - import os - import signal - import subprocess - import time - from test.support.script_helper import spawn_python - - ok = False - - def step(p, expected): - s = p.stdout.readline() - if s != expected: - raise Exception(f"Unexpected line: got {s}, expected '{expected}'") - # ensure that child process gets to run_forever - time.sleep(0.5) - os.kill(p.pid, signal.CTRL_C_EVENT) - - with spawn_python(__file__, "--child") as p: - try: - # ignore ctrl-c in current process - signal.signal(signal.SIGINT, signal.SIG_IGN) - step(p, b"1\r\n") - step(p, b"2\r\n") - exit_code = p.wait(timeout=5) - ok = exit_code = 255 - except Exception as e: - sys.stderr.write(repr(e)) - p.kill() - sys.exit(255 
if ok else 1) - - -if __name__ == "__main__": - if len(sys.argv) == 1: - do_in_main_process() - else: - do_in_child_process() diff --git a/Lib/test/test_asyncio/test_windows_events.py b/Lib/test/test_asyncio/test_windows_events.py index 05d875ac3c85..a200a8a80ad9 100644 --- a/Lib/test/test_asyncio/test_windows_events.py +++ b/Lib/test/test_asyncio/test_windows_events.py @@ -4,6 +4,7 @@ import sys import subprocess import time +import threading import unittest from unittest import mock @@ -11,6 +12,7 @@ raise unittest.SkipTest('Windows only') import _overlapped +import _testcapi import _winapi import asyncio @@ -38,20 +40,24 @@ def data_received(self, data): class ProactorLoopCtrlC(test_utils.TestCase): + def test_ctrl_c(self): - from .test_ctrl_c_in_proactor_loop_helper import __file__ as f - - # ctrl-c will be sent to all processes that share the same console - # in order to isolate the effect of raising ctrl-c we'll create - # a process with a new console - flags = subprocess.CREATE_NEW_CONSOLE - with spawn_python(f, creationflags=flags) as p: - try: - exit_code = p.wait(timeout=5) - self.assertEqual(exit_code, 255) - except: - p.kill() - raise + + def SIGINT_after_delay(): + time.sleep(1) + _testcapi.raise_signal(signal.SIGINT) + + asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy()) + l = asyncio.get_event_loop() + try: + t = threading.Thread(target=SIGINT_after_delay) + t.start() + l.run_forever() + self.fail("should not fall through 'run_forever'") + except KeyboardInterrupt: + pass + finally: + l.close() class ProactorTests(test_utils.TestCase): From webhook-mailer at python.org Sun Jan 6 03:10:40 2019 From: webhook-mailer at python.org (Tal Einat) Date: Sun, 06 Jan 2019 08:10:40 -0000 Subject: [Python-checkins] remove doc-string declaration no longer used after AC conversion (GH-11444) Message-ID: https://github.com/python/cpython/commit/a5b76167dedf4d15211a216c3ca7b98e3cec33b8 commit: a5b76167dedf4d15211a216c3ca7b98e3cec33b8 branch: master author: Tal Einat committer: GitHub date: 2019-01-06T10:10:34+02:00 summary: remove doc-string declaration no longer used after AC conversion (GH-11444) files: M Python/sysmodule.c diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 4150fbbbd997..10707fd23fc6 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -1594,10 +1594,6 @@ sys__clear_type_cache_impl(PyObject *module) Py_RETURN_NONE; } -PyDoc_STRVAR(sys_clear_type_cache__doc__, -"_clear_type_cache() -> None\n\ -Clear the internal type lookup cache."); - /*[clinic input] sys.is_finalizing From solipsis at pitrou.net Sun Jan 6 04:08:43 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 06 Jan 2019 09:08:43 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=11 Message-ID: <20190106090843.1.FD6F7B4B75A57486@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [0, 0, 7] memory blocks, sum=7 test_functools leaked [0, 3, 1] memory blocks, sum=4 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogIrjEZ0', '--timeout', '7200'] From webhook-mailer at python.org Sun Jan 6 15:31:32 2019 From: webhook-mailer at python.org (Brett Cannon) Date: Sun, 06 Jan 2019 20:31:32 -0000 Subject: [Python-checkins] bpo-35488: Add tests for ** glob matching in pathlib (GH-11171) Message-ID: https://github.com/python/cpython/commit/83da926b89daf80013ea966037c2c0e1e9d25c6b commit: 
83da926b89daf80013ea966037c2c0e1e9d25c6b branch: master author: Anthony Shaw committer: Brett Cannon date: 2019-01-06T12:31:29-08:00 summary: bpo-35488: Add tests for ** glob matching in pathlib (GH-11171) files: A Misc/NEWS.d/next/Tests/2019-01-04-21-34-53.bpo-35488.U7JJzP.rst M Lib/test/test_pathlib.py diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py index d3fd4bd9e6b7..f8325eb93275 100644 --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -35,28 +35,28 @@ def _check_parse_parts(self, arg, expected): def test_parse_parts_common(self): check = self._check_parse_parts sep = self.flavour.sep - # Unanchored parts + # Unanchored parts. check([], ('', '', [])) check(['a'], ('', '', ['a'])) check(['a/'], ('', '', ['a'])) check(['a', 'b'], ('', '', ['a', 'b'])) - # Expansion + # Expansion. check(['a/b'], ('', '', ['a', 'b'])) check(['a/b/'], ('', '', ['a', 'b'])) check(['a', 'b/c', 'd'], ('', '', ['a', 'b', 'c', 'd'])) - # Collapsing and stripping excess slashes + # Collapsing and stripping excess slashes. check(['a', 'b//c', 'd'], ('', '', ['a', 'b', 'c', 'd'])) check(['a', 'b/c/', 'd'], ('', '', ['a', 'b', 'c', 'd'])) - # Eliminating standalone dots + # Eliminating standalone dots. check(['.'], ('', '', [])) check(['.', '.', 'b'], ('', '', ['b'])) check(['a', '.', 'b'], ('', '', ['a', 'b'])) check(['a', '.', '.'], ('', '', ['a'])) - # The first part is anchored + # The first part is anchored. check(['/a/b'], ('', sep, [sep, 'a', 'b'])) check(['/a', 'b'], ('', sep, [sep, 'a', 'b'])) check(['/a/', 'b'], ('', sep, [sep, 'a', 'b'])) - # Ignoring parts before an anchored part + # Ignoring parts before an anchored part. check(['a', '/b', 'c'], ('', sep, [sep, 'b', 'c'])) check(['a', '/b', '/c'], ('', sep, [sep, 'c'])) @@ -71,7 +71,7 @@ def test_parse_parts(self): check(['//a', 'b'], ('', '//', ['//', 'a', 'b'])) check(['///a', 'b'], ('', '/', ['/', 'a', 'b'])) check(['////a', 'b'], ('', '/', ['/', 'a', 'b'])) - # Paths which look like NT paths aren't treated specially + # Paths which look like NT paths aren't treated specially. check(['c:a'], ('', '', ['c:a'])) check(['c:\\a'], ('', '', ['c:\\a'])) check(['\\a'], ('', '', ['\\a'])) @@ -91,7 +91,7 @@ def test_splitroot(self): self.assertEqual(f('//a'), ('', '//', 'a')) self.assertEqual(f('///a'), ('', '/', 'a')) self.assertEqual(f('///a/b'), ('', '/', 'a/b')) - # Paths which look like NT paths aren't treated specially + # Paths which look like NT paths aren't treated specially. self.assertEqual(f('c:/a/b'), ('', '', 'c:/a/b')) self.assertEqual(f('\\/a/b'), ('', '', '\\/a/b')) self.assertEqual(f('\\a\\b'), ('', '', '\\a\\b')) @@ -102,34 +102,34 @@ class NTFlavourTest(_BaseFlavourTest, unittest.TestCase): def test_parse_parts(self): check = self._check_parse_parts - # First part is anchored + # First part is anchored. check(['c:'], ('c:', '', ['c:'])) check(['c:/'], ('c:', '\\', ['c:\\'])) check(['/'], ('', '\\', ['\\'])) check(['c:a'], ('c:', '', ['c:', 'a'])) check(['c:/a'], ('c:', '\\', ['c:\\', 'a'])) check(['/a'], ('', '\\', ['\\', 'a'])) - # UNC paths + # UNC paths. check(['//a/b'], ('\\\\a\\b', '\\', ['\\\\a\\b\\'])) check(['//a/b/'], ('\\\\a\\b', '\\', ['\\\\a\\b\\'])) check(['//a/b/c'], ('\\\\a\\b', '\\', ['\\\\a\\b\\', 'c'])) - # Second part is anchored, so that the first part is ignored + # Second part is anchored, so that the first part is ignored. check(['a', 'Z:b', 'c'], ('Z:', '', ['Z:', 'b', 'c'])) check(['a', 'Z:/b', 'c'], ('Z:', '\\', ['Z:\\', 'b', 'c'])) - # UNC paths + # UNC paths. 
check(['a', '//b/c', 'd'], ('\\\\b\\c', '\\', ['\\\\b\\c\\', 'd'])) - # Collapsing and stripping excess slashes + # Collapsing and stripping excess slashes. check(['a', 'Z://b//c/', 'd/'], ('Z:', '\\', ['Z:\\', 'b', 'c', 'd'])) - # UNC paths + # UNC paths. check(['a', '//b/c//', 'd'], ('\\\\b\\c', '\\', ['\\\\b\\c\\', 'd'])) - # Extended paths + # Extended paths. check(['//?/c:/'], ('\\\\?\\c:', '\\', ['\\\\?\\c:\\'])) check(['//?/c:/a'], ('\\\\?\\c:', '\\', ['\\\\?\\c:\\', 'a'])) check(['//?/c:/a', '/b'], ('\\\\?\\c:', '\\', ['\\\\?\\c:\\', 'b'])) - # Extended UNC paths (format is "\\?\UNC\server\share") + # Extended UNC paths (format is "\\?\UNC\server\share"). check(['//?/UNC/b/c'], ('\\\\?\\UNC\\b\\c', '\\', ['\\\\?\\UNC\\b\\c\\'])) check(['//?/UNC/b/c/d'], ('\\\\?\\UNC\\b\\c', '\\', ['\\\\?\\UNC\\b\\c\\', 'd'])) - # Second part has a root but not drive + # Second part has a root but not drive. check(['a', '/b', 'c'], ('', '\\', ['\\', 'b', 'c'])) check(['Z:/a', '/b', 'c'], ('Z:', '\\', ['Z:\\', 'b', 'c'])) check(['//?/Z:/a', '/b', 'c'], ('\\\\?\\Z:', '\\', ['\\\\?\\Z:\\', 'b', 'c'])) @@ -143,41 +143,41 @@ def test_splitroot(self): self.assertEqual(f('\\a\\b'), ('', '\\', 'a\\b')) self.assertEqual(f('c:a\\b'), ('c:', '', 'a\\b')) self.assertEqual(f('c:\\a\\b'), ('c:', '\\', 'a\\b')) - # Redundant slashes in the root are collapsed + # Redundant slashes in the root are collapsed. self.assertEqual(f('\\\\a'), ('', '\\', 'a')) self.assertEqual(f('\\\\\\a/b'), ('', '\\', 'a/b')) self.assertEqual(f('c:\\\\a'), ('c:', '\\', 'a')) self.assertEqual(f('c:\\\\\\a/b'), ('c:', '\\', 'a/b')) - # Valid UNC paths + # Valid UNC paths. self.assertEqual(f('\\\\a\\b'), ('\\\\a\\b', '\\', '')) self.assertEqual(f('\\\\a\\b\\'), ('\\\\a\\b', '\\', '')) self.assertEqual(f('\\\\a\\b\\c\\d'), ('\\\\a\\b', '\\', 'c\\d')) - # These are non-UNC paths (according to ntpath.py and test_ntpath) + # These are non-UNC paths (according to ntpath.py and test_ntpath). # However, command.com says such paths are invalid, so it's - # difficult to know what the right semantics are + # difficult to know what the right semantics are. self.assertEqual(f('\\\\\\a\\b'), ('', '\\', 'a\\b')) self.assertEqual(f('\\\\a'), ('', '\\', 'a')) # -# Tests for the pure classes +# Tests for the pure classes. # class _BasePurePathTest(object): - # keys are canonical paths, values are list of tuples of arguments - # supposed to produce equal paths + # Keys are canonical paths, values are list of tuples of arguments + # supposed to produce equal paths. equivalences = { 'a/b': [ ('a', 'b'), ('a/', 'b'), ('a', 'b/'), ('a/', 'b/'), ('a/b/',), ('a//b',), ('a//b//',), - # empty components get removed + # Empty components get removed. ('', 'a', 'b'), ('a', '', 'b'), ('a', 'b', ''), ], '/b/c/d': [ ('a', '/b/c', 'd'), ('a', '///b//c', 'd/'), ('/a', '/b/c', 'd'), - # empty components get removed + # Empty components get removed. ('/', 'b', '', 'c/d'), ('/', '', 'b/c/d'), ('', '/b/c/d'), ], } @@ -235,7 +235,7 @@ def test_join_common(self): self.assertEqual(pp, P('/c')) def test_div_common(self): - # Basically the same as joinpath() + # Basically the same as joinpath(). P = self.cls p = P('a/b') pp = p / 'c' @@ -257,18 +257,18 @@ def _check_str(self, expected, args): self.assertEqual(str(p), expected.replace('/', self.sep)) def test_str_common(self): - # Canonicalized paths roundtrip + # Canonicalized paths roundtrip. 
for pathstr in ('a', 'a/b', 'a/b/c', '/', '/a/b', '/a/b/c'): self._check_str(pathstr, (pathstr,)) - # Special case for the empty path + # Special case for the empty path. self._check_str('.', ('',)) - # Other tests for str() are in test_equivalences() + # Other tests for str() are in test_equivalences(). def test_as_posix_common(self): P = self.cls for pathstr in ('a', 'a/b', 'a/b/c', '/', '/a/b', '/a/b/c'): self.assertEqual(P(pathstr).as_posix(), pathstr) - # Other tests for as_posix() are in test_equivalences() + # Other tests for as_posix() are in test_equivalences(). def test_as_bytes_common(self): sep = os.fsencode(self.sep) @@ -287,12 +287,12 @@ def test_repr_common(self): p = self.cls(pathstr) clsname = p.__class__.__name__ r = repr(p) - # The repr() is in the form ClassName("forward-slashes path") + # The repr() is in the form ClassName("forward-slashes path"). self.assertTrue(r.startswith(clsname + '('), r) self.assertTrue(r.endswith(')'), r) inner = r[len(clsname) + 1 : -1] self.assertEqual(eval(inner), p.as_posix()) - # The repr() roundtrips + # The repr() roundtrips. q = eval(r, pathlib.__dict__) self.assertIs(q.__class__, p.__class__) self.assertEqual(q, p) @@ -315,7 +315,7 @@ def test_match_common(self): P = self.cls self.assertRaises(ValueError, P('a').match, '') self.assertRaises(ValueError, P('a').match, '.') - # Simple relative pattern + # Simple relative pattern. self.assertTrue(P('b.py').match('b.py')) self.assertTrue(P('a/b.py').match('b.py')) self.assertTrue(P('/a/b.py').match('b.py')) @@ -323,31 +323,34 @@ def test_match_common(self): self.assertFalse(P('b/py').match('b.py')) self.assertFalse(P('/a.py').match('b.py')) self.assertFalse(P('b.py/c').match('b.py')) - # Wilcard relative pattern + # Wilcard relative pattern. self.assertTrue(P('b.py').match('*.py')) self.assertTrue(P('a/b.py').match('*.py')) self.assertTrue(P('/a/b.py').match('*.py')) self.assertFalse(P('b.pyc').match('*.py')) self.assertFalse(P('b./py').match('*.py')) self.assertFalse(P('b.py/c').match('*.py')) - # Multi-part relative pattern + # Multi-part relative pattern. self.assertTrue(P('ab/c.py').match('a*/*.py')) self.assertTrue(P('/d/ab/c.py').match('a*/*.py')) self.assertFalse(P('a.py').match('a*/*.py')) self.assertFalse(P('/dab/c.py').match('a*/*.py')) self.assertFalse(P('ab/c.py/d').match('a*/*.py')) - # Absolute pattern + # Absolute pattern. self.assertTrue(P('/b.py').match('/*.py')) self.assertFalse(P('b.py').match('/*.py')) self.assertFalse(P('a/b.py').match('/*.py')) self.assertFalse(P('/a/b.py').match('/*.py')) - # Multi-part absolute pattern + # Multi-part absolute pattern. self.assertTrue(P('/a/b.py').match('/a/*.py')) self.assertFalse(P('/ab.py').match('/a/*.py')) self.assertFalse(P('/a/b/c.py').match('/a/*.py')) + # Multi-part glob-style pattern. + self.assertFalse(P('/a/b/c.py').match('/**/*.py')) + self.assertTrue(P('/a/b/c.py').match('/a/**/*.py')) def test_ordering_common(self): - # Ordering is tuple-alike + # Ordering is tuple-alike. def assertLess(a, b): self.assertLess(a, b) self.assertGreater(b, a) @@ -375,15 +378,15 @@ def assertLess(a, b): P() < {} def test_parts_common(self): - # `parts` returns a tuple + # `parts` returns a tuple. sep = self.sep P = self.cls p = P('a/b') parts = p.parts self.assertEqual(parts, ('a', 'b')) - # The object gets reused + # The object gets reused. self.assertIs(parts, p.parts) - # When the path is absolute, the anchor is a separate part + # When the path is absolute, the anchor is a separate part. 
p = P('/a/b') parts = p.parts self.assertEqual(parts, (sep, 'a', 'b')) @@ -562,14 +565,14 @@ def test_with_suffix_common(self): self.assertEqual(P('/a/b').with_suffix('.gz'), P('/a/b.gz')) self.assertEqual(P('a/b.py').with_suffix('.gz'), P('a/b.gz')) self.assertEqual(P('/a/b.py').with_suffix('.gz'), P('/a/b.gz')) - # Stripping suffix + # Stripping suffix. self.assertEqual(P('a/b.py').with_suffix(''), P('a/b')) self.assertEqual(P('/a/b').with_suffix(''), P('/a/b')) - # Path doesn't have a "filename" component + # Path doesn't have a "filename" component. self.assertRaises(ValueError, P('').with_suffix, '.gz') self.assertRaises(ValueError, P('.').with_suffix, '.gz') self.assertRaises(ValueError, P('/').with_suffix, '.gz') - # Invalid suffix + # Invalid suffix. self.assertRaises(ValueError, P('a/b').with_suffix, 'gz') self.assertRaises(ValueError, P('a/b').with_suffix, '/') self.assertRaises(ValueError, P('a/b').with_suffix, '.') @@ -593,9 +596,9 @@ def test_relative_to_common(self): self.assertEqual(p.relative_to('a/'), P('b')) self.assertEqual(p.relative_to(P('a/b')), P()) self.assertEqual(p.relative_to('a/b'), P()) - # With several args + # With several args. self.assertEqual(p.relative_to('a', 'b'), P()) - # Unrelated paths + # Unrelated paths. self.assertRaises(ValueError, p.relative_to, P('c')) self.assertRaises(ValueError, p.relative_to, P('a/b/c')) self.assertRaises(ValueError, p.relative_to, P('a/c')) @@ -608,7 +611,7 @@ def test_relative_to_common(self): self.assertEqual(p.relative_to('/a/'), P('b')) self.assertEqual(p.relative_to(P('/a/b')), P()) self.assertEqual(p.relative_to('/a/b'), P()) - # Unrelated paths + # Unrelated paths. self.assertRaises(ValueError, p.relative_to, P('/c')) self.assertRaises(ValueError, p.relative_to, P('/a/b/c')) self.assertRaises(ValueError, p.relative_to, P('/a/c')) @@ -635,7 +638,7 @@ def test_root(self): P = self.cls self.assertEqual(P('/a/b').root, '/') self.assertEqual(P('///a/b').root, '/') - # POSIX special case for two leading slashes + # POSIX special case for two leading slashes. self.assertEqual(P('//a/b').root, '//') def test_eq(self): @@ -693,7 +696,7 @@ def test_join(self): self.assertEqual(pp, P('/c')) def test_div(self): - # Basically the same as joinpath() + # Basically the same as joinpath(). P = self.cls p = P('//a') pp = p / 'b' @@ -750,7 +753,7 @@ def test_eq(self): self.assertNotEqual(P('c:a/b'), P('d:a/b')) self.assertNotEqual(P('c:a/b'), P('c:/a/b')) self.assertNotEqual(P('/a/b'), P('c:/a/b')) - # Case-insensitivity + # Case-insensitivity. self.assertEqual(P('a/B'), P('A/b')) self.assertEqual(P('C:a/B'), P('c:A/b')) self.assertEqual(P('//Some/SHARE/a/B'), P('//somE/share/A/b')) @@ -773,7 +776,7 @@ def test_as_uri(self): def test_match_common(self): P = self.cls - # Absolute patterns + # Absolute patterns. self.assertTrue(P('c:/b.py').match('/*.py')) self.assertTrue(P('c:/b.py').match('c:*.py')) self.assertTrue(P('c:/b.py').match('c:/*.py')) @@ -785,18 +788,18 @@ def test_match_common(self): self.assertFalse(P('c:b.py').match('c:/*.py')) self.assertFalse(P('/b.py').match('c:*.py')) self.assertFalse(P('/b.py').match('c:/*.py')) - # UNC patterns + # UNC patterns. self.assertTrue(P('//some/share/a.py').match('/*.py')) self.assertTrue(P('//some/share/a.py').match('//some/share/*.py')) self.assertFalse(P('//other/share/a.py').match('//some/share/*.py')) self.assertFalse(P('//some/share/a/b.py').match('//some/share/*.py')) - # Case-insensitivity + # Case-insensitivity. 
self.assertTrue(P('B.py').match('b.PY')) self.assertTrue(P('c:/a/B.Py').match('C:/A/*.pY')) self.assertTrue(P('//Some/Share/B.Py').match('//somE/sharE/*.pY')) def test_ordering_common(self): - # Case-insensitivity + # Case-insensitivity. def assertOrderedEqual(a, b): self.assertLessEqual(a, b) self.assertGreaterEqual(b, a) @@ -983,12 +986,12 @@ def test_with_suffix(self): self.assertEqual(P('c:/a/b').with_suffix('.gz'), P('c:/a/b.gz')) self.assertEqual(P('c:a/b.py').with_suffix('.gz'), P('c:a/b.gz')) self.assertEqual(P('c:/a/b.py').with_suffix('.gz'), P('c:/a/b.gz')) - # Path doesn't have a "filename" component + # Path doesn't have a "filename" component. self.assertRaises(ValueError, P('').with_suffix, '.gz') self.assertRaises(ValueError, P('.').with_suffix, '.gz') self.assertRaises(ValueError, P('/').with_suffix, '.gz') self.assertRaises(ValueError, P('//My/Share').with_suffix, '.gz') - # Invalid suffix + # Invalid suffix. self.assertRaises(ValueError, P('c:a/b').with_suffix, 'gz') self.assertRaises(ValueError, P('c:a/b').with_suffix, '/') self.assertRaises(ValueError, P('c:a/b').with_suffix, '\\') @@ -1011,7 +1014,7 @@ def test_relative_to(self): self.assertEqual(p.relative_to('c:foO/'), P('Bar')) self.assertEqual(p.relative_to(P('c:foO/baR')), P()) self.assertEqual(p.relative_to('c:foO/baR'), P()) - # Unrelated paths + # Unrelated paths. self.assertRaises(ValueError, p.relative_to, P()) self.assertRaises(ValueError, p.relative_to, '') self.assertRaises(ValueError, p.relative_to, P('d:')) @@ -1033,7 +1036,7 @@ def test_relative_to(self): self.assertEqual(p.relative_to('c:/foO/'), P('Bar')) self.assertEqual(p.relative_to(P('c:/foO/baR')), P()) self.assertEqual(p.relative_to('c:/foO/baR'), P()) - # Unrelated paths + # Unrelated paths. self.assertRaises(ValueError, p.relative_to, P('C:/Baz')) self.assertRaises(ValueError, p.relative_to, P('C:/Foo/Bar/Baz')) self.assertRaises(ValueError, p.relative_to, P('C:/Foo/Baz')) @@ -1043,7 +1046,7 @@ def test_relative_to(self): self.assertRaises(ValueError, p.relative_to, P('/')) self.assertRaises(ValueError, p.relative_to, P('/Foo')) self.assertRaises(ValueError, p.relative_to, P('//C/Foo')) - # UNC paths + # UNC paths. p = P('//Server/Share/Foo/Bar') self.assertEqual(p.relative_to(P('//sErver/sHare')), P('Foo/Bar')) self.assertEqual(p.relative_to('//sErver/sHare'), P('Foo/Bar')) @@ -1053,7 +1056,7 @@ def test_relative_to(self): self.assertEqual(p.relative_to('//sErver/sHare/Foo/'), P('Bar')) self.assertEqual(p.relative_to(P('//sErver/sHare/Foo/Bar')), P()) self.assertEqual(p.relative_to('//sErver/sHare/Foo/Bar'), P()) - # Unrelated paths + # Unrelated paths. self.assertRaises(ValueError, p.relative_to, P('/Server/Share/Foo')) self.assertRaises(ValueError, p.relative_to, P('c:/Server/Share/Foo')) self.assertRaises(ValueError, p.relative_to, P('//z/Share/Foo')) @@ -1061,7 +1064,7 @@ def test_relative_to(self): def test_is_absolute(self): P = self.cls - # Under NT, only paths with both a drive and a root are absolute + # Under NT, only paths with both a drive and a root are absolute. self.assertFalse(P().is_absolute()) self.assertFalse(P('a').is_absolute()) self.assertFalse(P('a/b/').is_absolute()) @@ -1074,7 +1077,7 @@ def test_is_absolute(self): self.assertTrue(P('c:/').is_absolute()) self.assertTrue(P('c:/a').is_absolute()) self.assertTrue(P('c:/a/b/').is_absolute()) - # UNC paths are absolute by definition + # UNC paths are absolute by definition. 
self.assertTrue(P('//a/b').is_absolute()) self.assertTrue(P('//a/b/').is_absolute()) self.assertTrue(P('//a/b/c').is_absolute()) @@ -1103,7 +1106,7 @@ def test_join(self): self.assertEqual(pp, P('C:/x/y')) def test_div(self): - # Basically the same as joinpath() + # Basically the same as joinpath(). P = self.cls p = P('C:/a/b') self.assertEqual(p / 'x/y', P('C:/a/b/x/y')) @@ -1136,9 +1139,9 @@ def test_is_reserved(self): self.assertIs(True, P('lpt1').is_reserved()) self.assertIs(True, P('lpt9.bar').is_reserved()) self.assertIs(False, P('bar.lpt9').is_reserved()) - # Only the last component matters + # Only the last component matters. self.assertIs(False, P('c:/NUL/con/baz').is_reserved()) - # UNC paths are never reserved + # UNC paths are never reserved. self.assertIs(False, P('//my/share/nul/con/aux').is_reserved()) class PurePathTest(_BasePurePathTest, unittest.TestCase): @@ -1168,10 +1171,10 @@ def test_different_flavours_unordered(self): # -# Tests for the concrete classes +# Tests for the concrete classes. # -# Make sure any symbolic links in the base test path are resolved +# Make sure any symbolic links in the base test path are resolved. BASE = os.path.realpath(TESTFN) join = lambda *x: os.path.join(BASE, *x) rel_join = lambda *x: os.path.join(TESTFN, *x) @@ -1242,16 +1245,16 @@ def cleanup(): f.write(b"this is file D\n") os.chmod(join('dirE'), 0) if support.can_symlink(): - # Relative symlinks + # Relative symlinks. os.symlink('fileA', join('linkA')) os.symlink('non-existing', join('brokenLink')) self.dirlink('dirB', join('linkB')) self.dirlink(os.path.join('..', 'dirB'), join('dirA', 'linkC')) - # This one goes upwards, creating a loop + # This one goes upwards, creating a loop. self.dirlink(os.path.join('..', 'dirB'), join('dirB', 'linkD')) if os.name == 'nt': - # Workaround for http://bugs.python.org/issue13772 + # Workaround for http://bugs.python.org/issue13772. def dirlink(self, src, dest): os.symlink(src, dest, target_is_directory=True) else: @@ -1361,7 +1364,7 @@ def test_read_write_bytes(self): p = self.cls(BASE) (p / 'fileA').write_bytes(b'abcdefg') self.assertEqual((p / 'fileA').read_bytes(), b'abcdefg') - # check that trying to write str does not truncate the file + # Check that trying to write str does not truncate the file. self.assertRaises(TypeError, (p / 'fileA').write_bytes, 'somestr') self.assertEqual((p / 'fileA').read_bytes(), b'abcdefg') @@ -1370,7 +1373,7 @@ def test_read_write_text(self): (p / 'fileA').write_text('?bcdefg', encoding='latin-1') self.assertEqual((p / 'fileA').read_text( encoding='utf-8', errors='ignore'), 'bcdefg') - # check that trying to write bytes does not truncate the file + # Check that trying to write bytes does not truncate the file. self.assertRaises(TypeError, (p / 'fileA').write_text, b'somebytes') self.assertEqual((p / 'fileA').read_text(encoding='latin-1'), '?bcdefg') @@ -1386,7 +1389,7 @@ def test_iterdir(self): @support.skip_unless_symlink def test_iterdir_symlink(self): - # __iter__ on a symlink to a directory + # __iter__ on a symlink to a directory. P = self.cls p = P(BASE, 'linkB') paths = set(p.iterdir()) @@ -1394,12 +1397,12 @@ def test_iterdir_symlink(self): self.assertEqual(paths, expected) def test_iterdir_nodir(self): - # __iter__ on something that is not a directory + # __iter__ on something that is not a directory. p = self.cls(BASE, 'fileA') with self.assertRaises(OSError) as cm: next(p.iterdir()) # ENOENT or EINVAL under Windows, ENOTDIR otherwise - # (see issue #12802) + # (see issue #12802). 
self.assertIn(cm.exception.errno, (errno.ENOTDIR, errno.ENOENT, errno.EINVAL)) @@ -1450,7 +1453,7 @@ def _check(glob, expected): @support.skip_unless_symlink def test_rglob_symlink_loop(self): - # Don't get fooled by symlink loops (Issue #26012) + # Don't get fooled by symlink loops (Issue #26012). P = self.cls p = P(BASE) given = set(p.rglob('*')) @@ -1466,7 +1469,7 @@ def test_rglob_symlink_loop(self): self.assertEqual(given, {p / x for x in expect}) def test_glob_dotdot(self): - # ".." is not special in globs + # ".." is not special in globs. P = self.cls p = P(BASE) self.assertEqual(set(p.glob("..")), { P(BASE, "..") }) @@ -1478,7 +1481,7 @@ def _check_resolve(self, p, expected, strict=True): q = p.resolve(strict) self.assertEqual(q, expected) - # this can be used to check both relative and absolute resolutions + # This can be used to check both relative and absolute resolutions. _check_resolve_relative = _check_resolve_absolute = _check_resolve @support.skip_unless_symlink @@ -1497,7 +1500,7 @@ def test_resolve_common(self): p = P(BASE, '..', 'foo', 'in', 'spam') self.assertEqual(str(p.resolve(strict=False)), os.path.abspath(os.path.join('foo', 'in', 'spam'))) - # These are all relative symlinks + # These are all relative symlinks. p = P(BASE, 'dirB', 'fileB') self._check_resolve_relative(p, p) p = P(BASE, 'linkA') @@ -1520,7 +1523,7 @@ def test_resolve_common(self): # In Posix, if linkY points to dirB, 'dirA/linkY/..' # resolves to 'dirB/..' first before resolving to parent of dirB. self._check_resolve_relative(p, P(BASE, 'foo', 'in', 'spam'), False) - # Now create absolute symlinks + # Now create absolute symlinks. d = support._longpath(tempfile.mkdtemp(suffix='-dirD', dir=os.getcwd())) self.addCleanup(support.rmtree, d) os.symlink(os.path.join(d), join('dirA', 'linkX')) @@ -1562,7 +1565,7 @@ def test_with(self): next(it2) with p: pass - # I/O operation on closed path + # I/O operation on closed path. self.assertRaises(ValueError, next, it) self.assertRaises(ValueError, next, it2) self.assertRaises(ValueError, p.open) @@ -1573,22 +1576,22 @@ def test_with(self): def test_chmod(self): p = self.cls(BASE) / 'fileA' mode = p.stat().st_mode - # Clear writable bit + # Clear writable bit. new_mode = mode & ~0o222 p.chmod(new_mode) self.assertEqual(p.stat().st_mode, new_mode) - # Set writable bit + # Set writable bit. new_mode = mode | 0o222 p.chmod(new_mode) self.assertEqual(p.stat().st_mode, new_mode) - # XXX also need a test for lchmod + # XXX also need a test for lchmod. def test_stat(self): p = self.cls(BASE) / 'fileA' st = p.stat() self.assertEqual(p.stat(), st) - # Change file mode by flipping write bit + # Change file mode by flipping write bit. p.chmod(st.st_mode ^ 0o222) self.addCleanup(p.chmod, st.st_mode) self.assertNotEqual(p.stat(), st) @@ -1644,12 +1647,12 @@ def test_rename(self): P = self.cls(BASE) p = P / 'fileA' size = p.stat().st_size - # Renaming to another path + # Renaming to another path. q = P / 'dirA' / 'fileAA' p.rename(q) self.assertEqual(q.stat().st_size, size) self.assertFileNotFound(p.stat) - # Renaming to a str of a relative path + # Renaming to a str of a relative path. r = rel_join('fileAAA') q.rename(r) self.assertEqual(os.stat(r).st_size, size) @@ -1659,12 +1662,12 @@ def test_replace(self): P = self.cls(BASE) p = P / 'fileA' size = p.stat().st_size - # Replacing a non-existing path + # Replacing a non-existing path. 
q = P / 'dirA' / 'fileAA' p.replace(q) self.assertEqual(q.stat().st_size, size) self.assertFileNotFound(p.stat) - # Replacing another (existing) path + # Replacing another (existing) path. r = rel_join('dirB', 'fileB') q.replace(r) self.assertEqual(os.stat(r).st_size, size) @@ -1682,12 +1685,12 @@ def test_touch_common(self): # Rewind the mtime sufficiently far in the past to work around # filesystem-specific timestamp granularity. os.utime(str(p), (old_mtime - 10, old_mtime - 10)) - # The file mtime should be refreshed by calling touch() again + # The file mtime should be refreshed by calling touch() again. p.touch() st = p.stat() self.assertGreaterEqual(st.st_mtime_ns, old_mtime_ns) self.assertGreaterEqual(st.st_mtime, old_mtime) - # Now with exist_ok=False + # Now with exist_ok=False. p = P / 'newfileB' self.assertFalse(p.exists()) p.touch(mode=0o700, exist_ok=False) @@ -1713,7 +1716,7 @@ def test_mkdir(self): self.assertEqual(cm.exception.errno, errno.EEXIST) def test_mkdir_parents(self): - # Creating a chain of directories + # Creating a chain of directories. p = self.cls(BASE, 'newdirB', 'newdirC') self.assertFalse(p.exists()) with self.assertRaises(OSError) as cm: @@ -1725,16 +1728,16 @@ def test_mkdir_parents(self): with self.assertRaises(OSError) as cm: p.mkdir(parents=True) self.assertEqual(cm.exception.errno, errno.EEXIST) - # test `mode` arg - mode = stat.S_IMODE(p.stat().st_mode) # default mode + # Test `mode` arg. + mode = stat.S_IMODE(p.stat().st_mode) # Default mode. p = self.cls(BASE, 'newdirD', 'newdirE') p.mkdir(0o555, parents=True) self.assertTrue(p.exists()) self.assertTrue(p.is_dir()) if os.name != 'nt': - # the directory's permissions follow the mode argument + # The directory's permissions follow the mode argument. self.assertEqual(stat.S_IMODE(p.stat().st_mode), 0o7555 & mode) - # the parent's permissions follow the default process settings + # The parent's permissions follow the default process settings. self.assertEqual(stat.S_IMODE(p.parent.stat().st_mode), mode) def test_mkdir_exist_ok(self): @@ -1767,11 +1770,11 @@ def test_mkdir_exist_ok_with_parent(self): self.assertEqual(p.stat().st_ctime, st_ctime_first) def test_mkdir_exist_ok_root(self): - # Issue #25803: A drive root could raise PermissionError on Windows + # Issue #25803: A drive root could raise PermissionError on Windows. self.cls('/').resolve().mkdir(exist_ok=True) self.cls('/').resolve().mkdir(parents=True, exist_ok=True) - @only_nt # XXX: not sure how to test this on POSIX + @only_nt # XXX: not sure how to test this on POSIX. def test_mkdir_with_unknown_drive(self): for d in 'ZYXWVUTSRQPONMLKJIHGFEDCBA': p = self.cls(d + ':\\') @@ -1819,9 +1822,9 @@ def my_mkdir(path, mode=0o777): # function is called at most 5 times (dirCPC/dir1/dir2, # dirCPC/dir1, dirCPC, dirCPC/dir1, dirCPC/dir1/dir2). if pattern.pop(): - os.mkdir(path, mode) # from another process + os.mkdir(path, mode) # From another process. concurrently_created.add(path) - os.mkdir(path, mode) # our real call + os.mkdir(path, mode) # Our real call. pattern = [bool(pattern_num & (1 << n)) for n in range(5)] concurrently_created = set() @@ -1839,18 +1842,18 @@ def my_mkdir(path, mode=0o777): def test_symlink_to(self): P = self.cls(BASE) target = P / 'fileA' - # Symlinking a path target + # Symlinking a path target. link = P / 'dirA' / 'linkAA' link.symlink_to(target) self.assertEqual(link.stat(), target.stat()) self.assertNotEqual(link.lstat(), target.stat()) - # Symlinking a str target + # Symlinking a str target. 
link = P / 'dirA' / 'linkAAA' link.symlink_to(str(target)) self.assertEqual(link.stat(), target.stat()) self.assertNotEqual(link.lstat(), target.stat()) self.assertFalse(link.is_dir()) - # Symlinking to a directory + # Symlinking to a directory. target = P / 'dirB' link = P / 'dirA' / 'linkAAAA' link.symlink_to(target, target_is_directory=True) @@ -1888,7 +1891,7 @@ def test_is_file(self): @only_posix def test_is_mount(self): P = self.cls(BASE) - R = self.cls('/') # TODO: Work out windows + R = self.cls('/') # TODO: Work out Windows. self.assertFalse((P / 'fileA').is_mount()) self.assertFalse((P / 'dirA').is_mount()) self.assertFalse((P / 'non-existing').is_mount()) @@ -1982,7 +1985,7 @@ def test_is_char_device_false(self): self.assertIs((P / 'fileA\x00').is_char_device(), False) def test_is_char_device_true(self): - # Under Unix, /dev/null should generally be a char device + # Under Unix, /dev/null should generally be a char device. P = self.cls('/dev/null') if not P.exists(): self.skipTest("/dev/null required") @@ -2009,14 +2012,14 @@ def test_parts_interning(self): self.assertIs(p.parts[2], q.parts[3]) def _check_complex_symlinks(self, link0_target): - # Test solving a non-looping chain of symlinks (issue #19887) + # Test solving a non-looping chain of symlinks (issue #19887). P = self.cls(BASE) self.dirlink(os.path.join('link0', 'link0'), join('link1')) self.dirlink(os.path.join('link1', 'link1'), join('link2')) self.dirlink(os.path.join('link2', 'link2'), join('link3')) self.dirlink(link0_target, join('link0')) - # Resolve absolute paths + # Resolve absolute paths. p = (P / 'link0').resolve() self.assertEqual(p, P) self.assertEqual(str(p), BASE) @@ -2030,7 +2033,7 @@ def _check_complex_symlinks(self, link0_target): self.assertEqual(p, P) self.assertEqual(str(p), BASE) - # Resolve relative paths + # Resolve relative paths. old_path = os.getcwd() os.chdir(BASE) try: @@ -2122,7 +2125,7 @@ def test_touch_mode(self): @support.skip_unless_symlink def test_resolve_loop(self): - # Loops with relative symlinks + # Loops with relative symlinks. os.symlink('linkX/inside', join('linkX')) self._check_symlink_loop(BASE, 'linkX') os.symlink('linkY', join('linkY')) @@ -2131,7 +2134,7 @@ def test_resolve_loop(self): self._check_symlink_loop(BASE, 'linkZ') # Non-strict self._check_symlink_loop(BASE, 'linkZ', 'foo', strict=False) - # Loops with absolute symlinks + # Loops with absolute symlinks. os.symlink(join('linkU/inside'), join('linkU')) self._check_symlink_loop(BASE, 'linkU') os.symlink(join('linkV'), join('linkV')) @@ -2166,7 +2169,7 @@ def test_expanduser(self): pwdent = pwd.getpwuid(os.getuid()) username = pwdent.pw_name userhome = pwdent.pw_dir.rstrip('/') or '/' - # find arbitrary different user (if exists) + # Find arbitrary different user (if exists). for pwdent in pwd.getpwall(): othername = pwdent.pw_name otherhome = pwdent.pw_dir.rstrip('/') @@ -2279,11 +2282,11 @@ def check(): self.assertEqual(p5.expanduser(), p5) self.assertEqual(p6.expanduser(), p6) - # test the first lookup key in the env vars + # Test the first lookup key in the env vars. env['HOME'] = 'C:\\Users\\alice' check() - # test that HOMEPATH is available instead + # Test that HOMEPATH is available instead. 
env.pop('HOME', None) env['HOMEPATH'] = 'C:\\Users\\alice' check() diff --git a/Misc/NEWS.d/next/Tests/2019-01-04-21-34-53.bpo-35488.U7JJzP.rst b/Misc/NEWS.d/next/Tests/2019-01-04-21-34-53.bpo-35488.U7JJzP.rst new file mode 100644 index 000000000000..815754ef561f --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2019-01-04-21-34-53.bpo-35488.U7JJzP.rst @@ -0,0 +1 @@ +Add a test to pathlib's Path.match() to verify it does not support glob-style ** recursive pattern matching. \ No newline at end of file From webhook-mailer at python.org Sun Jan 6 15:55:55 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Sun, 06 Jan 2019 20:55:55 -0000 Subject: [Python-checkins] bpo-35660: Fix imports in idlelib.window (#11434) Message-ID: https://github.com/python/cpython/commit/11303dd6035a7d7f78025ce5a3e3b9bdf7380c9a commit: 11303dd6035a7d7f78025ce5a3e3b9bdf7380c9a branch: master author: Cheryl Sabella committer: Terry Jan Reedy date: 2019-01-06T15:55:52-05:00 summary: bpo-35660: Fix imports in idlelib.window (#11434) * bpo-35660: IDLE: Remove * import from window.py * sys was being imported through the *, so also added an import sys. * Update 2019-01-04-19-14-29.bpo-35660.hMxI7N.rst Anyone who wants details can check the issue, where I added the point about the sys import bug. files: A Misc/NEWS.d/next/IDLE/2019-01-04-19-14-29.bpo-35660.hMxI7N.rst M Lib/idlelib/window.py diff --git a/Lib/idlelib/window.py b/Lib/idlelib/window.py index b2488b28cabe..460d5b67948d 100644 --- a/Lib/idlelib/window.py +++ b/Lib/idlelib/window.py @@ -1,4 +1,5 @@ -from tkinter import * +from tkinter import Toplevel, TclError +import sys class WindowList: diff --git a/Misc/NEWS.d/next/IDLE/2019-01-04-19-14-29.bpo-35660.hMxI7N.rst b/Misc/NEWS.d/next/IDLE/2019-01-04-19-14-29.bpo-35660.hMxI7N.rst new file mode 100644 index 000000000000..1ad83fe29e14 --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2019-01-04-19-14-29.bpo-35660.hMxI7N.rst @@ -0,0 +1 @@ +Fix imports in idlelib.window. From webhook-mailer at python.org Sun Jan 6 16:13:34 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sun, 06 Jan 2019 21:13:34 -0000 Subject: [Python-checkins] bpo-35660: Fix imports in idlelib.window (GH-11434) Message-ID: https://github.com/python/cpython/commit/be37dbff1ce3fa2ddd9115352b31ac0f6b44c193 commit: be37dbff1ce3fa2ddd9115352b31ac0f6b44c193 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-06T13:13:30-08:00 summary: bpo-35660: Fix imports in idlelib.window (GH-11434) * bpo-35660: IDLE: Remove * import from window.py * sys was being imported through the *, so also added an import sys. * Update 2019-01-04-19-14-29.bpo-35660.hMxI7N.rst Anyone who wants details can check the issue, where I added the point about the sys import bug. 
(cherry picked from commit 11303dd6035a7d7f78025ce5a3e3b9bdf7380c9a) Co-authored-by: Cheryl Sabella files: A Misc/NEWS.d/next/IDLE/2019-01-04-19-14-29.bpo-35660.hMxI7N.rst M Lib/idlelib/window.py diff --git a/Lib/idlelib/window.py b/Lib/idlelib/window.py index b2488b28cabe..460d5b67948d 100644 --- a/Lib/idlelib/window.py +++ b/Lib/idlelib/window.py @@ -1,4 +1,5 @@ -from tkinter import * +from tkinter import Toplevel, TclError +import sys class WindowList: diff --git a/Misc/NEWS.d/next/IDLE/2019-01-04-19-14-29.bpo-35660.hMxI7N.rst b/Misc/NEWS.d/next/IDLE/2019-01-04-19-14-29.bpo-35660.hMxI7N.rst new file mode 100644 index 000000000000..1ad83fe29e14 --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2019-01-04-19-14-29.bpo-35660.hMxI7N.rst @@ -0,0 +1 @@ +Fix imports in idlelib.window. From webhook-mailer at python.org Sun Jan 6 17:10:58 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Sun, 06 Jan 2019 22:10:58 -0000 Subject: [Python-checkins] test_threading_local: add missing "import sys" (GH-8049) Message-ID: https://github.com/python/cpython/commit/a0bb51e44cd43a7d2836a96a3804162203e44514 commit: a0bb51e44cd43a7d2836a96a3804162203e44514 branch: master author: cclauss committer: Victor Stinner date: 2019-01-06T23:10:55+01:00 summary: test_threading_local: add missing "import sys" (GH-8049) files: M Lib/test/test_threading_local.py diff --git a/Lib/test/test_threading_local.py b/Lib/test/test_threading_local.py index 984f8dda3743..2fd14ae2e16f 100644 --- a/Lib/test/test_threading_local.py +++ b/Lib/test/test_threading_local.py @@ -1,3 +1,4 @@ +import sys import unittest from doctest import DocTestSuite from test import support From webhook-mailer at python.org Sun Jan 6 17:32:55 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sun, 06 Jan 2019 22:32:55 -0000 Subject: [Python-checkins] test_threading_local: add missing "import sys" (GH-8049) Message-ID: https://github.com/python/cpython/commit/65ed9f31cf2239b4d41d959b270f5d5dd0adcdbe commit: 65ed9f31cf2239b4d41d959b270f5d5dd0adcdbe branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-06T14:32:52-08:00 summary: test_threading_local: add missing "import sys" (GH-8049) (cherry picked from commit a0bb51e44cd43a7d2836a96a3804162203e44514) Co-authored-by: cclauss files: M Lib/test/test_threading_local.py diff --git a/Lib/test/test_threading_local.py b/Lib/test/test_threading_local.py index 984f8dda3743..2fd14ae2e16f 100644 --- a/Lib/test/test_threading_local.py +++ b/Lib/test/test_threading_local.py @@ -1,3 +1,4 @@ +import sys import unittest from doctest import DocTestSuite from test import support From webhook-mailer at python.org Mon Jan 7 02:25:13 2019 From: webhook-mailer at python.org (Benjamin Peterson) Date: Mon, 07 Jan 2019 07:25:13 -0000 Subject: [Python-checkins] closes bpo-35643: Fix a SyntaxWarning: invalid escape sequence in Modules/_sha3/cleanup.py (GH-11413) Message-ID: https://github.com/python/cpython/commit/3e0144a6c95433f347bb7e44a98c84deae8853ff commit: 3e0144a6c95433f347bb7e44a98c84deae8853ff branch: 3.6 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Benjamin Peterson date: 2019-01-06T23:25:06-08:00 summary: closes bpo-35643: Fix a SyntaxWarning: invalid escape sequence in Modules/_sha3/cleanup.py (GH-11413) (cherry picked from commit d466c43e55cd32af84e353f0e9a48b09b7534f61) Co-authored-by: Micka?l Schoentgen files: A Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst 
M Modules/_sha3/cleanup.py diff --git a/Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst b/Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst new file mode 100644 index 000000000000..0b47bb61fc05 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-02-20-04-49.bpo-35643.DaMiaV.rst @@ -0,0 +1,2 @@ +Fixed a SyntaxWarning: invalid escape sequence in Modules/_sha3/cleanup.py. +Patch by Micka?l Schoentgen. diff --git a/Modules/_sha3/cleanup.py b/Modules/_sha3/cleanup.py index 17c56b34b261..4f53681b49e6 100755 --- a/Modules/_sha3/cleanup.py +++ b/Modules/_sha3/cleanup.py @@ -8,7 +8,7 @@ import re CPP1 = re.compile("^//(.*)") -CPP2 = re.compile("\ //(.*)") +CPP2 = re.compile(r"\ //(.*)") STATICS = ("void ", "int ", "HashReturn ", "const UINT64 ", "UINT16 ", " int prefix##") From solipsis at pitrou.net Mon Jan 7 04:07:40 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 07 Jan 2019 09:07:40 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=4 Message-ID: <20190107090740.1.8F1A000210A3D5B5@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_functools leaked [0, 3, 1] memory blocks, sum=4 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogix9lVa', '--timeout', '7200'] From webhook-mailer at python.org Mon Jan 7 10:09:18 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Mon, 07 Jan 2019 15:09:18 -0000 Subject: [Python-checkins] bpo-35560: Remove assertion from format(float, "n") (GH-11288) Message-ID: https://github.com/python/cpython/commit/3f7983a25a3d19779283c707fbdd5bc91b1587ef commit: 3f7983a25a3d19779283c707fbdd5bc91b1587ef branch: master author: Xtreak committer: Victor Stinner date: 2019-01-07T16:09:14+01:00 summary: bpo-35560: Remove assertion from format(float, "n") (GH-11288) Fix an assertion error in format() in debug build for floating point formatting with "n" format, zero padding and small width. Release build is not impacted. Patch by Karthikeyan Singaravelan. 
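(Editorial note, not part of the commit: the calls below are adapted from the
new test_issue35560 cases in Lib/test/test_float.py and show the zero-padding
formatting paths involved. On a debug build before this fix such calls could
trip the assertion; release builds already produced these results.)

    print(format(123.34, '00f'))     # '123.340000'
    print(format(123.34, '00.10e'))  # '1.2334000000e+02'
    print(format(-123.34, '00g'))    # '-123.34'
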
files: A Misc/NEWS.d/next/Core and Builtins/2018-12-22-22-19-51.bpo-35560.9vMWSP.rst M Lib/test/test_float.py M Objects/unicodeobject.c diff --git a/Lib/test/test_float.py b/Lib/test/test_float.py index 06ea90c207f5..49c1fbcd4ef8 100644 --- a/Lib/test/test_float.py +++ b/Lib/test/test_float.py @@ -701,6 +701,25 @@ def test_issue5864(self): self.assertEqual(format(1234.56, '.4'), '1.235e+03') self.assertEqual(format(12345.6, '.4'), '1.235e+04') + def test_issue35560(self): + self.assertEqual(format(123.0, '00'), '123.0') + self.assertEqual(format(123.34, '00f'), '123.340000') + self.assertEqual(format(123.34, '00e'), '1.233400e+02') + self.assertEqual(format(123.34, '00g'), '123.34') + self.assertEqual(format(123.34, '00.10f'), '123.3400000000') + self.assertEqual(format(123.34, '00.10e'), '1.2334000000e+02') + self.assertEqual(format(123.34, '00.10g'), '123.34') + self.assertEqual(format(123.34, '01f'), '123.340000') + + self.assertEqual(format(-123.0, '00'), '-123.0') + self.assertEqual(format(-123.34, '00f'), '-123.340000') + self.assertEqual(format(-123.34, '00e'), '-1.233400e+02') + self.assertEqual(format(-123.34, '00g'), '-123.34') + self.assertEqual(format(-123.34, '00.10f'), '-123.3400000000') + self.assertEqual(format(-123.34, '00.10f'), '-123.3400000000') + self.assertEqual(format(-123.34, '00.10e'), '-1.2334000000e+02') + self.assertEqual(format(-123.34, '00.10g'), '-123.34') + class ReprTestCase(unittest.TestCase): def test_repr(self): floats_file = open(os.path.join(os.path.split(__file__)[0], diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-12-22-22-19-51.bpo-35560.9vMWSP.rst b/Misc/NEWS.d/next/Core and Builtins/2018-12-22-22-19-51.bpo-35560.9vMWSP.rst new file mode 100644 index 000000000000..01458f11088e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-12-22-22-19-51.bpo-35560.9vMWSP.rst @@ -0,0 +1,3 @@ +Fix an assertion error in :func:`format` in debug build for floating point +formatting with "n" format, zero padding and small width. Release build is +not impacted. Patch by Karthikeyan Singaravelan. diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index f1dcfe9ab72a..304ea7471f49 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -9381,6 +9381,7 @@ _PyUnicode_InsertThousandsGrouping( PyObject *thousands_sep, Py_UCS4 *maxchar) { + min_width = Py_MAX(0, min_width); if (writer) { assert(digits != NULL); assert(maxchar == NULL); @@ -9391,7 +9392,6 @@ _PyUnicode_InsertThousandsGrouping( } assert(0 <= d_pos); assert(0 <= n_digits); - assert(0 <= min_width); assert(grouping != NULL); if (digits != NULL) { From webhook-mailer at python.org Mon Jan 7 10:26:24 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Mon, 07 Jan 2019 15:26:24 -0000 Subject: [Python-checkins] bpo-35560: Remove assertion from format(float, "n") (GH-11288) Message-ID: https://github.com/python/cpython/commit/9a413faa8708e5c2df509e97d48f18685c198b24 commit: 9a413faa8708e5c2df509e97d48f18685c198b24 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-07T07:26:20-08:00 summary: bpo-35560: Remove assertion from format(float, "n") (GH-11288) Fix an assertion error in format() in debug build for floating point formatting with "n" format, zero padding and small width. Release build is not impacted. Patch by Karthikeyan Singaravelan. 
(cherry picked from commit 3f7983a25a3d19779283c707fbdd5bc91b1587ef) Co-authored-by: Xtreak files: A Misc/NEWS.d/next/Core and Builtins/2018-12-22-22-19-51.bpo-35560.9vMWSP.rst M Lib/test/test_float.py M Objects/unicodeobject.c diff --git a/Lib/test/test_float.py b/Lib/test/test_float.py index 06ea90c207f5..49c1fbcd4ef8 100644 --- a/Lib/test/test_float.py +++ b/Lib/test/test_float.py @@ -701,6 +701,25 @@ def test_issue5864(self): self.assertEqual(format(1234.56, '.4'), '1.235e+03') self.assertEqual(format(12345.6, '.4'), '1.235e+04') + def test_issue35560(self): + self.assertEqual(format(123.0, '00'), '123.0') + self.assertEqual(format(123.34, '00f'), '123.340000') + self.assertEqual(format(123.34, '00e'), '1.233400e+02') + self.assertEqual(format(123.34, '00g'), '123.34') + self.assertEqual(format(123.34, '00.10f'), '123.3400000000') + self.assertEqual(format(123.34, '00.10e'), '1.2334000000e+02') + self.assertEqual(format(123.34, '00.10g'), '123.34') + self.assertEqual(format(123.34, '01f'), '123.340000') + + self.assertEqual(format(-123.0, '00'), '-123.0') + self.assertEqual(format(-123.34, '00f'), '-123.340000') + self.assertEqual(format(-123.34, '00e'), '-1.233400e+02') + self.assertEqual(format(-123.34, '00g'), '-123.34') + self.assertEqual(format(-123.34, '00.10f'), '-123.3400000000') + self.assertEqual(format(-123.34, '00.10f'), '-123.3400000000') + self.assertEqual(format(-123.34, '00.10e'), '-1.2334000000e+02') + self.assertEqual(format(-123.34, '00.10g'), '-123.34') + class ReprTestCase(unittest.TestCase): def test_repr(self): floats_file = open(os.path.join(os.path.split(__file__)[0], diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-12-22-22-19-51.bpo-35560.9vMWSP.rst b/Misc/NEWS.d/next/Core and Builtins/2018-12-22-22-19-51.bpo-35560.9vMWSP.rst new file mode 100644 index 000000000000..01458f11088e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-12-22-22-19-51.bpo-35560.9vMWSP.rst @@ -0,0 +1,3 @@ +Fix an assertion error in :func:`format` in debug build for floating point +formatting with "n" format, zero padding and small width. Release build is +not impacted. Patch by Karthikeyan Singaravelan. 
diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index d46ab2a1e2ab..35c8a24b7c0c 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -9356,6 +9356,7 @@ _PyUnicode_InsertThousandsGrouping( PyObject *thousands_sep, Py_UCS4 *maxchar) { + min_width = Py_MAX(0, min_width); if (writer) { assert(digits != NULL); assert(maxchar == NULL); @@ -9366,7 +9367,6 @@ _PyUnicode_InsertThousandsGrouping( } assert(0 <= d_pos); assert(0 <= n_digits); - assert(0 <= min_width); assert(grouping != NULL); if (digits != NULL) { From webhook-mailer at python.org Mon Jan 7 11:38:45 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Mon, 07 Jan 2019 16:38:45 -0000 Subject: [Python-checkins] bpo-35664: Optimize operator.itemgetter (GH-11435) Message-ID: https://github.com/python/cpython/commit/2d53bed79c1953390f85b191c72855e457e09305 commit: 2d53bed79c1953390f85b191c72855e457e09305 branch: master author: Raymond Hettinger committer: GitHub date: 2019-01-07T09:38:41-07:00 summary: bpo-35664: Optimize operator.itemgetter (GH-11435) files: A Misc/NEWS.d/next/Library/2019-01-04-22-18-25.bpo-35664.Z-Gyyj.rst M Lib/test/test_operator.py M Modules/_operator.c diff --git a/Lib/test/test_operator.py b/Lib/test/test_operator.py index 6254091e78fe..f46d94a22671 100644 --- a/Lib/test/test_operator.py +++ b/Lib/test/test_operator.py @@ -401,6 +401,19 @@ def __getitem__(self, name): self.assertEqual(operator.itemgetter(2,10,5)(data), ('2', '10', '5')) self.assertRaises(TypeError, operator.itemgetter(2, 'x', 5), data) + # interesting indices + t = tuple('abcde') + self.assertEqual(operator.itemgetter(-1)(t), 'e') + self.assertEqual(operator.itemgetter(slice(2, 4))(t), ('c', 'd')) + + # interesting sequences + class T(tuple): + 'Tuple subclass' + pass + self.assertEqual(operator.itemgetter(0)(T('abc')), 'a') + self.assertEqual(operator.itemgetter(0)(['a', 'b', 'c']), 'a') + self.assertEqual(operator.itemgetter(0)(range(100, 200)), 100) + def test_methodcaller(self): operator = self.module self.assertRaises(TypeError, operator.methodcaller) diff --git a/Misc/NEWS.d/next/Library/2019-01-04-22-18-25.bpo-35664.Z-Gyyj.rst b/Misc/NEWS.d/next/Library/2019-01-04-22-18-25.bpo-35664.Z-Gyyj.rst new file mode 100644 index 000000000000..f4acc5ae57e7 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-04-22-18-25.bpo-35664.Z-Gyyj.rst @@ -0,0 +1,4 @@ +Improve operator.itemgetter() performance by 33% with optimized argument +handling and with adding a fast path for the common case of a single +non-negative integer index into a tuple (which is the typical use case in +the standard library). 
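(Editorial aside, not part of the patch: the fast path described above applies
to the most common calling pattern, a single non-negative integer index applied
to a tuple; negative indices, slices and other sequence types keep using the
generic item lookup, as the new tests verify. The names below are illustrative.)

    from operator import itemgetter

    pairs = [('b', 2), ('a', 1), ('c', 3)]
    print(sorted(pairs, key=itemgetter(0)))                     # fast path: int index into a tuple
    print(itemgetter(-1)(('a', 'b', 'c', 'd', 'e')))            # 'e'  (generic path)
    print(itemgetter(slice(2, 4))(('a', 'b', 'c', 'd', 'e')))   # ('c', 'd')
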
diff --git a/Modules/_operator.c b/Modules/_operator.c index 3bf8c1276d7b..d6c6a18d81b4 100644 --- a/Modules/_operator.c +++ b/Modules/_operator.c @@ -937,6 +937,7 @@ typedef struct { PyObject_HEAD Py_ssize_t nitems; PyObject *item; + Py_ssize_t index; // -1 unless *item* is a single non-negative integer index } itemgetterobject; static PyTypeObject itemgetter_type; @@ -948,6 +949,7 @@ itemgetter_new(PyTypeObject *type, PyObject *args, PyObject *kwds) itemgetterobject *ig; PyObject *item; Py_ssize_t nitems; + Py_ssize_t index; if (!_PyArg_NoKeywords("itemgetter", kwds)) return NULL; @@ -967,6 +969,21 @@ itemgetter_new(PyTypeObject *type, PyObject *args, PyObject *kwds) Py_INCREF(item); ig->item = item; ig->nitems = nitems; + ig->index = -1; + if (PyLong_CheckExact(item)) { + index = PyLong_AsSsize_t(item); + if (index < 0) { + /* If we get here, then either the index conversion failed + * due to being out of range, or the index was a negative + * integer. Either way, we clear any possible exception + * and fall back to the slow path, where ig->index is -1. + */ + PyErr_Clear(); + } + else { + ig->index = index; + } + } PyObject_GC_Track(ig); return (PyObject *)ig; @@ -993,12 +1010,27 @@ itemgetter_call(itemgetterobject *ig, PyObject *args, PyObject *kw) PyObject *obj, *result; Py_ssize_t i, nitems=ig->nitems; - if (!_PyArg_NoKeywords("itemgetter", kw)) - return NULL; - if (!PyArg_UnpackTuple(args, "itemgetter", 1, 1, &obj)) - return NULL; - if (nitems == 1) + assert(PyTuple_CheckExact(args)); + if (kw == NULL && PyTuple_GET_SIZE(args) == 1) { + obj = PyTuple_GET_ITEM(args, 0); + } + else { + if (!_PyArg_NoKeywords("itemgetter", kw)) + return NULL; + if (!PyArg_UnpackTuple(args, "itemgetter", 1, 1, &obj)) + return NULL; + } + if (nitems == 1) { + if (ig->index >= 0 + && PyTuple_CheckExact(obj) + && ig->index < PyTuple_GET_SIZE(obj)) + { + result = PyTuple_GET_ITEM(obj, ig->index); + Py_INCREF(result); + return result; + } return PyObject_GetItem(obj, ig->item); + } assert(PyTuple_Check(ig->item)); assert(PyTuple_GET_SIZE(ig->item) == nitems); From webhook-mailer at python.org Mon Jan 7 17:56:11 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Mon, 07 Jan 2019 22:56:11 -0000 Subject: [Python-checkins] bpo-32710: test_asyncio: test_sendfile reset policy (GH-11461) Message-ID: https://github.com/python/cpython/commit/df8e1fb4e388e18430a9be8c6ceeb03c330f166c commit: df8e1fb4e388e18430a9be8c6ceeb03c330f166c branch: master author: Victor Stinner committer: GitHub date: 2019-01-07T23:55:57+01:00 summary: bpo-32710: test_asyncio: test_sendfile reset policy (GH-11461) test_asyncio/test_sendfile.py now resets the event loop policy using tearDownModule() as done in other tests, to prevent a warning when running tests on Windows. 
files: A Misc/NEWS.d/next/Tests/2019-01-07-23-34-41.bpo-32710.Hzo1b8.rst M Lib/test/test_asyncio/test_sendfile.py diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index 26e44a3348a5..f148fe27e6ad 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -18,6 +18,10 @@ ssl = None +def tearDownModule(): + asyncio.set_event_loop_policy(None) + + class MySendfileProto(asyncio.Protocol): def __init__(self, loop=None, close_after=0): diff --git a/Misc/NEWS.d/next/Tests/2019-01-07-23-34-41.bpo-32710.Hzo1b8.rst b/Misc/NEWS.d/next/Tests/2019-01-07-23-34-41.bpo-32710.Hzo1b8.rst new file mode 100644 index 000000000000..dd602a94c349 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2019-01-07-23-34-41.bpo-32710.Hzo1b8.rst @@ -0,0 +1,3 @@ +``test_asyncio/test_sendfile.py`` now resets the event loop policy using +:func:`tearDownModule` as done in other tests, to prevent a warning when +running tests on Windows. From webhook-mailer at python.org Mon Jan 7 19:27:35 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 08 Jan 2019 00:27:35 -0000 Subject: [Python-checkins] bpo-33717: pythoninfo logs information of all clocks (GH-11460) Message-ID: https://github.com/python/cpython/commit/ddd7c422fd89a053700f9ed5272cf732ccb09088 commit: ddd7c422fd89a053700f9ed5272cf732ccb09088 branch: master author: Victor Stinner committer: GitHub date: 2019-01-08T01:27:27+01:00 summary: bpo-33717: pythoninfo logs information of all clocks (GH-11460) test.pythoninfo now logs information of all clocks, not only time.time() and time.perf_counter(). files: A Misc/NEWS.d/next/Tests/2019-01-07-23-22-44.bpo-33717.GhHXv8.rst M Lib/test/pythoninfo.py diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 26bcf5f12d66..7ce6bf7b1ab9 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -6,6 +6,7 @@ import re import sys import traceback +import warnings def normalize_text(text): @@ -380,9 +381,17 @@ def collect_time(info_add): copy_attributes(info_add, time, 'time.%s', attributes) if hasattr(time, 'get_clock_info'): - for clock in ('time', 'perf_counter'): - tinfo = time.get_clock_info(clock) - info_add('time.get_clock_info(%s)' % clock, tinfo) + for clock in ('clock', 'monotonic', 'perf_counter', + 'process_time', 'thread_time', 'time'): + try: + # prevent DeprecatingWarning on get_clock_info('clock') + with warnings.catch_warnings(record=True): + clock_info = time.get_clock_info(clock) + except ValueError: + # missing clock like time.thread_time() + pass + else: + info_add('time.get_clock_info(%s)' % clock, clock_info) def collect_datetime(info_add): diff --git a/Misc/NEWS.d/next/Tests/2019-01-07-23-22-44.bpo-33717.GhHXv8.rst b/Misc/NEWS.d/next/Tests/2019-01-07-23-22-44.bpo-33717.GhHXv8.rst new file mode 100644 index 000000000000..05338a329ef8 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2019-01-07-23-22-44.bpo-33717.GhHXv8.rst @@ -0,0 +1,2 @@ +test.pythoninfo now logs information of all clocks, not only time.time() and +time.perf_counter(). 
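(Editorial note, not part of the change: roughly what the extra collection looks
like from plain Python. Some clocks, e.g. thread_time, are missing on some
platforms, which is why the patch swallows ValueError.)

    import time

    for name in ('monotonic', 'perf_counter', 'process_time', 'thread_time', 'time'):
        try:
            print(name, time.get_clock_info(name))
        except ValueError:
            pass  # clock not available on this platform
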
From webhook-mailer at python.org Mon Jan 7 20:27:22 2019 From: webhook-mailer at python.org (Steve Dower) Date: Tue, 08 Jan 2019 01:27:22 -0000 Subject: [Python-checkins] bpo-35642: Remove asynciomodule.c from pythoncore.vcxproj (GH-11410) Message-ID: https://github.com/python/cpython/commit/fbf50683b3a2301097d5cd48bc68b530c1e1720f commit: fbf50683b3a2301097d5cd48bc68b530c1e1720f branch: master author: Gregory Szorc committer: Steve Dower date: 2019-01-07T17:27:18-08:00 summary: bpo-35642: Remove asynciomodule.c from pythoncore.vcxproj (GH-11410) This module is built by _asyncio.vcxproj and does not need to be included in pythoncore. files: A Misc/NEWS.d/next/Build/2019-01-02-11-23-33.bpo-35642.pjkhJe.rst M PCbuild/pythoncore.vcxproj M PCbuild/pythoncore.vcxproj.filters diff --git a/Misc/NEWS.d/next/Build/2019-01-02-11-23-33.bpo-35642.pjkhJe.rst b/Misc/NEWS.d/next/Build/2019-01-02-11-23-33.bpo-35642.pjkhJe.rst new file mode 100644 index 000000000000..9f6da315e2d5 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2019-01-02-11-23-33.bpo-35642.pjkhJe.rst @@ -0,0 +1 @@ +Remove asynciomodule.c from pythoncore.vcxproj diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj index ddf7f49d7a8e..f33cdb503038 100644 --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -247,7 +247,6 @@ - diff --git a/PCbuild/pythoncore.vcxproj.filters b/PCbuild/pythoncore.vcxproj.filters index 77b018ffb4e9..9dbd0669f76d 100644 --- a/PCbuild/pythoncore.vcxproj.filters +++ b/PCbuild/pythoncore.vcxproj.filters @@ -1073,9 +1073,6 @@ PC - - Modules - Modules From webhook-mailer at python.org Mon Jan 7 20:47:12 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 08 Jan 2019 01:47:12 -0000 Subject: [Python-checkins] bpo-35682: Fix _ProactorBasePipeTransport._force_close() (GH-11462) Message-ID: https://github.com/python/cpython/commit/80fda712c83f5dd9560d42bf2aa65a72b18b7759 commit: 80fda712c83f5dd9560d42bf2aa65a72b18b7759 branch: master author: Victor Stinner committer: GitHub date: 2019-01-08T02:46:59+01:00 summary: bpo-35682: Fix _ProactorBasePipeTransport._force_close() (GH-11462) bpo-32622, bpo-35682: Fix asyncio.ProactorEventLoop.sendfile(): don't attempt to set the result of an internal future if it's already done. Fix asyncio _ProactorBasePipeTransport._force_close(): don't set the result of _empty_waiter if it's already done. files: A Misc/NEWS.d/next/Library/2019-01-08-01-54-02.bpo-35682.KDM9lk.rst M Lib/asyncio/proactor_events.py diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py index 9b9e0aa7c7f1..da204c69d45e 100644 --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -111,7 +111,7 @@ def _fatal_error(self, exc, message='Fatal error on pipe transport'): self._force_close(exc) def _force_close(self, exc): - if self._empty_waiter is not None: + if self._empty_waiter is not None and not self._empty_waiter.done(): if exc is None: self._empty_waiter.set_result(None) else: diff --git a/Misc/NEWS.d/next/Library/2019-01-08-01-54-02.bpo-35682.KDM9lk.rst b/Misc/NEWS.d/next/Library/2019-01-08-01-54-02.bpo-35682.KDM9lk.rst new file mode 100644 index 000000000000..8152bd707ba5 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-08-01-54-02.bpo-35682.KDM9lk.rst @@ -0,0 +1,2 @@ +Fix ``asyncio.ProactorEventLoop.sendfile()``: don't attempt to set the result +of an internal future if it's already done. 
From webhook-mailer at python.org Mon Jan 7 21:02:29 2019 From: webhook-mailer at python.org (Steve Dower) Date: Tue, 08 Jan 2019 02:02:29 -0000 Subject: [Python-checkins] Remove spurious quote in Azure Pipelines script (GH-10763) Message-ID: https://github.com/python/cpython/commit/202d1bde8f0d8854c7cba5ed2d93b469972b57cf commit: 202d1bde8f0d8854c7cba5ed2d93b469972b57cf branch: master author: Pierre Glaser committer: Steve Dower date: 2019-01-07T18:02:26-08:00 summary: Remove spurious quote in Azure Pipelines script (GH-10763) files: M .azure-pipelines/posix-steps.yml diff --git a/.azure-pipelines/posix-steps.yml b/.azure-pipelines/posix-steps.yml index 9fec9be8014f..6e2606fff7bc 100644 --- a/.azure-pipelines/posix-steps.yml +++ b/.azure-pipelines/posix-steps.yml @@ -26,7 +26,7 @@ steps: xvfb-run ./venv/bin/python -m coverage run --pylib -m test \ --fail-env-changed \ -uall,-cpu \ - --junit-xml=$(build.binariesDirectory)/test-results.xml" \ + --junit-xml=$(build.binariesDirectory)/test-results.xml \ -x test_multiprocessing_fork \ -x test_multiprocessing_forkserver \ -x test_multiprocessing_spawn \ From webhook-mailer at python.org Mon Jan 7 21:15:29 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 08 Jan 2019 02:15:29 -0000 Subject: [Python-checkins] bpo-35682: Fix _ProactorBasePipeTransport._force_close() (GH-11462) Message-ID: https://github.com/python/cpython/commit/88bd26a72eb4ab341cf19bea78a0039fbe4be3a2 commit: 88bd26a72eb4ab341cf19bea78a0039fbe4be3a2 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-07T18:15:25-08:00 summary: bpo-35682: Fix _ProactorBasePipeTransport._force_close() (GH-11462) bpo-32622, bpo-35682: Fix asyncio.ProactorEventLoop.sendfile(): don't attempt to set the result of an internal future if it's already done. Fix asyncio _ProactorBasePipeTransport._force_close(): don't set the result of _empty_waiter if it's already done. (cherry picked from commit 80fda712c83f5dd9560d42bf2aa65a72b18b7759) Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Library/2019-01-08-01-54-02.bpo-35682.KDM9lk.rst M Lib/asyncio/proactor_events.py diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py index e350e8bc0c24..782c86106bce 100644 --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -110,7 +110,7 @@ def _fatal_error(self, exc, message='Fatal error on pipe transport'): self._force_close(exc) def _force_close(self, exc): - if self._empty_waiter is not None: + if self._empty_waiter is not None and not self._empty_waiter.done(): if exc is None: self._empty_waiter.set_result(None) else: diff --git a/Misc/NEWS.d/next/Library/2019-01-08-01-54-02.bpo-35682.KDM9lk.rst b/Misc/NEWS.d/next/Library/2019-01-08-01-54-02.bpo-35682.KDM9lk.rst new file mode 100644 index 000000000000..8152bd707ba5 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-08-01-54-02.bpo-35682.KDM9lk.rst @@ -0,0 +1,2 @@ +Fix ``asyncio.ProactorEventLoop.sendfile()``: don't attempt to set the result +of an internal future if it's already done. From webhook-mailer at python.org Mon Jan 7 21:57:35 2019 From: webhook-mailer at python.org (Steve Dower) Date: Tue, 08 Jan 2019 02:57:35 -0000 Subject: [Python-checkins] bpo-35374: Avoid trailing space in hhc file name if found on PATH. 
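(Editorial note, not from the patch: calling set_result() on a future that is
already done raises asyncio.InvalidStateError, which is what the added done()
check guards against. Minimal illustration below; the demo() coroutine is only
for this sketch.)

    import asyncio

    async def demo():
        fut = asyncio.get_running_loop().create_future()
        fut.set_result(None)
        if not fut.done():        # same guard as in the fix
            fut.set_result(None)  # without the guard this would raise InvalidStateError

    asyncio.run(demo())
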
(GH-10849) Message-ID: https://github.com/python/cpython/commit/e61cc481e02b758c8d8289163102c236d0658a55 commit: e61cc481e02b758c8d8289163102c236d0658a55 branch: master author: chrullrich committer: Steve Dower date: 2019-01-07T18:57:29-08:00 summary: bpo-35374: Avoid trailing space in hhc file name if found on PATH. (GH-10849) files: M Doc/make.bat diff --git a/Doc/make.bat b/Doc/make.bat index 461c35c5a114..e6604956ea91 100644 --- a/Doc/make.bat +++ b/Doc/make.bat @@ -41,7 +41,7 @@ if exist "%HTMLHELP%" goto :skiphhcsearch rem Search for HHC in likely places set HTMLHELP= -where hhc /q && set HTMLHELP=hhc && goto :skiphhcsearch +where hhc /q && set "HTMLHELP=hhc" && goto :skiphhcsearch where /R ..\externals hhc > "%TEMP%\hhc.loc" 2> nul && set /P HTMLHELP= < "%TEMP%\hhc.loc" & del "%TEMP%\hhc.loc" if not exist "%HTMLHELP%" where /R "%ProgramFiles(x86)%" hhc > "%TEMP%\hhc.loc" 2> nul && set /P HTMLHELP= < "%TEMP%\hhc.loc" & del "%TEMP%\hhc.loc" if not exist "%HTMLHELP%" where /R "%ProgramFiles%" hhc > "%TEMP%\hhc.loc" 2> nul && set /P HTMLHELP= < "%TEMP%\hhc.loc" & del "%TEMP%\hhc.loc" From webhook-mailer at python.org Mon Jan 7 22:04:18 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 08 Jan 2019 03:04:18 -0000 Subject: [Python-checkins] bpo-35374: Avoid trailing space in hhc file name if found on PATH. (GH-10849) Message-ID: https://github.com/python/cpython/commit/5d1e0124cf562153a681d1b5b362e7c8e23edea9 commit: 5d1e0124cf562153a681d1b5b362e7c8e23edea9 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-07T19:04:14-08:00 summary: bpo-35374: Avoid trailing space in hhc file name if found on PATH. (GH-10849) (cherry picked from commit e61cc481e02b758c8d8289163102c236d0658a55) Co-authored-by: chrullrich files: M Doc/make.bat diff --git a/Doc/make.bat b/Doc/make.bat index 461c35c5a114..e6604956ea91 100644 --- a/Doc/make.bat +++ b/Doc/make.bat @@ -41,7 +41,7 @@ if exist "%HTMLHELP%" goto :skiphhcsearch rem Search for HHC in likely places set HTMLHELP= -where hhc /q && set HTMLHELP=hhc && goto :skiphhcsearch +where hhc /q && set "HTMLHELP=hhc" && goto :skiphhcsearch where /R ..\externals hhc > "%TEMP%\hhc.loc" 2> nul && set /P HTMLHELP= < "%TEMP%\hhc.loc" & del "%TEMP%\hhc.loc" if not exist "%HTMLHELP%" where /R "%ProgramFiles(x86)%" hhc > "%TEMP%\hhc.loc" 2> nul && set /P HTMLHELP= < "%TEMP%\hhc.loc" & del "%TEMP%\hhc.loc" if not exist "%HTMLHELP%" where /R "%ProgramFiles%" hhc > "%TEMP%\hhc.loc" 2> nul && set /P HTMLHELP= < "%TEMP%\hhc.loc" & del "%TEMP%\hhc.loc" From solipsis at pitrou.net Tue Jan 8 04:11:59 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 08 Jan 2019 09:11:59 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=3 Message-ID: <20190108091159.1.9321A7CAF157C808@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [0, -2, 1] memory blocks, sum=-1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/refloghPKutg', '--timeout', '7200'] From webhook-mailer at python.org Tue Jan 8 04:58:30 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 08 Jan 2019 09:58:30 -0000 Subject: [Python-checkins] bpo-35568: add 'raise_signal' function (GH-11335) Message-ID: https://github.com/python/cpython/commit/c24c6c2c9357da99961bf257078240529181daf3 
commit: c24c6c2c9357da99961bf257078240529181daf3 branch: master author: Vladimir Matveev committer: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> date: 2019-01-08T01:58:25-08:00 summary: bpo-35568: add 'raise_signal' function (GH-11335) As in title, expose C `raise` function as `raise_function` in `signal` module. Also drop existing `raise_signal` in `_testcapi` module and replace all usages with new function. https://bugs.python.org/issue35568 files: A Misc/NEWS.d/next/Library/2018-12-27-19-23-00.bpo-35568.PutiOC.rst M Doc/library/signal.rst M Lib/test/test_asyncio/test_windows_events.py M Lib/test/test_faulthandler.py M Lib/test/test_posix.py M Lib/test/test_regrtest.py M Lib/test/test_signal.py M Modules/_testcapimodule.c M Modules/clinic/signalmodule.c.h M Modules/signalmodule.c diff --git a/Doc/library/signal.rst b/Doc/library/signal.rst index 5c48c88f08df..ac6cad9aff8e 100644 --- a/Doc/library/signal.rst +++ b/Doc/library/signal.rst @@ -237,6 +237,13 @@ The :mod:`signal` module defines the following functions: :func:`sigpending`. +.. function:: raise_signal(signum) + + Sends a signal to the calling process. Returns nothing. + + .. versionadded:: 3.8 + + .. function:: pthread_kill(thread_id, signalnum) Send the signal *signalnum* to the thread *thread_id*, another thread in the diff --git a/Lib/test/test_asyncio/test_windows_events.py b/Lib/test/test_asyncio/test_windows_events.py index a200a8a80ad9..05f85159be0c 100644 --- a/Lib/test/test_asyncio/test_windows_events.py +++ b/Lib/test/test_asyncio/test_windows_events.py @@ -45,7 +45,7 @@ def test_ctrl_c(self): def SIGINT_after_delay(): time.sleep(1) - _testcapi.raise_signal(signal.SIGINT) + signal.raise_signal(signal.SIGINT) asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy()) l = asyncio.get_event_loop() diff --git a/Lib/test/test_faulthandler.py b/Lib/test/test_faulthandler.py index 6aee22be2e03..f0be91844ffa 100644 --- a/Lib/test/test_faulthandler.py +++ b/Lib/test/test_faulthandler.py @@ -198,14 +198,13 @@ def test_sigfpe(self): @skip_segfault_on_android def test_sigbus(self): self.check_fatal_error(""" - import _testcapi import faulthandler import signal faulthandler.enable() - _testcapi.raise_signal(signal.SIGBUS) + signal.raise_signal(signal.SIGBUS) """, - 6, + 5, 'Bus error') @unittest.skipIf(_testcapi is None, 'need _testcapi') @@ -213,14 +212,13 @@ def test_sigbus(self): @skip_segfault_on_android def test_sigill(self): self.check_fatal_error(""" - import _testcapi import faulthandler import signal faulthandler.enable() - _testcapi.raise_signal(signal.SIGILL) + signal.raise_signal(signal.SIGILL) """, - 6, + 5, 'Illegal instruction') def test_fatal_error(self): diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index 86c04b9f324a..d7e512c99f03 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -1596,8 +1596,8 @@ def test_setpgroup_wrong_type(self): 'need signal.pthread_sigmask()') def test_setsigmask(self): code = textwrap.dedent("""\ - import _testcapi, signal - _testcapi.raise_signal(signal.SIGUSR1)""") + import signal + signal.raise_signal(signal.SIGUSR1)""") pid = posix.posix_spawn( sys.executable, @@ -1627,8 +1627,8 @@ def test_setsigmask_wrong_type(self): def test_setsigdef(self): original_handler = signal.signal(signal.SIGUSR1, signal.SIG_IGN) code = textwrap.dedent("""\ - import _testcapi, signal - _testcapi.raise_signal(signal.SIGUSR1)""") + import signal + signal.raise_signal(signal.SIGUSR1)""") try: pid = posix.posix_spawn( sys.executable, 
diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index a67458313add..61937767ec12 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -26,9 +26,8 @@ ROOT_DIR = os.path.abspath(os.path.normpath(ROOT_DIR)) TEST_INTERRUPTED = textwrap.dedent(""" - from signal import SIGINT + from signal import SIGINT, raise_signal try: - from _testcapi import raise_signal raise_signal(SIGINT) except ImportError: import os diff --git a/Lib/test/test_signal.py b/Lib/test/test_signal.py index b10faa010b2b..2a6217ef6432 100644 --- a/Lib/test/test_signal.py +++ b/Lib/test/test_signal.py @@ -1,3 +1,4 @@ +import errno import os import random import signal @@ -254,7 +255,7 @@ def handler(signum, frame): signal.set_wakeup_fd(r) try: with captured_stderr() as err: - _testcapi.raise_signal(signal.SIGALRM) + signal.raise_signal(signal.SIGALRM) except ZeroDivisionError: # An ignored exception should have been printed out on stderr err = err.getvalue() @@ -348,10 +349,9 @@ def handler(signum, frame): def test_signum(self): self.check_wakeup("""def test(): - import _testcapi signal.signal(signal.SIGUSR1, handler) - _testcapi.raise_signal(signal.SIGUSR1) - _testcapi.raise_signal(signal.SIGALRM) + signal.raise_signal(signal.SIGUSR1) + signal.raise_signal(signal.SIGALRM) """, signal.SIGUSR1, signal.SIGALRM) @unittest.skipUnless(hasattr(signal, 'pthread_sigmask'), @@ -365,8 +365,8 @@ def test_pending(self): signal.signal(signum2, handler) signal.pthread_sigmask(signal.SIG_BLOCK, (signum1, signum2)) - _testcapi.raise_signal(signum1) - _testcapi.raise_signal(signum2) + signal.raise_signal(signum1) + signal.raise_signal(signum2) # Unblocking the 2 signals calls the C signal handler twice signal.pthread_sigmask(signal.SIG_UNBLOCK, (signum1, signum2)) """, signal.SIGUSR1, signal.SIGUSR2, ordered=False) @@ -396,7 +396,7 @@ def handler(signum, frame): write.setblocking(False) signal.set_wakeup_fd(write.fileno()) - _testcapi.raise_signal(signum) + signal.raise_signal(signum) data = read.recv(1) if not data: @@ -445,7 +445,7 @@ def handler(signum, frame): write.close() with captured_stderr() as err: - _testcapi.raise_signal(signum) + signal.raise_signal(signum) err = err.getvalue() if ('Exception ignored when trying to {action} to the signal wakeup fd' @@ -519,7 +519,7 @@ def handler(signum, frame): signal.set_wakeup_fd(write.fileno()) with captured_stderr() as err: - _testcapi.raise_signal(signum) + signal.raise_signal(signum) err = err.getvalue() if msg not in err: @@ -530,7 +530,7 @@ def handler(signum, frame): signal.set_wakeup_fd(write.fileno(), warn_on_full_buffer=True) with captured_stderr() as err: - _testcapi.raise_signal(signum) + signal.raise_signal(signum) err = err.getvalue() if msg not in err: @@ -541,7 +541,7 @@ def handler(signum, frame): signal.set_wakeup_fd(write.fileno(), warn_on_full_buffer=False) with captured_stderr() as err: - _testcapi.raise_signal(signum) + signal.raise_signal(signum) err = err.getvalue() if err != "": @@ -553,7 +553,7 @@ def handler(signum, frame): signal.set_wakeup_fd(write.fileno()) with captured_stderr() as err: - _testcapi.raise_signal(signum) + signal.raise_signal(signum) err = err.getvalue() if msg not in err: @@ -1214,6 +1214,38 @@ def handler(signum, frame): # Python handler self.assertEqual(len(sigs), N, "Some signals were lost") +class RaiseSignalTest(unittest.TestCase): + + def test_sigint(self): + try: + signal.raise_signal(signal.SIGINT) + self.fail("Expected KeyInterrupt") + except KeyboardInterrupt: + pass + + @unittest.skipIf(sys.platform 
!= "win32", "Windows specific test") + def test_invalid_argument(self): + try: + SIGHUP = 1 # not supported on win32 + signal.raise_signal(SIGHUP) + self.fail("OSError (Invalid argument) expected") + except OSError as e: + if e.errno == errno.EINVAL: + pass + else: + raise + + def test_handler(self): + is_ok = False + def handler(a, b): + nonlocal is_ok + is_ok = True + old_signal = signal.signal(signal.SIGINT, handler) + self.addCleanup(signal.signal, signal.SIGINT, old_signal) + + signal.raise_signal(signal.SIGINT) + self.assertTrue(is_ok) + def tearDownModule(): support.reap_children() diff --git a/Misc/NEWS.d/next/Library/2018-12-27-19-23-00.bpo-35568.PutiOC.rst b/Misc/NEWS.d/next/Library/2018-12-27-19-23-00.bpo-35568.PutiOC.rst new file mode 100644 index 000000000000..d70806404f87 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-12-27-19-23-00.bpo-35568.PutiOC.rst @@ -0,0 +1 @@ +Expose ``raise(signum)`` as `raise_signal` diff --git a/Modules/_testcapimodule.c b/Modules/_testcapimodule.c index 4933ef3b61c4..85810f30b1c5 100644 --- a/Modules/_testcapimodule.c +++ b/Modules/_testcapimodule.c @@ -3859,25 +3859,6 @@ call_in_temporary_c_thread(PyObject *self, PyObject *callback) return res; } -static PyObject* -test_raise_signal(PyObject* self, PyObject *args) -{ - int signum, err; - - if (!PyArg_ParseTuple(args, "i:raise_signal", &signum)) { - return NULL; - } - - err = raise(signum); - if (err) - return PyErr_SetFromErrno(PyExc_OSError); - - if (PyErr_CheckSignals() < 0) - return NULL; - - Py_RETURN_NONE; -} - /* marshal */ static PyObject* @@ -4908,8 +4889,6 @@ static PyMethodDef TestMethods[] = { {"docstring_with_signature_with_defaults", (PyCFunction)test_with_docstring, METH_NOARGS, docstring_with_signature_with_defaults}, - {"raise_signal", - (PyCFunction)test_raise_signal, METH_VARARGS}, {"call_in_temporary_c_thread", call_in_temporary_c_thread, METH_O, PyDoc_STR("set_error_class(error_class) -> None")}, {"pymarshal_write_long_to_file", diff --git a/Modules/clinic/signalmodule.c.h b/Modules/clinic/signalmodule.c.h index 6745f45ef751..f3742262d867 100644 --- a/Modules/clinic/signalmodule.c.h +++ b/Modules/clinic/signalmodule.c.h @@ -66,6 +66,39 @@ signal_pause(PyObject *module, PyObject *Py_UNUSED(ignored)) #endif /* defined(HAVE_PAUSE) */ +PyDoc_STRVAR(signal_raise_signal__doc__, +"raise_signal($module, signalnum, /)\n" +"--\n" +"\n" +"Send a signal to the executing process."); + +#define SIGNAL_RAISE_SIGNAL_METHODDEF \ + {"raise_signal", (PyCFunction)signal_raise_signal, METH_O, signal_raise_signal__doc__}, + +static PyObject * +signal_raise_signal_impl(PyObject *module, int signalnum); + +static PyObject * +signal_raise_signal(PyObject *module, PyObject *arg) +{ + PyObject *return_value = NULL; + int signalnum; + + if (PyFloat_Check(arg)) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + signalnum = _PyLong_AsInt(arg); + if (signalnum == -1 && PyErr_Occurred()) { + goto exit; + } + return_value = signal_raise_signal_impl(module, signalnum); + +exit: + return return_value; +} + PyDoc_STRVAR(signal_signal__doc__, "signal($module, signalnum, handler, /)\n" "--\n" @@ -558,4 +591,4 @@ signal_pthread_kill(PyObject *module, PyObject *const *args, Py_ssize_t nargs) #ifndef SIGNAL_PTHREAD_KILL_METHODDEF #define SIGNAL_PTHREAD_KILL_METHODDEF #endif /* !defined(SIGNAL_PTHREAD_KILL_METHODDEF) */ -/*[clinic end generated code: output=4ed8c36860f9f577 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=365db4e807c26d4e 
input=a9049054013a1b77]*/ diff --git a/Modules/signalmodule.c b/Modules/signalmodule.c index 52ab4e998a97..4f8f71a0a1df 100644 --- a/Modules/signalmodule.c +++ b/Modules/signalmodule.c @@ -390,6 +390,31 @@ signal_pause_impl(PyObject *module) #endif +/*[clinic input] +signal.raise_signal + + signalnum: int + / + +Send a signal to the executing process. +[clinic start generated code]*/ + +static PyObject * +signal_raise_signal_impl(PyObject *module, int signalnum) +/*[clinic end generated code: output=e2b014220aa6111d input=e90c0f9a42358de6]*/ +{ + int err; + Py_BEGIN_ALLOW_THREADS + _Py_BEGIN_SUPPRESS_IPH + err = raise(signalnum); + _Py_END_SUPPRESS_IPH + Py_END_ALLOW_THREADS + + if (err) { + return PyErr_SetFromErrno(PyExc_OSError); + } + Py_RETURN_NONE; +} /*[clinic input] signal.signal @@ -1208,6 +1233,7 @@ static PyMethodDef signal_methods[] = { SIGNAL_SETITIMER_METHODDEF SIGNAL_GETITIMER_METHODDEF SIGNAL_SIGNAL_METHODDEF + SIGNAL_RAISE_SIGNAL_METHODDEF SIGNAL_STRSIGNAL_METHODDEF SIGNAL_GETSIGNAL_METHODDEF {"set_wakeup_fd", (PyCFunction)(void(*)(void))signal_set_wakeup_fd, METH_VARARGS | METH_KEYWORDS, set_wakeup_fd_doc}, From webhook-mailer at python.org Tue Jan 8 05:38:06 2019 From: webhook-mailer at python.org (Steve Dower) Date: Tue, 08 Jan 2019 10:38:06 -0000 Subject: [Python-checkins] bpo-35596: Use unchecked PYCs for the embeddable distro to avoid zipimport restrictions (GH-11465) Message-ID: https://github.com/python/cpython/commit/872bd2b57ce8e4ea7a54acb3934222c0e4e7276b commit: 872bd2b57ce8e4ea7a54acb3934222c0e4e7276b branch: master author: Steve Dower committer: GitHub date: 2019-01-08T02:38:01-08:00 summary: bpo-35596: Use unchecked PYCs for the embeddable distro to avoid zipimport restrictions (GH-11465) Also adds extra steps to the CI build for Windows on Azure Pipelines to validate that the various layouts at least execute. 
files: A .azure-pipelines/windows-layout-steps.yml A Misc/NEWS.d/next/Windows/2019-01-08-13-56-01.bpo-35596.oFvhcm.rst M .azure-pipelines/ci.yml M PC/layout/main.py diff --git a/.azure-pipelines/ci.yml b/.azure-pipelines/ci.yml index 49a7bb6232aa..78075bcfc147 100644 --- a/.azure-pipelines/ci.yml +++ b/.azure-pipelines/ci.yml @@ -134,3 +134,13 @@ jobs: steps: - template: ./windows-steps.yml + + - template: ./windows-layout-steps.yml + parameters: + kind: nuget + - template: ./windows-layout-steps.yml + parameters: + kind: embed + - template: ./windows-layout-steps.yml + parameters: + kind: appx diff --git a/.azure-pipelines/windows-layout-steps.yml b/.azure-pipelines/windows-layout-steps.yml new file mode 100644 index 000000000000..62e5259375f5 --- /dev/null +++ b/.azure-pipelines/windows-layout-steps.yml @@ -0,0 +1,11 @@ +parameters: + kind: nuget + extraOpts: --precompile + +steps: +- script: .\python.bat PC\layout -vv -s "$(Build.SourcesDirectory)" -b "$(Py_OutDir)\$(arch)" -t "$(Py_IntDir)\layout-tmp-${{ parameters['kind'] }}-$(arch)" --copy "$(Py_OutDir)\layout-${{ parameters['kind'] }}-$(arch)" ${{ parameters['extraOpts'] }} --preset-${{ parameters['kind'] }} --include-tests + displayName: Create ${{ parameters['kind'] }} layout + +- script: .\python.exe -m test.pythoninfo + workingDirectory: $(Py_OutDir)\layout-${{ parameters['kind'] }}-$(arch) + displayName: Show layout info (${{ parameters['kind'] }}) diff --git a/Misc/NEWS.d/next/Windows/2019-01-08-13-56-01.bpo-35596.oFvhcm.rst b/Misc/NEWS.d/next/Windows/2019-01-08-13-56-01.bpo-35596.oFvhcm.rst new file mode 100644 index 000000000000..db4d8fa420d6 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2019-01-08-13-56-01.bpo-35596.oFvhcm.rst @@ -0,0 +1,2 @@ +Use unchecked PYCs for the embeddable distro to avoid zipimport +restrictions. 
diff --git a/PC/layout/main.py b/PC/layout/main.py index 7eaf201d532e..d372fe50df32 100644 --- a/PC/layout/main.py +++ b/PC/layout/main.py @@ -240,12 +240,18 @@ def _c(d): yield "DLLs/{}".format(ns.include_cat.name), ns.include_cat -def _compile_one_py(src, dest, name, optimize): +def _compile_one_py(src, dest, name, optimize, checked=True): import py_compile if dest is not None: dest = str(dest) + mode = ( + py_compile.PycInvalidationMode.CHECKED_HASH + if checked + else py_compile.PycInvalidationMode.UNCHECKED_HASH + ) + try: return Path( py_compile.compile( @@ -254,7 +260,7 @@ def _compile_one_py(src, dest, name, optimize): str(name), doraise=True, optimize=optimize, - invalidation_mode=py_compile.PycInvalidationMode.CHECKED_HASH, + invalidation_mode=mode, ) ) except py_compile.PyCompileError: @@ -262,16 +268,16 @@ def _compile_one_py(src, dest, name, optimize): return None -def _py_temp_compile(src, ns, dest_dir=None): +def _py_temp_compile(src, ns, dest_dir=None, checked=True): if not ns.precompile or src not in PY_FILES or src.parent in DATA_DIRS: return None dest = (dest_dir or ns.temp) / (src.stem + ".py") - return _compile_one_py(src, dest.with_suffix(".pyc"), dest, optimize=2) + return _compile_one_py(src, dest.with_suffix(".pyc"), dest, optimize=2, checked=checked) -def _write_to_zip(zf, dest, src, ns): - pyc = _py_temp_compile(src, ns) +def _write_to_zip(zf, dest, src, ns, checked=True): + pyc = _py_temp_compile(src, ns, checked=checked) if pyc: try: zf.write(str(pyc), dest.with_suffix(".pyc")) @@ -321,7 +327,7 @@ def generate_source_files(ns): ns.temp.mkdir(parents=True, exist_ok=True) with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf: for dest, src in get_lib_layout(ns): - _write_to_zip(zf, dest, src, ns) + _write_to_zip(zf, dest, src, ns, checked=False) if ns.include_underpth: log_info("Generating {} in {}", PYTHON_PTH_NAME, ns.temp) From webhook-mailer at python.org Tue Jan 8 05:56:27 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 08 Jan 2019 10:56:27 -0000 Subject: [Python-checkins] bpo-35596: Use unchecked PYCs for the embeddable distro to avoid zipimport restrictions (GH-11465) Message-ID: https://github.com/python/cpython/commit/69f64b67e43c65c2178c865fd1be80ed07f02d3c commit: 69f64b67e43c65c2178c865fd1be80ed07f02d3c branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-08T02:56:14-08:00 summary: bpo-35596: Use unchecked PYCs for the embeddable distro to avoid zipimport restrictions (GH-11465) Also adds extra steps to the CI build for Windows on Azure Pipelines to validate that the various layouts at least execute. 
(cherry picked from commit 872bd2b57ce8e4ea7a54acb3934222c0e4e7276b) Co-authored-by: Steve Dower files: A .azure-pipelines/windows-layout-steps.yml A Misc/NEWS.d/next/Windows/2019-01-08-13-56-01.bpo-35596.oFvhcm.rst M .azure-pipelines/ci.yml M PC/layout/main.py diff --git a/.azure-pipelines/ci.yml b/.azure-pipelines/ci.yml index 49a7bb6232aa..78075bcfc147 100644 --- a/.azure-pipelines/ci.yml +++ b/.azure-pipelines/ci.yml @@ -134,3 +134,13 @@ jobs: steps: - template: ./windows-steps.yml + + - template: ./windows-layout-steps.yml + parameters: + kind: nuget + - template: ./windows-layout-steps.yml + parameters: + kind: embed + - template: ./windows-layout-steps.yml + parameters: + kind: appx diff --git a/.azure-pipelines/windows-layout-steps.yml b/.azure-pipelines/windows-layout-steps.yml new file mode 100644 index 000000000000..62e5259375f5 --- /dev/null +++ b/.azure-pipelines/windows-layout-steps.yml @@ -0,0 +1,11 @@ +parameters: + kind: nuget + extraOpts: --precompile + +steps: +- script: .\python.bat PC\layout -vv -s "$(Build.SourcesDirectory)" -b "$(Py_OutDir)\$(arch)" -t "$(Py_IntDir)\layout-tmp-${{ parameters['kind'] }}-$(arch)" --copy "$(Py_OutDir)\layout-${{ parameters['kind'] }}-$(arch)" ${{ parameters['extraOpts'] }} --preset-${{ parameters['kind'] }} --include-tests + displayName: Create ${{ parameters['kind'] }} layout + +- script: .\python.exe -m test.pythoninfo + workingDirectory: $(Py_OutDir)\layout-${{ parameters['kind'] }}-$(arch) + displayName: Show layout info (${{ parameters['kind'] }}) diff --git a/Misc/NEWS.d/next/Windows/2019-01-08-13-56-01.bpo-35596.oFvhcm.rst b/Misc/NEWS.d/next/Windows/2019-01-08-13-56-01.bpo-35596.oFvhcm.rst new file mode 100644 index 000000000000..db4d8fa420d6 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2019-01-08-13-56-01.bpo-35596.oFvhcm.rst @@ -0,0 +1,2 @@ +Use unchecked PYCs for the embeddable distro to avoid zipimport +restrictions. 
diff --git a/PC/layout/main.py b/PC/layout/main.py index 7eaf201d532e..d372fe50df32 100644 --- a/PC/layout/main.py +++ b/PC/layout/main.py @@ -240,12 +240,18 @@ def _c(d): yield "DLLs/{}".format(ns.include_cat.name), ns.include_cat -def _compile_one_py(src, dest, name, optimize): +def _compile_one_py(src, dest, name, optimize, checked=True): import py_compile if dest is not None: dest = str(dest) + mode = ( + py_compile.PycInvalidationMode.CHECKED_HASH + if checked + else py_compile.PycInvalidationMode.UNCHECKED_HASH + ) + try: return Path( py_compile.compile( @@ -254,7 +260,7 @@ def _compile_one_py(src, dest, name, optimize): str(name), doraise=True, optimize=optimize, - invalidation_mode=py_compile.PycInvalidationMode.CHECKED_HASH, + invalidation_mode=mode, ) ) except py_compile.PyCompileError: @@ -262,16 +268,16 @@ def _compile_one_py(src, dest, name, optimize): return None -def _py_temp_compile(src, ns, dest_dir=None): +def _py_temp_compile(src, ns, dest_dir=None, checked=True): if not ns.precompile or src not in PY_FILES or src.parent in DATA_DIRS: return None dest = (dest_dir or ns.temp) / (src.stem + ".py") - return _compile_one_py(src, dest.with_suffix(".pyc"), dest, optimize=2) + return _compile_one_py(src, dest.with_suffix(".pyc"), dest, optimize=2, checked=checked) -def _write_to_zip(zf, dest, src, ns): - pyc = _py_temp_compile(src, ns) +def _write_to_zip(zf, dest, src, ns, checked=True): + pyc = _py_temp_compile(src, ns, checked=checked) if pyc: try: zf.write(str(pyc), dest.with_suffix(".pyc")) @@ -321,7 +327,7 @@ def generate_source_files(ns): ns.temp.mkdir(parents=True, exist_ok=True) with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf: for dest, src in get_lib_layout(ns): - _write_to_zip(zf, dest, src, ns) + _write_to_zip(zf, dest, src, ns, checked=False) if ns.include_underpth: log_info("Generating {} in {}", PYTHON_PTH_NAME, ns.temp) From webhook-mailer at python.org Tue Jan 8 08:23:20 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 08 Jan 2019 13:23:20 -0000 Subject: [Python-checkins] bpo-32710: Fix leak in Overlapped_WSASend() (GH-11469) Message-ID: https://github.com/python/cpython/commit/a234e148394c2c7419372ab65b773d53a57f3625 commit: a234e148394c2c7419372ab65b773d53a57f3625 branch: master author: Victor Stinner committer: GitHub date: 2019-01-08T14:23:09+01:00 summary: bpo-32710: Fix leak in Overlapped_WSASend() (GH-11469) Fix a memory leak in asyncio in the ProactorEventLoop when ReadFile() or WSASend() overlapped operation fail immediately: release the internal buffer. files: A Misc/NEWS.d/next/Library/2019-01-08-14-00-52.bpo-32710.Sn5Ujj.rst M Modules/overlapped.c diff --git a/Misc/NEWS.d/next/Library/2019-01-08-14-00-52.bpo-32710.Sn5Ujj.rst b/Misc/NEWS.d/next/Library/2019-01-08-14-00-52.bpo-32710.Sn5Ujj.rst new file mode 100644 index 000000000000..5c3961c33d96 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-08-14-00-52.bpo-32710.Sn5Ujj.rst @@ -0,0 +1,3 @@ +Fix a memory leak in asyncio in the ProactorEventLoop when ``ReadFile()`` or +``WSASend()`` overlapped operation fail immediately: release the internal +buffer. 
diff --git a/Modules/overlapped.c b/Modules/overlapped.c index 69875a7f37da..bbaa4fb3008f 100644 --- a/Modules/overlapped.c +++ b/Modules/overlapped.c @@ -723,6 +723,7 @@ do_ReadFile(OverlappedObject *self, HANDLE handle, case ERROR_IO_PENDING: Py_RETURN_NONE; default: + PyBuffer_Release(&self->user_buffer); self->type = TYPE_NOT_STARTED; return SetFromWindowsErr(err); } @@ -1011,6 +1012,7 @@ Overlapped_WSASend(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: + PyBuffer_Release(&self->user_buffer); self->type = TYPE_NOT_STARTED; return SetFromWindowsErr(err); } From webhook-mailer at python.org Tue Jan 8 08:40:53 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 08 Jan 2019 13:40:53 -0000 Subject: [Python-checkins] bpo-32710: Fix leak in Overlapped_WSASend() (GH-11469) Message-ID: https://github.com/python/cpython/commit/88ad48bc98980a40591cc5521703dbb0ad3a9b17 commit: 88ad48bc98980a40591cc5521703dbb0ad3a9b17 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-08T05:40:50-08:00 summary: bpo-32710: Fix leak in Overlapped_WSASend() (GH-11469) Fix a memory leak in asyncio in the ProactorEventLoop when ReadFile() or WSASend() overlapped operation fail immediately: release the internal buffer. (cherry picked from commit a234e148394c2c7419372ab65b773d53a57f3625) Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Library/2019-01-08-14-00-52.bpo-32710.Sn5Ujj.rst M Modules/overlapped.c diff --git a/Misc/NEWS.d/next/Library/2019-01-08-14-00-52.bpo-32710.Sn5Ujj.rst b/Misc/NEWS.d/next/Library/2019-01-08-14-00-52.bpo-32710.Sn5Ujj.rst new file mode 100644 index 000000000000..5c3961c33d96 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-08-14-00-52.bpo-32710.Sn5Ujj.rst @@ -0,0 +1,3 @@ +Fix a memory leak in asyncio in the ProactorEventLoop when ``ReadFile()`` or +``WSASend()`` overlapped operation fail immediately: release the internal +buffer. 
diff --git a/Modules/overlapped.c b/Modules/overlapped.c index ae7cddadd02d..4ef5a96b1e4d 100644 --- a/Modules/overlapped.c +++ b/Modules/overlapped.c @@ -723,6 +723,7 @@ do_ReadFile(OverlappedObject *self, HANDLE handle, case ERROR_IO_PENDING: Py_RETURN_NONE; default: + PyBuffer_Release(&self->user_buffer); self->type = TYPE_NOT_STARTED; return SetFromWindowsErr(err); } @@ -1011,6 +1012,7 @@ Overlapped_WSASend(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: + PyBuffer_Release(&self->user_buffer); self->type = TYPE_NOT_STARTED; return SetFromWindowsErr(err); } From solipsis at pitrou.net Wed Jan 9 04:12:23 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 09 Jan 2019 09:12:23 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=6 Message-ID: <20190109091223.1.DD0D24698EC1F7A7@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [7, -7, 1] memory blocks, sum=1 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_forkserver leaked [0, -2, 1] memory blocks, sum=-1 test_multiprocessing_spawn leaked [0, 2, 0] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflog6ZtqCS', '--timeout', '7200'] From webhook-mailer at python.org Wed Jan 9 08:38:47 2019 From: webhook-mailer at python.org (Senthil Kumaran) Date: Wed, 09 Jan 2019 13:38:47 -0000 Subject: [Python-checkins] bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (#10639) Message-ID: https://github.com/python/cpython/commit/cbb16459934eaf29c7c7d362939cd05550b2f21f commit: cbb16459934eaf29c7c7d362939cd05550b2f21f branch: master author: Sanyam Khurana <8039608+CuriousLearner at users.noreply.github.com> committer: Senthil Kumaran date: 2019-01-09T05:38:38-08:00 summary: bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (#10639) files: A Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst M Lib/doctest.py M Lib/test/test_doctest.py diff --git a/Lib/doctest.py b/Lib/doctest.py index c1d8a1db111d..79d91a040c2e 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -1690,8 +1690,6 @@ def output_difference(self, example, got, optionflags): kind = 'ndiff with -expected +actual' else: assert 0, 'Bad diff option' - # Remove trailing whitespace on diff output. - diff = [line.rstrip() + '\n' for line in diff] return 'Differences (%s):\n' % kind + _indent(''.join(diff)) # If we're not using diff, then simply list the expected diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index efdce3c90c26..c40c38926873 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -2451,6 +2451,11 @@ def test_unittest_reportflags(): Then the default eporting options are ignored: >>> result = suite.run(unittest.TestResult()) + """ + """ + *NOTE*: These doctest are intentionally not placed in raw string to depict + the trailing whitespace using `\x20` in the diff below. + >>> print(result.failures[0][1]) # doctest: +ELLIPSIS Traceback ... 
Failed example: @@ -2464,7 +2469,7 @@ def test_unittest_reportflags(): Differences (ndiff with -expected +actual): a - - + + +\x20 b @@ -2953,6 +2958,47 @@ def test_CLI(): r""" """ +def test_no_trailing_whitespace_stripping(): + r""" + The fancy reports had a bug for a long time where any trailing whitespace on + the reported diff lines was stripped, making it impossible to see the + differences in line reported as different that differed only in the amount of + trailing whitespace. The whitespace still isn't particularly visible unless + you use NDIFF, but at least it is now there to be found. + + *NOTE*: This snippet was intentionally put inside a raw string to get rid of + leading whitespace error in executing the example below + + >>> def f(x): + ... r''' + ... >>> print('\n'.join(['a ', 'b'])) + ... a + ... b + ... ''' + """ + """ + *NOTE*: These doctest are not placed in raw string to depict the trailing whitespace + using `\x20` + + >>> test = doctest.DocTestFinder().find(f)[0] + >>> flags = doctest.REPORT_NDIFF + >>> doctest.DocTestRunner(verbose=False, optionflags=flags).run(test) + ... # doctest: +ELLIPSIS + ********************************************************************** + File ..., line 3, in f + Failed example: + print('\n'.join(['a ', 'b'])) + Differences (ndiff with -expected +actual): + - a + + a + b + TestResults(failed=1, attempted=1) + + *NOTE*: `\x20` is for checking the trailing whitespace on the +a line above. + We cannot use actual spaces there, as a commit hook prevents from committing + patches that contain trailing whitespace. More info on Issue 24746. + """ + ###################################################################### ## Main ###################################################################### diff --git a/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst b/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst new file mode 100644 index 000000000000..c592516d1466 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst @@ -0,0 +1,2 @@ +Avoid stripping trailing whitespace in doctest fancy diff. Orignial patch by +R. David Murray & Jairo Trad. Enhanced by Sanyam Khurana. From webhook-mailer at python.org Wed Jan 9 08:56:52 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 09 Jan 2019 13:56:52 -0000 Subject: [Python-checkins] bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (GH-10639) Message-ID: https://github.com/python/cpython/commit/53cf5f084b01cb16630361be5377047c068d2b44 commit: 53cf5f084b01cb16630361be5377047c068d2b44 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-09T05:56:40-08:00 summary: bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (GH-10639) (cherry picked from commit cbb16459934eaf29c7c7d362939cd05550b2f21f) Co-authored-by: Sanyam Khurana <8039608+CuriousLearner at users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst M Lib/doctest.py M Lib/test/test_doctest.py diff --git a/Lib/doctest.py b/Lib/doctest.py index c1d8a1db111d..79d91a040c2e 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -1690,8 +1690,6 @@ def output_difference(self, example, got, optionflags): kind = 'ndiff with -expected +actual' else: assert 0, 'Bad diff option' - # Remove trailing whitespace on diff output. 
- diff = [line.rstrip() + '\n' for line in diff] return 'Differences (%s):\n' % kind + _indent(''.join(diff)) # If we're not using diff, then simply list the expected diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index 83941c129f44..6081009006df 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -2450,6 +2450,11 @@ def test_unittest_reportflags(): Then the default eporting options are ignored: >>> result = suite.run(unittest.TestResult()) + """ + """ + *NOTE*: These doctest are intentionally not placed in raw string to depict + the trailing whitespace using `\x20` in the diff below. + >>> print(result.failures[0][1]) # doctest: +ELLIPSIS Traceback ... Failed example: @@ -2463,7 +2468,7 @@ def test_unittest_reportflags(): Differences (ndiff with -expected +actual): a - - + + +\x20 b @@ -2952,6 +2957,47 @@ def test_CLI(): r""" """ +def test_no_trailing_whitespace_stripping(): + r""" + The fancy reports had a bug for a long time where any trailing whitespace on + the reported diff lines was stripped, making it impossible to see the + differences in line reported as different that differed only in the amount of + trailing whitespace. The whitespace still isn't particularly visible unless + you use NDIFF, but at least it is now there to be found. + + *NOTE*: This snippet was intentionally put inside a raw string to get rid of + leading whitespace error in executing the example below + + >>> def f(x): + ... r''' + ... >>> print('\n'.join(['a ', 'b'])) + ... a + ... b + ... ''' + """ + """ + *NOTE*: These doctest are not placed in raw string to depict the trailing whitespace + using `\x20` + + >>> test = doctest.DocTestFinder().find(f)[0] + >>> flags = doctest.REPORT_NDIFF + >>> doctest.DocTestRunner(verbose=False, optionflags=flags).run(test) + ... # doctest: +ELLIPSIS + ********************************************************************** + File ..., line 3, in f + Failed example: + print('\n'.join(['a ', 'b'])) + Differences (ndiff with -expected +actual): + - a + + a + b + TestResults(failed=1, attempted=1) + + *NOTE*: `\x20` is for checking the trailing whitespace on the +a line above. + We cannot use actual spaces there, as a commit hook prevents from committing + patches that contain trailing whitespace. More info on Issue 24746. + """ + ###################################################################### ## Main ###################################################################### diff --git a/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst b/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst new file mode 100644 index 000000000000..c592516d1466 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst @@ -0,0 +1,2 @@ +Avoid stripping trailing whitespace in doctest fancy diff. Orignial patch by +R. David Murray & Jairo Trad. Enhanced by Sanyam Khurana. 
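For readers skimming the diffs above: the removed `diff = [line.rstrip() + '\n' for line in diff]` line is what used to erase trailing-whitespace differences from doctest's fancy reports. A minimal sketch, assuming nothing beyond the standard library's difflib (not the doctest internals themselves), of the kind of difference that now survives in NDIFF output:

    import difflib

    # Expected vs. actual doctest output differing only in a trailing space.
    expected = ["a \n", "b\n"]   # note the trailing space after 'a'
    actual = ["a\n", "b\n"]

    for line in difflib.ndiff(expected, actual):
        print(repr(line))

    # ndiff flags the '- a \n' / '+ a\n' pair (plus a '? ' guide line);
    # rstrip()-ing every report line, as doctest did before this change,
    # collapsed the two variants into identical-looking output.
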
From webhook-mailer at python.org Wed Jan 9 09:46:33 2019 From: webhook-mailer at python.org (Senthil Kumaran) Date: Wed, 09 Jan 2019 14:46:33 -0000 Subject: [Python-checkins] bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (GH-10639) (#11477) Message-ID: https://github.com/python/cpython/commit/5d9ae8b9df8371dd65514e0d60b561fd37056986 commit: 5d9ae8b9df8371dd65514e0d60b561fd37056986 branch: 3.6 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Senthil Kumaran date: 2019-01-09T06:46:28-08:00 summary: bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (GH-10639) (#11477) (cherry picked from commit cbb16459934eaf29c7c7d362939cd05550b2f21f) Co-authored-by: Sanyam Khurana <8039608+CuriousLearner at users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst M Lib/doctest.py M Lib/test/test_doctest.py diff --git a/Lib/doctest.py b/Lib/doctest.py index 0b78544d8d0e..ca5be5927594 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -1691,8 +1691,6 @@ def output_difference(self, example, got, optionflags): kind = 'ndiff with -expected +actual' else: assert 0, 'Bad diff option' - # Remove trailing whitespace on diff output. - diff = [line.rstrip() + '\n' for line in diff] return 'Differences (%s):\n' % kind + _indent(''.join(diff)) # If we're not using diff, then simply list the expected diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index 2258c6b1baed..ad527626d1b0 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -2431,6 +2431,11 @@ def test_unittest_reportflags(): Then the default eporting options are ignored: >>> result = suite.run(unittest.TestResult()) + """ + """ + *NOTE*: These doctest are intentionally not placed in raw string to depict + the trailing whitespace using `\x20` in the diff below. + >>> print(result.failures[0][1]) # doctest: +ELLIPSIS Traceback ... Failed example: @@ -2444,7 +2449,7 @@ def test_unittest_reportflags(): Differences (ndiff with -expected +actual): a - - + + +\x20 b @@ -2933,6 +2938,47 @@ def test_CLI(): r""" """ +def test_no_trailing_whitespace_stripping(): + r""" + The fancy reports had a bug for a long time where any trailing whitespace on + the reported diff lines was stripped, making it impossible to see the + differences in line reported as different that differed only in the amount of + trailing whitespace. The whitespace still isn't particularly visible unless + you use NDIFF, but at least it is now there to be found. + + *NOTE*: This snippet was intentionally put inside a raw string to get rid of + leading whitespace error in executing the example below + + >>> def f(x): + ... r''' + ... >>> print('\n'.join(['a ', 'b'])) + ... a + ... b + ... ''' + """ + """ + *NOTE*: These doctest are not placed in raw string to depict the trailing whitespace + using `\x20` + + >>> test = doctest.DocTestFinder().find(f)[0] + >>> flags = doctest.REPORT_NDIFF + >>> doctest.DocTestRunner(verbose=False, optionflags=flags).run(test) + ... # doctest: +ELLIPSIS + ********************************************************************** + File ..., line 3, in f + Failed example: + print('\n'.join(['a ', 'b'])) + Differences (ndiff with -expected +actual): + - a + + a + b + TestResults(failed=1, attempted=1) + + *NOTE*: `\x20` is for checking the trailing whitespace on the +a line above. + We cannot use actual spaces there, as a commit hook prevents from committing + patches that contain trailing whitespace. 
More info on Issue 24746. + """ + ###################################################################### ## Main ###################################################################### diff --git a/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst b/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst new file mode 100644 index 000000000000..c592516d1466 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst @@ -0,0 +1,2 @@ +Avoid stripping trailing whitespace in doctest fancy diff. Orignial patch by +R. David Murray & Jairo Trad. Enhanced by Sanyam Khurana. From webhook-mailer at python.org Wed Jan 9 10:44:12 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Wed, 09 Jan 2019 15:44:12 -0000 Subject: [Python-checkins] bpo-35641: Move IDLE blurb to IDLE directory (#11479) Message-ID: https://github.com/python/cpython/commit/ee6559436797032b816dfb8c6376c9a451014962 commit: ee6559436797032b816dfb8c6376c9a451014962 branch: master author: Terry Jan Reedy committer: GitHub date: 2019-01-09T10:44:07-05:00 summary: bpo-35641: Move IDLE blurb to IDLE directory (#11479) files: A Misc/NEWS.d/next/IDLE/2019-01-02-22-15-01.bpo-35641.QEaANl.rst D Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst diff --git a/Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst b/Misc/NEWS.d/next/IDLE/2019-01-02-22-15-01.bpo-35641.QEaANl.rst similarity index 100% rename from Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst rename to Misc/NEWS.d/next/IDLE/2019-01-02-22-15-01.bpo-35641.QEaANl.rst From webhook-mailer at python.org Wed Jan 9 10:49:45 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 09 Jan 2019 15:49:45 -0000 Subject: [Python-checkins] bpo-35641: Move IDLE blurb to IDLE directory (GH-11479) Message-ID: https://github.com/python/cpython/commit/6f76ef81596bbd885957b7fea3f40024ed9d6797 commit: 6f76ef81596bbd885957b7fea3f40024ed9d6797 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-09T07:49:38-08:00 summary: bpo-35641: Move IDLE blurb to IDLE directory (GH-11479) (cherry picked from commit ee6559436797032b816dfb8c6376c9a451014962) Co-authored-by: Terry Jan Reedy files: A Misc/NEWS.d/next/IDLE/2019-01-02-22-15-01.bpo-35641.QEaANl.rst D Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst diff --git a/Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst b/Misc/NEWS.d/next/IDLE/2019-01-02-22-15-01.bpo-35641.QEaANl.rst similarity index 100% rename from Misc/NEWS.d/next/Library/2019-01-02-22-15-01.bpo-35641.QEaANl.rst rename to Misc/NEWS.d/next/IDLE/2019-01-02-22-15-01.bpo-35641.QEaANl.rst From webhook-mailer at python.org Wed Jan 9 14:03:12 2019 From: webhook-mailer at python.org (Senthil Kumaran) Date: Wed, 09 Jan 2019 19:03:12 -0000 Subject: [Python-checkins] [2.7] bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (#11482) Message-ID: https://github.com/python/cpython/commit/02e33d9567aa4bd612f9f08053acbfd5e68480d0 commit: 02e33d9567aa4bd612f9f08053acbfd5e68480d0 branch: 2.7 author: Sanyam Khurana <8039608+CuriousLearner at users.noreply.github.com> committer: Senthil Kumaran date: 2019-01-09T11:03:03-08:00 summary: [2.7] bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (#11482) * bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff * [2.7] bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff 
(GH-10639). (cherry picked from commit cbb16459934eaf29c7c7d362939cd05550b2f21f) Co-authored-by: Sanyam Khurana <8039608+CuriousLearner at users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst M Lib/doctest.py M Lib/test/test_doctest.py diff --git a/Lib/doctest.py b/Lib/doctest.py index fedf67011d92..1d822b576bb3 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -1651,8 +1651,6 @@ def output_difference(self, example, got, optionflags): kind = 'ndiff with -expected +actual' else: assert 0, 'Bad diff option' - # Remove trailing whitespace on diff output. - diff = [line.rstrip() + '\n' for line in diff] return 'Differences (%s):\n' % kind + _indent(''.join(diff)) # If we're not using diff, then simply list the expected diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index 354c33baa988..88f4b6127938 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -2355,7 +2355,12 @@ def test_unittest_reportflags(): Then the default eporting options are ignored: >>> result = suite.run(unittest.TestResult()) - >>> print result.failures[0][1] # doctest: +ELLIPSIS + """ + """ + *NOTE*: These doctest are intentionally not placed in raw string to depict + the trailing whitespace using `\x20` in the diff below. + + >>> print(result.failures[0][1]) # doctest: +ELLIPSIS Traceback ... Failed example: favorite_color @@ -2368,7 +2373,7 @@ def test_unittest_reportflags(): Differences (ndiff with -expected +actual): a - - + + +\x20 b @@ -2717,6 +2722,47 @@ def old_test4(): """ TestResults(failed=0, attempted=4) """ +def test_no_trailing_whitespace_stripping(): + r""" + The fancy reports had a bug for a long time where any trailing whitespace on + the reported diff lines was stripped, making it impossible to see the + differences in line reported as different that differed only in the amount of + trailing whitespace. The whitespace still isn't particularly visible unless + you use NDIFF, but at least it is now there to be found. + + *NOTE*: This snippet was intentionally put inside a raw string to get rid of + leading whitespace error in executing the example below + + >>> def f(x): + ... r''' + ... >>> print('\n'.join(['a ', 'b'])) + ... a + ... b + ... ''' + """ + """ + *NOTE*: These doctest are not placed in raw string to depict the trailing whitespace + using `\x20` + + >>> test = doctest.DocTestFinder().find(f)[0] + >>> flags = doctest.REPORT_NDIFF + >>> doctest.DocTestRunner(verbose=False, optionflags=flags).run(test) + ... # doctest: +ELLIPSIS + ********************************************************************** + File ..., line 3, in f + Failed example: + print('\n'.join(['a ', 'b'])) + Differences (ndiff with -expected +actual): + - a + + a + b + TestResults(failed=1, attempted=1) + + *NOTE*: `\x20` is for checking the trailing whitespace on the +a line above. + We cannot use actual spaces there, as a commit hook prevents from committing + patches that contain trailing whitespace. More info on Issue 24746. + """ + ###################################################################### ## Main ###################################################################### diff --git a/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst b/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst new file mode 100644 index 000000000000..c592516d1466 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst @@ -0,0 +1,2 @@ +Avoid stripping trailing whitespace in doctest fancy diff. 
Orignial patch by +R. David Murray & Jairo Trad. Enhanced by Sanyam Khurana. From webhook-mailer at python.org Wed Jan 9 16:43:35 2019 From: webhook-mailer at python.org (Pablo Galindo) Date: Wed, 09 Jan 2019 21:43:35 -0000 Subject: [Python-checkins] Add example to the documentation for calling unittest.mock.patch with create=True (GH-11056) Message-ID: https://github.com/python/cpython/commit/d6acf17c05315cd34124d678057d9543adbad404 commit: d6acf17c05315cd34124d678057d9543adbad404 branch: master author: Pablo Galindo committer: GitHub date: 2019-01-09T21:43:24Z summary: Add example to the documentation for calling unittest.mock.patch with create=True (GH-11056) files: M Doc/library/unittest.mock.rst diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst index bfab00eb7514..c011e54e3e73 100644 --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -1119,13 +1119,13 @@ patch Instead of ``autospec=True`` you can pass ``autospec=some_object`` to use an arbitrary object as the spec instead of the one being replaced. - By default :func:`patch` will fail to replace attributes that don't exist. If - you pass in ``create=True``, and the attribute doesn't exist, patch will - create the attribute for you when the patched function is called, and - delete it again afterwards. This is useful for writing tests against - attributes that your production code creates at runtime. It is off by - default because it can be dangerous. With it switched on you can write - passing tests against APIs that don't actually exist! + By default :func:`patch` will fail to replace attributes that don't exist. + If you pass in ``create=True``, and the attribute doesn't exist, patch will + create the attribute for you when the patched function is called, and delete + it again after the patched function has exited. This is useful for writing + tests against attributes that your production code creates at runtime. It is + off by default because it can be dangerous. With it switched on you can + write passing tests against APIs that don't actually exist! .. note:: @@ -1247,6 +1247,27 @@ into a :func:`patch` call using ``**``:: ... KeyError +By default, attempting to patch a function in a module (or a method or an +attribute in a class) that does not exist will fail with :exc:`AttributeError`:: + + >>> @patch('sys.non_existing_attribute', 42) + ... def test(): + ... assert sys.non_existing_attribute == 42 + ... + >>> test() + Traceback (most recent call last): + ... + AttributeError: does not have the attribute 'non_existing' + +but adding ``create=True`` in the call to :func:`patch` will make the previous example +work as expected:: + + >>> @patch('sys.non_existing_attribute', 42, create=True) + ... def test(mock_stdout): + ... assert sys.non_existing_attribute == 42 + ... 
+ >>> test() + patch.object ~~~~~~~~~~~~ From webhook-mailer at python.org Wed Jan 9 16:50:06 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 09 Jan 2019 21:50:06 -0000 Subject: [Python-checkins] Add example to the documentation for calling unittest.mock.patch with create=True (GH-11056) Message-ID: https://github.com/python/cpython/commit/2b3db493690c286565b91b3d604cda38adb24636 commit: 2b3db493690c286565b91b3d604cda38adb24636 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-09T13:50:02-08:00 summary: Add example to the documentation for calling unittest.mock.patch with create=True (GH-11056) (cherry picked from commit d6acf17c05315cd34124d678057d9543adbad404) Co-authored-by: Pablo Galindo files: M Doc/library/unittest.mock.rst diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst index f1b25958a670..6daf0feca3ce 100644 --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -1097,13 +1097,13 @@ patch Instead of ``autospec=True`` you can pass ``autospec=some_object`` to use an arbitrary object as the spec instead of the one being replaced. - By default :func:`patch` will fail to replace attributes that don't exist. If - you pass in ``create=True``, and the attribute doesn't exist, patch will - create the attribute for you when the patched function is called, and - delete it again afterwards. This is useful for writing tests against - attributes that your production code creates at runtime. It is off by - default because it can be dangerous. With it switched on you can write - passing tests against APIs that don't actually exist! + By default :func:`patch` will fail to replace attributes that don't exist. + If you pass in ``create=True``, and the attribute doesn't exist, patch will + create the attribute for you when the patched function is called, and delete + it again after the patched function has exited. This is useful for writing + tests against attributes that your production code creates at runtime. It is + off by default because it can be dangerous. With it switched on you can + write passing tests against APIs that don't actually exist! .. note:: @@ -1225,6 +1225,27 @@ into a :func:`patch` call using ``**``: ... KeyError +By default, attempting to patch a function in a module (or a method or an +attribute in a class) that does not exist will fail with :exc:`AttributeError`:: + + >>> @patch('sys.non_existing_attribute', 42) + ... def test(): + ... assert sys.non_existing_attribute == 42 + ... + >>> test() + Traceback (most recent call last): + ... + AttributeError: does not have the attribute 'non_existing' + +but adding ``create=True`` in the call to :func:`patch` will make the previous example +work as expected:: + + >>> @patch('sys.non_existing_attribute', 42, create=True) + ... def test(mock_stdout): + ... assert sys.non_existing_attribute == 42 + ... 
+ >>> test() + patch.object ~~~~~~~~~~~~ From webhook-mailer at python.org Wed Jan 9 17:52:24 2019 From: webhook-mailer at python.org (Brian Curtin) Date: Wed, 09 Jan 2019 22:52:24 -0000 Subject: [Python-checkins] bpo-35404: Clarify how to import _structure in email.message doc (GH-10886) Message-ID: https://github.com/python/cpython/commit/e394ba32147f687b6bc7518d461f1d84211698e0 commit: e394ba32147f687b6bc7518d461f1d84211698e0 branch: master author: Charles-Axel Dein committer: Brian Curtin date: 2019-01-09T15:52:10-07:00 summary: bpo-35404: Clarify how to import _structure in email.message doc (GH-10886) files: M Doc/library/email.message.rst diff --git a/Doc/library/email.message.rst b/Doc/library/email.message.rst index 77b8099a7d4c..f1806a0866bb 100644 --- a/Doc/library/email.message.rst +++ b/Doc/library/email.message.rst @@ -487,7 +487,6 @@ message objects. from email import message_from_binary_file with open('../Lib/test/test_email/data/msg_16.txt', 'rb') as f: msg = message_from_binary_file(f) - from email.iterators import _structure .. doctest:: @@ -509,6 +508,7 @@ message objects. .. doctest:: + >>> from email.iterators import _structure >>> for part in msg.walk(): ... print(part.get_content_maintype() == 'multipart', ... part.is_multipart()) From webhook-mailer at python.org Wed Jan 9 17:54:16 2019 From: webhook-mailer at python.org (Brian Curtin) Date: Wed, 09 Jan 2019 22:54:16 -0000 Subject: [Python-checkins] Update bugs.rst (GH-9648) Message-ID: https://github.com/python/cpython/commit/91c6158dbc5d70fcd91993b4e62c7bae926c2ea2 commit: 91c6158dbc5d70fcd91993b4e62c7bae926c2ea2 branch: master author: Andre Delfino committer: Brian Curtin date: 2019-01-09T15:54:12-07:00 summary: Update bugs.rst (GH-9648) files: M Doc/bugs.rst diff --git a/Doc/bugs.rst b/Doc/bugs.rst index 109e9eb202d8..3206a456b3a8 100644 --- a/Doc/bugs.rst +++ b/Doc/bugs.rst @@ -17,7 +17,7 @@ Documentation bugs If you find a bug in this documentation or would like to propose an improvement, please submit a bug report on the :ref:`tracker `. If you -have a suggestion how to fix it, include that as well. +have a suggestion on how to fix it, include that as well. If you're short on time, you can also email documentation bug reports to docs at python.org (behavioral bugs can be sent to python-list at python.org). From webhook-mailer at python.org Wed Jan 9 18:00:15 2019 From: webhook-mailer at python.org (Brian Curtin) Date: Wed, 09 Jan 2019 23:00:15 -0000 Subject: [Python-checkins] Update bugs.rst (GH-11485) Message-ID: https://github.com/python/cpython/commit/14190000c753835e3618b87b2a0cecb3fa4e3134 commit: 14190000c753835e3618b87b2a0cecb3fa4e3134 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Brian Curtin date: 2019-01-09T16:00:12-07:00 summary: Update bugs.rst (GH-11485) (cherry picked from commit 91c6158dbc5d70fcd91993b4e62c7bae926c2ea2) Co-authored-by: Andre Delfino files: M Doc/bugs.rst diff --git a/Doc/bugs.rst b/Doc/bugs.rst index 109e9eb202d8..3206a456b3a8 100644 --- a/Doc/bugs.rst +++ b/Doc/bugs.rst @@ -17,7 +17,7 @@ Documentation bugs If you find a bug in this documentation or would like to propose an improvement, please submit a bug report on the :ref:`tracker `. If you -have a suggestion how to fix it, include that as well. +have a suggestion on how to fix it, include that as well. If you're short on time, you can also email documentation bug reports to docs at python.org (behavioral bugs can be sent to python-list at python.org). 
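As a usage note on the documentation change above, here is a minimal, runnable sketch of `patch(..., create=True)` used as a context manager; the attribute name `fake_attribute` is invented for illustration and does not exist on `sys`:

    import sys
    from unittest import mock

    # sys has no attribute 'fake_attribute'; create=True tells patch()
    # to create it for the duration of the with-block only.
    with mock.patch('sys.fake_attribute', 42, create=True):
        assert sys.fake_attribute == 42

    # After the patch exits, the temporary attribute is removed again.
    assert not hasattr(sys, 'fake_attribute')
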
From webhook-mailer at python.org Wed Jan 9 18:00:51 2019 From: webhook-mailer at python.org (Brian Curtin) Date: Wed, 09 Jan 2019 23:00:51 -0000 Subject: [Python-checkins] Update bugs.rst (GH-11487) Message-ID: https://github.com/python/cpython/commit/c0a1d73c64cbc3753b6e16f78d2c77a0389fff21 commit: c0a1d73c64cbc3753b6e16f78d2c77a0389fff21 branch: 2.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Brian Curtin date: 2019-01-09T16:00:48-07:00 summary: Update bugs.rst (GH-11487) (cherry picked from commit 91c6158dbc5d70fcd91993b4e62c7bae926c2ea2) Co-authored-by: Andre Delfino files: M Doc/bugs.rst diff --git a/Doc/bugs.rst b/Doc/bugs.rst index 25ce3cac5abe..4dbcb45765e7 100644 --- a/Doc/bugs.rst +++ b/Doc/bugs.rst @@ -14,7 +14,7 @@ Documentation bugs If you find a bug in this documentation or would like to propose an improvement, please submit a bug report on the :ref:`tracker `. If you -have a suggestion how to fix it, include that as well. +have a suggestion on how to fix it, include that as well. If you're short on time, you can also email documentation bug reports to docs at python.org (behavioral bugs can be sent to python-list at python.org). From webhook-mailer at python.org Wed Jan 9 18:55:06 2019 From: webhook-mailer at python.org (Steve Dower) Date: Wed, 09 Jan 2019 23:55:06 -0000 Subject: [Python-checkins] Distutils no longer needs to remain compatible with 2.3 (GH-11423) Message-ID: https://github.com/python/cpython/commit/a306bdd39f5d4a8a615487e7840355e923706811 commit: a306bdd39f5d4a8a615487e7840355e923706811 branch: master author: Miro Hron?ok committer: Steve Dower date: 2019-01-10T10:55:03+11:00 summary: Distutils no longer needs to remain compatible with 2.3 (GH-11423) files: M Lib/distutils/README diff --git a/Lib/distutils/README b/Lib/distutils/README index 408a203b85d5..23f488506f85 100644 --- a/Lib/distutils/README +++ b/Lib/distutils/README @@ -8,6 +8,4 @@ The Distutils-SIG web page is also a good starting point: http://www.python.org/sigs/distutils-sig/ -WARNING : Distutils must remain compatible with 2.3 - $Id$ From webhook-mailer at python.org Wed Jan 9 19:19:32 2019 From: webhook-mailer at python.org (Steve Dower) Date: Thu, 10 Jan 2019 00:19:32 -0000 Subject: [Python-checkins] bpo-34855: Fix EXTERNALS_DIR build variable for Windows (GH-11177) Message-ID: https://github.com/python/cpython/commit/6aedfa6b9ac324587f64133c23757a66a8f355bb commit: 6aedfa6b9ac324587f64133c23757a66a8f355bb branch: master author: antektek <45912913+antektek at users.noreply.github.com> committer: Steve Dower date: 2019-01-10T11:19:29+11:00 summary: bpo-34855: Fix EXTERNALS_DIR build variable for Windows (GH-11177) files: M .azure-pipelines/windows-appx-test.yml M .azure-pipelines/windows-steps.yml M PCbuild/find_python.bat diff --git a/.azure-pipelines/windows-appx-test.yml b/.azure-pipelines/windows-appx-test.yml index 5f3fe6c94578..cad752b0a1e7 100644 --- a/.azure-pipelines/windows-appx-test.yml +++ b/.azure-pipelines/windows-appx-test.yml @@ -36,7 +36,7 @@ jobs: Write-Host '##vso[task.setvariable variable=Py_IntDir]$(Build.BinariesDirectory)\obj' # UNDONE: Do not build to a different directory because of broken tests Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.SourcesDirectory)\PCbuild' - Write-Host '##vso[task.setvariable variable=EXTERNAL_DIR]$(Build.BinariesDirectory)\externals' + Write-Host '##vso[task.setvariable variable=EXTERNALS_DIR]$(Build.BinariesDirectory)\externals' displayName: Update build locations - 
script: PCbuild\build.bat -e $(buildOpt) diff --git a/.azure-pipelines/windows-steps.yml b/.azure-pipelines/windows-steps.yml index cba00158ad13..3651ae03bc1d 100644 --- a/.azure-pipelines/windows-steps.yml +++ b/.azure-pipelines/windows-steps.yml @@ -8,7 +8,7 @@ steps: Write-Host '##vso[task.setvariable variable=Py_IntDir]$(Build.BinariesDirectory)\obj' # UNDONE: Do not build to a different directory because of broken tests Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.SourcesDirectory)\PCbuild' - Write-Host '##vso[task.setvariable variable=EXTERNAL_DIR]$(Build.BinariesDirectory)\externals' + Write-Host '##vso[task.setvariable variable=EXTERNALS_DIR]$(Build.BinariesDirectory)\externals' displayName: Update build locations - script: PCbuild\build.bat -e $(buildOpt) diff --git a/PCbuild/find_python.bat b/PCbuild/find_python.bat index d0778ddd7347..d5c787fd77aa 100644 --- a/PCbuild/find_python.bat +++ b/PCbuild/find_python.bat @@ -24,7 +24,7 @@ :begin_search @set PYTHON= - at set _Py_EXTERNALS_DIR=%EXTERNAL_DIR% + at set _Py_EXTERNALS_DIR=%EXTERNALS_DIR% @if "%_Py_EXTERNALS_DIR%"=="" (set _Py_EXTERNALS_DIR=%~dp0\..\externals) @rem If we have Python in externals, use that one From webhook-mailer at python.org Wed Jan 9 19:46:44 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 10 Jan 2019 00:46:44 -0000 Subject: [Python-checkins] bpo-34855: Fix EXTERNALS_DIR build variable for Windows (GH-11177) Message-ID: https://github.com/python/cpython/commit/2bd5f7e91add6a20c311d46c332c4325b1e77883 commit: 2bd5f7e91add6a20c311d46c332c4325b1e77883 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-09T16:46:40-08:00 summary: bpo-34855: Fix EXTERNALS_DIR build variable for Windows (GH-11177) (cherry picked from commit 6aedfa6b9ac324587f64133c23757a66a8f355bb) Co-authored-by: antektek <45912913+antektek at users.noreply.github.com> files: M .azure-pipelines/windows-appx-test.yml M .azure-pipelines/windows-steps.yml M PCbuild/find_python.bat diff --git a/.azure-pipelines/windows-appx-test.yml b/.azure-pipelines/windows-appx-test.yml index 5f3fe6c94578..cad752b0a1e7 100644 --- a/.azure-pipelines/windows-appx-test.yml +++ b/.azure-pipelines/windows-appx-test.yml @@ -36,7 +36,7 @@ jobs: Write-Host '##vso[task.setvariable variable=Py_IntDir]$(Build.BinariesDirectory)\obj' # UNDONE: Do not build to a different directory because of broken tests Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.SourcesDirectory)\PCbuild' - Write-Host '##vso[task.setvariable variable=EXTERNAL_DIR]$(Build.BinariesDirectory)\externals' + Write-Host '##vso[task.setvariable variable=EXTERNALS_DIR]$(Build.BinariesDirectory)\externals' displayName: Update build locations - script: PCbuild\build.bat -e $(buildOpt) diff --git a/.azure-pipelines/windows-steps.yml b/.azure-pipelines/windows-steps.yml index cba00158ad13..3651ae03bc1d 100644 --- a/.azure-pipelines/windows-steps.yml +++ b/.azure-pipelines/windows-steps.yml @@ -8,7 +8,7 @@ steps: Write-Host '##vso[task.setvariable variable=Py_IntDir]$(Build.BinariesDirectory)\obj' # UNDONE: Do not build to a different directory because of broken tests Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.SourcesDirectory)\PCbuild' - Write-Host '##vso[task.setvariable variable=EXTERNAL_DIR]$(Build.BinariesDirectory)\externals' + Write-Host '##vso[task.setvariable variable=EXTERNALS_DIR]$(Build.BinariesDirectory)\externals' displayName: Update build locations - 
script: PCbuild\build.bat -e $(buildOpt) diff --git a/PCbuild/find_python.bat b/PCbuild/find_python.bat index d0778ddd7347..d5c787fd77aa 100644 --- a/PCbuild/find_python.bat +++ b/PCbuild/find_python.bat @@ -24,7 +24,7 @@ :begin_search @set PYTHON= - at set _Py_EXTERNALS_DIR=%EXTERNAL_DIR% + at set _Py_EXTERNALS_DIR=%EXTERNALS_DIR% @if "%_Py_EXTERNALS_DIR%"=="" (set _Py_EXTERNALS_DIR=%~dp0\..\externals) @rem If we have Python in externals, use that one From solipsis at pitrou.net Thu Jan 10 04:07:15 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 10 Jan 2019 09:07:15 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=0 Message-ID: <20190110090715.1.538722FBC5699017@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [0, -7, 1] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_forkserver leaked [2, -2, 2] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogeQ94Y4', '--timeout', '7200'] From webhook-mailer at python.org Thu Jan 10 05:23:39 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Thu, 10 Jan 2019 10:23:39 -0000 Subject: [Python-checkins] IocpProactor: prevent modification if closed (GH-11494) Message-ID: https://github.com/python/cpython/commit/9b07681c09182d4b9d23cd52566a4992b8afecbb commit: 9b07681c09182d4b9d23cd52566a4992b8afecbb branch: master author: Victor Stinner committer: GitHub date: 2019-01-10T11:23:26+01:00 summary: IocpProactor: prevent modification if closed (GH-11494) * _wait_for_handle(), _register() and _unregister() methods of IocpProactor now raise an exception if closed * Add "closed" to IocpProactor.__repr__() * Simplify IocpProactor.close() files: M Lib/asyncio/windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index 0f3e9f425f68..bdb9a6e28a86 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -408,10 +408,16 @@ def __init__(self, concurrency=0xffffffff): self._unregistered = [] self._stopped_serving = weakref.WeakSet() + def _check_closed(self): + if self._iocp is None: + raise RuntimeError('IocpProactor is closed') + def __repr__(self): - return ('<%s overlapped#=%s result#=%s>' - % (self.__class__.__name__, len(self._cache), - len(self._results))) + info = ['overlapped#=%s' % len(self._cache), + 'result#=%s' % len(self._results)] + if self._iocp is None: + info.append('closed') + return '<%s %s>' % (self.__class__.__name__, " ".join(info)) def set_loop(self, loop): self._loop = loop @@ -618,6 +624,8 @@ def _wait_cancel(self, event, done_callback): return fut def _wait_for_handle(self, handle, timeout, _is_cancel): + self._check_closed() + if timeout is None: ms = _winapi.INFINITE else: @@ -660,6 +668,8 @@ def _register_with_iocp(self, obj): # that succeed immediately. def _register(self, ov, obj, callback): + self._check_closed() + # Return a future which will be set with the result of the # operation when it completes. The future's value is actually # the value returned by callback(). @@ -696,6 +706,7 @@ def _unregister(self, ov): already be signalled (pending in the proactor event queue). It is also safe if the event is never signalled (because it was cancelled). 
""" + self._check_closed() self._unregistered.append(ov) def _get_accept_socket(self, family): @@ -765,6 +776,10 @@ def _stop_serving(self, obj): self._stopped_serving.add(obj) def close(self): + if self._iocp is None: + # already closed + return + # Cancel remaining registered operations. for address, (fut, ov, obj, callback) in list(self._cache.items()): if fut.cancelled(): @@ -787,14 +802,15 @@ def close(self): context['source_traceback'] = fut._source_traceback self._loop.call_exception_handler(context) + # wait until all cancelled overlapped future complete while self._cache: if not self._poll(1): logger.debug('taking long time to close proactor') self._results = [] - if self._iocp is not None: - _winapi.CloseHandle(self._iocp) - self._iocp = None + + _winapi.CloseHandle(self._iocp) + self._iocp = None def __del__(self): self.close() From webhook-mailer at python.org Thu Jan 10 05:24:44 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Thu, 10 Jan 2019 10:24:44 -0000 Subject: [Python-checkins] asyncio: __del__() keep reference to warnings.warn (GH-11491) Message-ID: https://github.com/python/cpython/commit/fb2c3465f09e1f720cdae7eca87d62125a327fd9 commit: fb2c3465f09e1f720cdae7eca87d62125a327fd9 branch: master author: Victor Stinner committer: GitHub date: 2019-01-10T11:24:40+01:00 summary: asyncio: __del__() keep reference to warnings.warn (GH-11491) * asyncio: __del__() keep reference to warnings.warn The __del__() methods of asyncio classes now keep a strong reference to the warnings.warn() to be able to display the ResourceWarning warning in more cases. Ensure that the function remains available if instances are destroyed late during Python shutdown (while module symbols are cleared). * Rename warn parameter to _warn "_warn" name is a hint that it's not the regular warnings.warn() function. 
files: M Lib/asyncio/base_events.py M Lib/asyncio/base_subprocess.py M Lib/asyncio/proactor_events.py M Lib/asyncio/selector_events.py M Lib/asyncio/sslproto.py M Lib/asyncio/unix_events.py M Lib/asyncio/windows_utils.py diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py index 60a189bdfb7e..cec47ce67f38 100644 --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -622,10 +622,9 @@ def is_closed(self): """Returns True if the event loop was closed.""" return self._closed - def __del__(self): + def __del__(self, _warn=warnings.warn): if not self.is_closed(): - warnings.warn(f"unclosed event loop {self!r}", ResourceWarning, - source=self) + _warn(f"unclosed event loop {self!r}", ResourceWarning, source=self) if not self.is_running(): self.close() diff --git a/Lib/asyncio/base_subprocess.py b/Lib/asyncio/base_subprocess.py index b547c444ad5d..f503f78fdda3 100644 --- a/Lib/asyncio/base_subprocess.py +++ b/Lib/asyncio/base_subprocess.py @@ -120,10 +120,9 @@ def close(self): # Don't clear the _proc reference yet: _post_init() may still run - def __del__(self): + def __del__(self, _warn=warnings.warn): if not self._closed: - warnings.warn(f"unclosed transport {self!r}", ResourceWarning, - source=self) + _warn(f"unclosed transport {self!r}", ResourceWarning, source=self) self.close() def get_pid(self): diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py index da204c69d45e..3a1826e2c0d2 100644 --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -89,10 +89,9 @@ def close(self): self._read_fut.cancel() self._read_fut = None - def __del__(self): + def __del__(self, _warn=warnings.warn): if self._sock is not None: - warnings.warn(f"unclosed transport {self!r}", ResourceWarning, - source=self) + _warn(f"unclosed transport {self!r}", ResourceWarning, source=self) self.close() def _fatal_error(self, exc, message='Fatal error on pipe transport'): diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py index 112c4b15b8d8..93b688950943 100644 --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -658,10 +658,9 @@ def close(self): self._loop._remove_writer(self._sock_fd) self._loop.call_soon(self._call_connection_lost, None) - def __del__(self): + def __del__(self, _warn=warnings.warn): if self._sock is not None: - warnings.warn(f"unclosed transport {self!r}", ResourceWarning, - source=self) + _warn(f"unclosed transport {self!r}", ResourceWarning, source=self) self._sock.close() def _fatal_error(self, exc, message='Fatal error on transport'): diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py index 12fdb0d1c5ec..42785609dcd2 100644 --- a/Lib/asyncio/sslproto.py +++ b/Lib/asyncio/sslproto.py @@ -316,10 +316,9 @@ def close(self): self._closed = True self._ssl_protocol._start_shutdown() - def __del__(self): + def __del__(self, _warn=warnings.warn): if not self._closed: - warnings.warn(f"unclosed transport {self!r}", ResourceWarning, - source=self) + _warn(f"unclosed transport {self!r}", ResourceWarning, source=self) self.close() def is_reading(self): diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py index 1a62db4f59bc..73d4bda7c234 100644 --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -511,10 +511,9 @@ def close(self): if not self._closing: self._close(None) - def __del__(self): + def __del__(self, _warn=warnings.warn): if self._pipe is not None: - warnings.warn(f"unclosed transport {self!r}", ResourceWarning, - source=self) + 
_warn(f"unclosed transport {self!r}", ResourceWarning, source=self) self._pipe.close() def _fatal_error(self, exc, message='Fatal error on pipe transport'): @@ -707,10 +706,9 @@ def close(self): # write_eof is all what we needed to close the write pipe self.write_eof() - def __del__(self): + def __del__(self, _warn=warnings.warn): if self._pipe is not None: - warnings.warn(f"unclosed transport {self!r}", ResourceWarning, - source=self) + _warn(f"unclosed transport {self!r}", ResourceWarning, source=self) self._pipe.close() def abort(self): diff --git a/Lib/asyncio/windows_utils.py b/Lib/asyncio/windows_utils.py index 9e22f6e0740e..ef277fac3e29 100644 --- a/Lib/asyncio/windows_utils.py +++ b/Lib/asyncio/windows_utils.py @@ -107,10 +107,9 @@ def close(self, *, CloseHandle=_winapi.CloseHandle): CloseHandle(self._handle) self._handle = None - def __del__(self): + def __del__(self, _warn=warnings.warn): if self._handle is not None: - warnings.warn(f"unclosed {self!r}", ResourceWarning, - source=self) + _warn(f"unclosed {self!r}", ResourceWarning, source=self) self.close() def __enter__(self): From webhook-mailer at python.org Thu Jan 10 09:29:44 2019 From: webhook-mailer at python.org (Senthil Kumaran) Date: Thu, 10 Jan 2019 14:29:44 -0000 Subject: [Python-checkins] bpo-24746: Fix doctest failures when running the testsuite with -R (#11501) Message-ID: https://github.com/python/cpython/commit/c5dc60ea858b8ccf78e8d26db81c307a8f9b2314 commit: c5dc60ea858b8ccf78e8d26db81c307a8f9b2314 branch: master author: Pablo Galindo committer: Senthil Kumaran date: 2019-01-10T06:29:40-08:00 summary: bpo-24746: Fix doctest failures when running the testsuite with -R (#11501) files: M Lib/test/test_doctest.py diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index c40c38926873..f1013f257259 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -2451,8 +2451,7 @@ def test_unittest_reportflags(): Then the default eporting options are ignored: >>> result = suite.run(unittest.TestResult()) - """ - """ + *NOTE*: These doctest are intentionally not placed in raw string to depict the trailing whitespace using `\x20` in the diff below. From webhook-mailer at python.org Thu Jan 10 11:02:32 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 10 Jan 2019 16:02:32 -0000 Subject: [Python-checkins] bpo-24746: Fix doctest failures when running the testsuite with -R (GH-11501) Message-ID: https://github.com/python/cpython/commit/1cbd17c6987afc48c16caa7ccc7d19b01fbd39f2 commit: 1cbd17c6987afc48c16caa7ccc7d19b01fbd39f2 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-10T08:02:26-08:00 summary: bpo-24746: Fix doctest failures when running the testsuite with -R (GH-11501) (cherry picked from commit c5dc60ea858b8ccf78e8d26db81c307a8f9b2314) Co-authored-by: Pablo Galindo files: M Lib/test/test_doctest.py diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index 6081009006df..4a3c488738d3 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -2450,8 +2450,7 @@ def test_unittest_reportflags(): Then the default eporting options are ignored: >>> result = suite.run(unittest.TestResult()) - """ - """ + *NOTE*: These doctest are intentionally not placed in raw string to depict the trailing whitespace using `\x20` in the diff below. 
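Stepping back to the asyncio __del__ change above (GH-11491), the idiom it relies on is worth spelling out: binding `warnings.warn` as a default argument captures a strong reference when the class body is executed, so the warning can still be issued if the instance is finalized during interpreter shutdown after module globals have been cleared. A minimal sketch of the pattern, not taken from the patch itself:

    import warnings

    class Resource:
        def __init__(self):
            self._closed = False

        def close(self):
            self._closed = True

        def __del__(self, _warn=warnings.warn):
            # _warn was bound at class definition time, so it remains usable
            # even if the warnings module has been torn down at shutdown.
            if not self._closed:
                _warn(f"unclosed {self!r}", ResourceWarning, source=self)
                self.close()

    Resource()   # finalized immediately; run with -W default::ResourceWarning to see the warning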
From webhook-mailer at python.org Thu Jan 10 11:12:36 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Thu, 10 Jan 2019 16:12:36 -0000 Subject: [Python-checkins] bpo-35470: Fix a reference counting bug in _PyImport_FindExtensionObjectEx(). (GH-11128) Message-ID: https://github.com/python/cpython/commit/89c4f90df97f6039325e354167e8f507bf199fd9 commit: 89c4f90df97f6039325e354167e8f507bf199fd9 branch: master author: Zackery Spytz committer: Serhiy Storchaka date: 2019-01-10T18:12:31+02:00 summary: bpo-35470: Fix a reference counting bug in _PyImport_FindExtensionObjectEx(). (GH-11128) files: M Python/import.c diff --git a/Python/import.c b/Python/import.c index c4f087761cd6..344f199216d0 100644 --- a/Python/import.c +++ b/Python/import.c @@ -749,7 +749,6 @@ _PyImport_FindExtensionObjectEx(PyObject *name, PyObject *filename, } if (_PyState_AddModule(mod, def) < 0) { PyMapping_DelItem(modules, name); - Py_DECREF(mod); return NULL; } if (Py_VerboseFlag) From webhook-mailer at python.org Thu Jan 10 11:36:53 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 10 Jan 2019 16:36:53 -0000 Subject: [Python-checkins] bpo-35470: Fix a reference counting bug in _PyImport_FindExtensionObjectEx(). (GH-11128) Message-ID: https://github.com/python/cpython/commit/3e3d57d8490b729a80c8cd9e90475dec122dfe9e commit: 3e3d57d8490b729a80c8cd9e90475dec122dfe9e branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-10T08:36:50-08:00 summary: bpo-35470: Fix a reference counting bug in _PyImport_FindExtensionObjectEx(). (GH-11128) (cherry picked from commit 89c4f90df97f6039325e354167e8f507bf199fd9) Co-authored-by: Zackery Spytz files: M Python/import.c diff --git a/Python/import.c b/Python/import.c index 5f5d135c7a2e..ccdd59930505 100644 --- a/Python/import.c +++ b/Python/import.c @@ -743,7 +743,6 @@ _PyImport_FindExtensionObjectEx(PyObject *name, PyObject *filename, } if (_PyState_AddModule(mod, def) < 0) { PyMapping_DelItem(modules, name); - Py_DECREF(mod); return NULL; } if (Py_VerboseFlag) From webhook-mailer at python.org Thu Jan 10 11:56:42 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Thu, 10 Jan 2019 16:56:42 -0000 Subject: [Python-checkins] bpo-35702: Add new identifier time.CLOCK_UPTIME_RAW for macOS 10.12 (GH-11503) Message-ID: https://github.com/python/cpython/commit/572168a016ece1b7346695eb7289190c46f1ae55 commit: 572168a016ece1b7346695eb7289190c46f1ae55 branch: master author: Joannah Nanjekye <33177550+nanjekyejoannah at users.noreply.github.com> committer: Victor Stinner date: 2019-01-10T17:56:38+01:00 summary: bpo-35702: Add new identifier time.CLOCK_UPTIME_RAW for macOS 10.12 (GH-11503) files: A Misc/NEWS.d/next/Library/2019-01-10-14-03-12.bpo-35702._ct_0H.rst M Doc/library/time.rst M Doc/whatsnew/3.8.rst M Modules/timemodule.c diff --git a/Doc/library/time.rst b/Doc/library/time.rst index 0ffce475a368..892ed1343a19 100644 --- a/Doc/library/time.rst +++ b/Doc/library/time.rst @@ -815,9 +815,21 @@ These constants are used as parameters for :func:`clock_getres` and .. versionadded:: 3.7 +.. data:: CLOCK_UPTIME_RAW + + Clock that increments monotonically, tracking the time since an arbitrary + point, unaffected by frequency or time adjustments and not incremented while + the system is asleep. + + .. availability:: macOS 10.12 and newer. + + .. versionadded:: 3.8 + + The following constant is the only parameter that can be sent to :func:`clock_settime`. + .. 
data:: CLOCK_REALTIME System-wide real-time clock. Setting this clock requires appropriate diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index c592f00d2d9d..370ef4604834 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -234,6 +234,12 @@ Added method :meth:`~tkinter.Canvas.moveto` in the :class:`tkinter.Canvas` class. (Contributed by Juliette Monsel in :issue:`23831`.) +time +---- + +Added new clock :data:`~time.CLOCK_UPTIME_RAW` for macOS 10.12. +(Contributed by Joannah Nanjekye in :issue:`35702`.) + unicodedata ----------- diff --git a/Misc/NEWS.d/next/Library/2019-01-10-14-03-12.bpo-35702._ct_0H.rst b/Misc/NEWS.d/next/Library/2019-01-10-14-03-12.bpo-35702._ct_0H.rst new file mode 100644 index 000000000000..f97f3d4abb71 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-10-14-03-12.bpo-35702._ct_0H.rst @@ -0,0 +1 @@ +The :data:`time.CLOCK_UPTIME_RAW` constant is now available for macOS 10.12. \ No newline at end of file diff --git a/Modules/timemodule.c b/Modules/timemodule.c index fa0f1982c65e..2e0f08d24a66 100644 --- a/Modules/timemodule.c +++ b/Modules/timemodule.c @@ -1806,6 +1806,9 @@ PyInit_time(void) #ifdef CLOCK_UPTIME PyModule_AddIntMacro(m, CLOCK_UPTIME); #endif +#ifdef CLOCK_UPTIME_RAW + PyModule_AddIntMacro(m, CLOCK_UPTIME_RAW); +#endif #endif /* defined(HAVE_CLOCK_GETTIME) || defined(HAVE_CLOCK_SETTIME) || defined(HAVE_CLOCK_GETRES) */ From webhook-mailer at python.org Thu Jan 10 12:51:31 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Thu, 10 Jan 2019 17:51:31 -0000 Subject: [Python-checkins] bpo-32146: Add documentation about frozen executables on Unix (GH-5850) Message-ID: https://github.com/python/cpython/commit/bab4bbb4c9cd5d25ede21a1b8c99d56e3b8dae9d commit: bab4bbb4c9cd5d25ede21a1b8c99d56e3b8dae9d branch: master author: Bo Bayles committer: Victor Stinner date: 2019-01-10T18:51:28+01:00 summary: bpo-32146: Add documentation about frozen executables on Unix (GH-5850) files: A Misc/NEWS.d/next/Library/2018-02-25-10-17-23.bpo-32146.xOzUFW.rst M Doc/library/multiprocessing.rst diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index 1d0920aa608b..a77815918d2b 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -186,6 +186,13 @@ A library which wants to use a particular start method should probably use :func:`get_context` to avoid interfering with the choice of the library user. +.. warning:: + + The ``'spawn'`` and ``'forkserver'`` start methods cannot currently + be used with "frozen" executables (i.e., binaries produced by + packages like **PyInstaller** and **cx_Freeze**) on Unix. + The ``'fork'`` start method does work. + Exchanging objects between processes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ diff --git a/Misc/NEWS.d/next/Library/2018-02-25-10-17-23.bpo-32146.xOzUFW.rst b/Misc/NEWS.d/next/Library/2018-02-25-10-17-23.bpo-32146.xOzUFW.rst new file mode 100644 index 000000000000..f071c7101806 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-02-25-10-17-23.bpo-32146.xOzUFW.rst @@ -0,0 +1,2 @@ +Document the interaction between frozen executables and the spawn and +forkserver start methods in multiprocessing. 
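The multiprocessing warning added above (GH-5850) can be turned into a concrete pattern: a program that will be shipped as a frozen Unix binary can pin the working start method explicitly. This is a minimal sketch under that assumption, using only documented multiprocessing APIs; the worker function and pool size are arbitrary.

    import multiprocessing as mp

    def work(x):
        return x * x

    if __name__ == '__main__':
        # 'spawn' and 'forkserver' have to start a fresh interpreter and
        # re-import the main module, which frozen executables on Unix cannot
        # currently support (see the doc warning above); 'fork' avoids that.
        ctx = mp.get_context('fork')
        with ctx.Pool(2) as pool:
            print(pool.map(work, range(4)))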
From webhook-mailer at python.org Thu Jan 10 13:13:26 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 10 Jan 2019 18:13:26 -0000 Subject: [Python-checkins] bpo-32146: Add documentation about frozen executables on Unix (GH-5850) Message-ID: https://github.com/python/cpython/commit/b9cd38f928f7eb4e18ad4b63e5c49c05c626c33e commit: b9cd38f928f7eb4e18ad4b63e5c49c05c626c33e branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-10T10:13:21-08:00 summary: bpo-32146: Add documentation about frozen executables on Unix (GH-5850) (cherry picked from commit bab4bbb4c9cd5d25ede21a1b8c99d56e3b8dae9d) Co-authored-by: Bo Bayles files: A Misc/NEWS.d/next/Library/2018-02-25-10-17-23.bpo-32146.xOzUFW.rst M Doc/library/multiprocessing.rst diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index 504f3a1c3c33..c50625dda320 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -186,6 +186,13 @@ A library which wants to use a particular start method should probably use :func:`get_context` to avoid interfering with the choice of the library user. +.. warning:: + + The ``'spawn'`` and ``'forkserver'`` start methods cannot currently + be used with "frozen" executables (i.e., binaries produced by + packages like **PyInstaller** and **cx_Freeze**) on Unix. + The ``'fork'`` start method does work. + Exchanging objects between processes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ diff --git a/Misc/NEWS.d/next/Library/2018-02-25-10-17-23.bpo-32146.xOzUFW.rst b/Misc/NEWS.d/next/Library/2018-02-25-10-17-23.bpo-32146.xOzUFW.rst new file mode 100644 index 000000000000..f071c7101806 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-02-25-10-17-23.bpo-32146.xOzUFW.rst @@ -0,0 +1,2 @@ +Document the interaction between frozen executables and the spawn and +forkserver start methods in multiprocessing. From webhook-mailer at python.org Thu Jan 10 13:56:14 2019 From: webhook-mailer at python.org (Ned Deily) Date: Thu, 10 Jan 2019 18:56:14 -0000 Subject: [Python-checkins] Revert "bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (GH-10639) (GH-11477)" (GH-11509) Message-ID: https://github.com/python/cpython/commit/d09e8cecf214b1de457feae01860f5592f912a8e commit: d09e8cecf214b1de457feae01860f5592f912a8e branch: 3.6 author: Senthil Kumaran committer: Ned Deily date: 2019-01-10T13:56:02-05:00 summary: Revert "bpo-24746: Avoid stripping trailing whitespace in doctest fancy diff (GH-10639) (GH-11477)" (GH-11509) This reverts commit 5d9ae8b9df8371dd65514e0d60b561fd37056986 which was merged to 3.6 in error. files: D Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst M Lib/doctest.py M Lib/test/test_doctest.py diff --git a/Lib/doctest.py b/Lib/doctest.py index ca5be5927594..0b78544d8d0e 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -1691,6 +1691,8 @@ def output_difference(self, example, got, optionflags): kind = 'ndiff with -expected +actual' else: assert 0, 'Bad diff option' + # Remove trailing whitespace on diff output. 
+ diff = [line.rstrip() + '\n' for line in diff] return 'Differences (%s):\n' % kind + _indent(''.join(diff)) # If we're not using diff, then simply list the expected diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index ad527626d1b0..2258c6b1baed 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -2431,11 +2431,6 @@ def test_unittest_reportflags(): Then the default eporting options are ignored: >>> result = suite.run(unittest.TestResult()) - """ - """ - *NOTE*: These doctest are intentionally not placed in raw string to depict - the trailing whitespace using `\x20` in the diff below. - >>> print(result.failures[0][1]) # doctest: +ELLIPSIS Traceback ... Failed example: @@ -2449,7 +2444,7 @@ def test_unittest_reportflags(): Differences (ndiff with -expected +actual): a - - +\x20 + + b @@ -2938,47 +2933,6 @@ def test_CLI(): r""" """ -def test_no_trailing_whitespace_stripping(): - r""" - The fancy reports had a bug for a long time where any trailing whitespace on - the reported diff lines was stripped, making it impossible to see the - differences in line reported as different that differed only in the amount of - trailing whitespace. The whitespace still isn't particularly visible unless - you use NDIFF, but at least it is now there to be found. - - *NOTE*: This snippet was intentionally put inside a raw string to get rid of - leading whitespace error in executing the example below - - >>> def f(x): - ... r''' - ... >>> print('\n'.join(['a ', 'b'])) - ... a - ... b - ... ''' - """ - """ - *NOTE*: These doctest are not placed in raw string to depict the trailing whitespace - using `\x20` - - >>> test = doctest.DocTestFinder().find(f)[0] - >>> flags = doctest.REPORT_NDIFF - >>> doctest.DocTestRunner(verbose=False, optionflags=flags).run(test) - ... # doctest: +ELLIPSIS - ********************************************************************** - File ..., line 3, in f - Failed example: - print('\n'.join(['a ', 'b'])) - Differences (ndiff with -expected +actual): - - a - + a - b - TestResults(failed=1, attempted=1) - - *NOTE*: `\x20` is for checking the trailing whitespace on the +a line above. - We cannot use actual spaces there, as a commit hook prevents from committing - patches that contain trailing whitespace. More info on Issue 24746. - """ - ###################################################################### ## Main ###################################################################### diff --git a/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst b/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst deleted file mode 100644 index c592516d1466..000000000000 --- a/Misc/NEWS.d/next/Library/2018-11-22-15-22-56.bpo-24746.eSLKBE.rst +++ /dev/null @@ -1,2 +0,0 @@ -Avoid stripping trailing whitespace in doctest fancy diff. Orignial patch by -R. David Murray & Jairo Trad. Enhanced by Sanyam Khurana. 
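For context on the trailing-whitespace behaviour being restored on 3.6 above, here is a small standalone demonstration that uses difflib directly rather than doctest's OutputChecker (an assumption made for brevity): once the diff lines are rstripped, the expected and actual lines print identically and the whitespace-only difference can no longer be seen in the report.

    import difflib

    expected = ["a \n", "b\n"]   # note the trailing space after 'a'
    actual   = ["a\n", "b\n"]

    diff = list(difflib.ndiff(expected, actual))
    print(''.join(diff), end='')   # the trailing-space difference is still marked here

    # What a fancy doctest report shows once trailing whitespace is stripped:
    print(''.join(line.rstrip() + '\n' for line in diff), end='')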
From webhook-mailer at python.org Thu Jan 10 15:55:17 2019 From: webhook-mailer at python.org (Senthil Kumaran) Date: Thu, 10 Jan 2019 20:55:17 -0000 Subject: [Python-checkins] bpo-24746: Fix doctest failures when running the testsuite with -R (#11501) (#11512) Message-ID: https://github.com/python/cpython/commit/0167c08163f44f4a033497102244bbb6150f606b commit: 0167c08163f44f4a033497102244bbb6150f606b branch: 2.7 author: Senthil Kumaran committer: GitHub date: 2019-01-10T12:55:08-08:00 summary: bpo-24746: Fix doctest failures when running the testsuite with -R (#11501) (#11512) files: M Lib/test/test_doctest.py diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index 88f4b6127938..a7a684524664 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -2355,8 +2355,7 @@ def test_unittest_reportflags(): Then the default eporting options are ignored: >>> result = suite.run(unittest.TestResult()) - """ - """ + *NOTE*: These doctest are intentionally not placed in raw string to depict the trailing whitespace using `\x20` in the diff below. From solipsis at pitrou.net Fri Jan 11 04:09:05 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 11 Jan 2019 09:09:05 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=8 Message-ID: <20190111090905.1.374F1B4BDCA4D53B@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [2, -1, 1] memory blocks, sum=2 test_multiprocessing_forkserver leaked [1, -1, 2] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflog7ZWYOd', '--timeout', '7200'] From webhook-mailer at python.org Fri Jan 11 08:20:01 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Fri, 11 Jan 2019 13:20:01 -0000 Subject: [Python-checkins] bpo-35716: Update time.CLOCK_MONOTONIC_RAW doc (GH-11517) Message-ID: https://github.com/python/cpython/commit/fd7d539be3ce1cc098a4f104b7a7816ca00add16 commit: fd7d539be3ce1cc098a4f104b7a7816ca00add16 branch: master author: Joannah Nanjekye <33177550+nanjekyejoannah at users.noreply.github.com> committer: Victor Stinner date: 2019-01-11T14:19:57+01:00 summary: bpo-35716: Update time.CLOCK_MONOTONIC_RAW doc (GH-11517) Document that the time.CLOCK_MONOTONIC_RAW constant is now also available on macOS 10.12. Co-authored-by: Ricardo Fraile files: M Doc/library/time.rst diff --git a/Doc/library/time.rst b/Doc/library/time.rst index 892ed1343a19..baf92c1400ee 100644 --- a/Doc/library/time.rst +++ b/Doc/library/time.rst @@ -772,7 +772,7 @@ These constants are used as parameters for :func:`clock_getres` and Similar to :data:`CLOCK_MONOTONIC`, but provides access to a raw hardware-based time that is not subject to NTP adjustments. - Availability: Linux 2.6.28 or later. + .. availability:: Linux 2.6.28 and newer, macOS 10.12 and newer. .. versionadded:: 3.3 @@ -799,7 +799,7 @@ These constants are used as parameters for :func:`clock_getres` and Thread-specific CPU-time clock. - Availability: Unix. + .. availability:: Unix. .. 
versionadded:: 3.3 From webhook-mailer at python.org Fri Jan 11 08:32:15 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 11 Jan 2019 13:32:15 -0000 Subject: [Python-checkins] bpo-35716: Update time.CLOCK_MONOTONIC_RAW doc (GH-11517) Message-ID: https://github.com/python/cpython/commit/8a5b1aa98f97923c39734b508aead152a5a1c911 commit: 8a5b1aa98f97923c39734b508aead152a5a1c911 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-11T05:32:11-08:00 summary: bpo-35716: Update time.CLOCK_MONOTONIC_RAW doc (GH-11517) Document that the time.CLOCK_MONOTONIC_RAW constant is now also available on macOS 10.12. Co-authored-by: Ricardo Fraile (cherry picked from commit fd7d539be3ce1cc098a4f104b7a7816ca00add16) Co-authored-by: Joannah Nanjekye <33177550+nanjekyejoannah at users.noreply.github.com> files: M Doc/library/time.rst diff --git a/Doc/library/time.rst b/Doc/library/time.rst index 0ffce475a368..8c6813bb7361 100644 --- a/Doc/library/time.rst +++ b/Doc/library/time.rst @@ -772,7 +772,7 @@ These constants are used as parameters for :func:`clock_getres` and Similar to :data:`CLOCK_MONOTONIC`, but provides access to a raw hardware-based time that is not subject to NTP adjustments. - Availability: Linux 2.6.28 or later. + .. availability:: Linux 2.6.28 and newer, macOS 10.12 and newer. .. versionadded:: 3.3 @@ -799,7 +799,7 @@ These constants are used as parameters for :func:`clock_getres` and Thread-specific CPU-time clock. - Availability: Unix. + .. availability:: Unix. .. versionadded:: 3.3 From webhook-mailer at python.org Fri Jan 11 08:35:19 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Fri, 11 Jan 2019 13:35:19 -0000 Subject: [Python-checkins] bpo-32710: Fix _overlapped.Overlapped memory leaks (GH-11489) Message-ID: https://github.com/python/cpython/commit/5485085b324a45307c1ff4ec7d85b5998d7d5e0d commit: 5485085b324a45307c1ff4ec7d85b5998d7d5e0d branch: master author: Victor Stinner committer: GitHub date: 2019-01-11T14:35:14+01:00 summary: bpo-32710: Fix _overlapped.Overlapped memory leaks (GH-11489) Fix memory leaks in asyncio ProactorEventLoop on overlapped operation failures. Changes: * Implement the tp_traverse slot in the _overlapped.Overlapped type to help to break reference cycles and identify referrers in the garbage collector. * Always clear overlapped on failure: not only set type to TYPE_NOT_STARTED, but release also resources. files: A Misc/NEWS.d/next/Library/2019-01-10-15-55-10.bpo-32710.KwECPu.rst M Modules/overlapped.c diff --git a/Misc/NEWS.d/next/Library/2019-01-10-15-55-10.bpo-32710.KwECPu.rst b/Misc/NEWS.d/next/Library/2019-01-10-15-55-10.bpo-32710.KwECPu.rst new file mode 100644 index 000000000000..9f7a95a0aaff --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-10-15-55-10.bpo-32710.KwECPu.rst @@ -0,0 +1,2 @@ +Fix memory leaks in asyncio ProactorEventLoop on overlapped operation +failure. diff --git a/Modules/overlapped.c b/Modules/overlapped.c index bbaa4fb3008f..ef4390b82548 100644 --- a/Modules/overlapped.c +++ b/Modules/overlapped.c @@ -561,6 +561,28 @@ Overlapped_new(PyTypeObject *type, PyObject *args, PyObject *kwds) return (PyObject *)self; } + +/* Note (bpo-32710): OverlappedType.tp_clear is not defined to not release + buffers while overlapped are still running, to prevent a crash. 
*/ +static int +Overlapped_clear(OverlappedObject *self) +{ + switch (self->type) { + case TYPE_READ: + case TYPE_ACCEPT: + Py_CLEAR(self->allocated_buffer); + break; + case TYPE_WRITE: + case TYPE_READINTO: + if (self->user_buffer.obj) { + PyBuffer_Release(&self->user_buffer); + } + break; + } + self->type = TYPE_NOT_STARTED; + return 0; +} + static void Overlapped_dealloc(OverlappedObject *self) { @@ -594,20 +616,11 @@ Overlapped_dealloc(OverlappedObject *self) } } - if (self->overlapped.hEvent != NULL) + if (self->overlapped.hEvent != NULL) { CloseHandle(self->overlapped.hEvent); - - switch (self->type) { - case TYPE_READ: - case TYPE_ACCEPT: - Py_CLEAR(self->allocated_buffer); - break; - case TYPE_WRITE: - case TYPE_READINTO: - if (self->user_buffer.obj) - PyBuffer_Release(&self->user_buffer); - break; } + + Overlapped_clear(self); PyObject_Del(self); SetLastError(olderr); } @@ -723,8 +736,7 @@ do_ReadFile(OverlappedObject *self, HANDLE handle, case ERROR_IO_PENDING: Py_RETURN_NONE; default: - PyBuffer_Release(&self->user_buffer); - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -827,7 +839,7 @@ do_WSARecv(OverlappedObject *self, HANDLE handle, case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -955,7 +967,7 @@ Overlapped_WriteFile(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1012,8 +1024,7 @@ Overlapped_WSASend(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - PyBuffer_Release(&self->user_buffer); - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1063,7 +1074,7 @@ Overlapped_AcceptEx(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1155,7 +1166,7 @@ Overlapped_ConnectEx(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1194,7 +1205,7 @@ Overlapped_DisconnectEx(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1249,7 +1260,7 @@ Overlapped_TransmitFile(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1290,7 +1301,7 @@ Overlapped_ConnectNamedPipe(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_FALSE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1340,6 +1351,25 @@ Overlapped_getpending(OverlappedObject *self) self->type != TYPE_NOT_STARTED); } +static int +Overlapped_traverse(OverlappedObject *self, visitproc visit, void *arg) +{ + switch (self->type) { + case TYPE_READ: + case TYPE_ACCEPT: + Py_VISIT(self->allocated_buffer); + break; + case TYPE_WRITE: + case TYPE_READINTO: + if (self->user_buffer.obj) { + Py_VISIT(&self->user_buffer.obj); + } + break; + } + return 0; +} + + static PyMethodDef Overlapped_methods[] = { {"getresult", (PyCFunction) Overlapped_getresult, METH_VARARGS, 
Overlapped_getresult_doc}, @@ -1410,7 +1440,7 @@ PyTypeObject OverlappedType = { /* tp_as_buffer */ 0, /* tp_flags */ Py_TPFLAGS_DEFAULT, /* tp_doc */ "OVERLAPPED structure wrapper", - /* tp_traverse */ 0, + /* tp_traverse */ (traverseproc)Overlapped_traverse, /* tp_clear */ 0, /* tp_richcompare */ 0, /* tp_weaklistoffset */ 0, From webhook-mailer at python.org Fri Jan 11 09:01:20 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Fri, 11 Jan 2019 14:01:20 -0000 Subject: [Python-checkins] bpo-35582: Argument Clinic: inline parsing code for positional parameters. (GH-11313) Message-ID: https://github.com/python/cpython/commit/4fa9591025b6a098f3d6402e5413ee6740ede6c5 commit: 4fa9591025b6a098f3d6402e5413ee6740ede6c5 branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-11T16:01:14+02:00 summary: bpo-35582: Argument Clinic: inline parsing code for positional parameters. (GH-11313) files: M Include/modsupport.h M Lib/test/clinic.test M Modules/_io/clinic/bufferedio.c.h M Modules/_io/clinic/bytesio.c.h M Modules/_io/clinic/fileio.c.h M Modules/_io/clinic/iobase.c.h M Modules/_io/clinic/stringio.c.h M Modules/_io/clinic/textio.c.h M Modules/_io/clinic/winconsoleio.c.h M Modules/cjkcodecs/clinic/multibytecodec.c.h M Modules/clinic/_bz2module.c.h M Modules/clinic/_codecsmodule.c.h M Modules/clinic/_collectionsmodule.c.h M Modules/clinic/_cryptmodule.c.h M Modules/clinic/_curses_panel.c.h M Modules/clinic/_cursesmodule.c.h M Modules/clinic/_dbmmodule.c.h M Modules/clinic/_elementtree.c.h M Modules/clinic/_gdbmmodule.c.h M Modules/clinic/_lzmamodule.c.h M Modules/clinic/_operator.c.h M Modules/clinic/_ssl.c.h M Modules/clinic/_struct.c.h M Modules/clinic/_tkinter.c.h M Modules/clinic/_tracemalloc.c.h M Modules/clinic/_weakref.c.h M Modules/clinic/arraymodule.c.h M Modules/clinic/audioop.c.h M Modules/clinic/binascii.c.h M Modules/clinic/cmathmodule.c.h M Modules/clinic/fcntlmodule.c.h M Modules/clinic/itertoolsmodule.c.h M Modules/clinic/mathmodule.c.h M Modules/clinic/posixmodule.c.h M Modules/clinic/pwdmodule.c.h M Modules/clinic/pyexpat.c.h M Modules/clinic/resource.c.h M Modules/clinic/selectmodule.c.h M Modules/clinic/signalmodule.c.h M Modules/clinic/spwdmodule.c.h M Modules/clinic/symtablemodule.c.h M Modules/clinic/unicodedata.c.h M Modules/clinic/zlibmodule.c.h M Objects/clinic/bytearrayobject.c.h M Objects/clinic/bytesobject.c.h M Objects/clinic/floatobject.c.h M Objects/clinic/listobject.c.h M Objects/clinic/longobject.c.h M Objects/clinic/tupleobject.c.h M Objects/clinic/typeobject.c.h M Objects/clinic/unicodeobject.c.h M Objects/stringlib/clinic/transmogrify.h.h M PC/clinic/msvcrtmodule.c.h M PC/clinic/winreg.c.h M Python/clinic/bltinmodule.c.h M Python/clinic/import.c.h M Python/clinic/marshal.c.h M Python/clinic/sysmodule.c.h M Python/getargs.c M Tools/clinic/clinic.py diff --git a/Include/modsupport.h b/Include/modsupport.h index ed24b2b311af..f17060c2e1e2 100644 --- a/Include/modsupport.h +++ b/Include/modsupport.h @@ -66,7 +66,12 @@ PyAPI_FUNC(int) _PyArg_NoPositional(const char *funcname, PyObject *args); #define _PyArg_NoPositional(funcname, args) \ ((args) == NULL || _PyArg_NoPositional((funcname), (args))) -PyAPI_FUNC(void) _PyArg_BadArgument(const char *, const char *, PyObject *); +PyAPI_FUNC(void) _PyArg_BadArgument(const char *, int, const char *, PyObject *); +PyAPI_FUNC(int) _PyArg_CheckPositional(const char *, Py_ssize_t, + Py_ssize_t, Py_ssize_t); +#define _PyArg_CheckPositional(funcname, nargs, min, max) \ + (((min) <= (nargs) && (nargs) 
<= (max)) \ + || _PyArg_CheckPositional((funcname), (nargs), (min), (max))) #endif diff --git a/Lib/test/clinic.test b/Lib/test/clinic.test index 03571a86b7d5..7ae8f9642e8a 100644 --- a/Lib/test/clinic.test +++ b/Lib/test/clinic.test @@ -35,10 +35,19 @@ test_object_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *c; PyUnicode_Object *d; - if (!_PyArg_ParseStack(args, nargs, "OO&O!O:test_object_converter", - &a, PyUnicode_FSConverter, &b, &PyUnicode_Type, &c, &d)) { + if (!_PyArg_CheckPositional("test_object_converter", nargs, 4, 4)) { goto exit; } + a = args[0]; + if (!PyUnicode_FSConverter(args[1], &b)) { + goto exit; + } + if (!PyUnicode_Check(args[2])) { + _PyArg_BadArgument("test_object_converter", 3, "str", args[2]); + goto exit; + } + c = args[2]; + d = (PyUnicode_Object *)args[3]; return_value = test_object_converter_impl(module, a, b, c, d); exit: @@ -48,7 +57,7 @@ exit: static PyObject * test_object_converter_impl(PyObject *module, PyObject *a, PyObject *b, PyObject *c, PyUnicode_Object *d) -/*[clinic end generated code: output=ac25517226de46c8 input=005e6a8a711a869b]*/ +/*[clinic end generated code: output=f2c26174b3d46e94 input=005e6a8a711a869b]*/ /*[clinic input] test_object_converter_one_arg @@ -159,10 +168,59 @@ test_object_converter_subclass_of(PyObject *module, PyObject *const *args, Py_ss PyObject *i; PyObject *j; - if (!_PyArg_ParseStack(args, nargs, "O!O!O!O!O!O!O!O!O!O!:test_object_converter_subclass_of", - &PyLong_Type, &a, &PyTuple_Type, &b, &PyList_Type, &c, &PySet_Type, &d, &PyFrozenSet_Type, &e, &PyDict_Type, &f, &PyUnicode_Type, &g, &PyBytes_Type, &h, &PyByteArray_Type, &i, &MyType, &j)) { + if (!_PyArg_CheckPositional("test_object_converter_subclass_of", nargs, 10, 10)) { + goto exit; + } + if (!PyLong_Check(args[0])) { + _PyArg_BadArgument("test_object_converter_subclass_of", 1, "int", args[0]); + goto exit; + } + a = args[0]; + if (!PyTuple_Check(args[1])) { + _PyArg_BadArgument("test_object_converter_subclass_of", 2, "tuple", args[1]); + goto exit; + } + b = args[1]; + if (!PyList_Check(args[2])) { + _PyArg_BadArgument("test_object_converter_subclass_of", 3, "list", args[2]); + goto exit; + } + c = args[2]; + if (!PySet_Check(args[3])) { + _PyArg_BadArgument("test_object_converter_subclass_of", 4, "set", args[3]); + goto exit; + } + d = args[3]; + if (!PyFrozenSet_Check(args[4])) { + _PyArg_BadArgument("test_object_converter_subclass_of", 5, "frozenset", args[4]); goto exit; } + e = args[4]; + if (!PyDict_Check(args[5])) { + _PyArg_BadArgument("test_object_converter_subclass_of", 6, "dict", args[5]); + goto exit; + } + f = args[5]; + if (!PyUnicode_Check(args[6])) { + _PyArg_BadArgument("test_object_converter_subclass_of", 7, "str", args[6]); + goto exit; + } + g = args[6]; + if (!PyBytes_Check(args[7])) { + _PyArg_BadArgument("test_object_converter_subclass_of", 8, "bytes", args[7]); + goto exit; + } + h = args[7]; + if (!PyByteArray_Check(args[8])) { + _PyArg_BadArgument("test_object_converter_subclass_of", 9, "bytearray", args[8]); + goto exit; + } + i = args[8]; + if (!PyObject_TypeCheck(args[9], &MyType)) { + _PyArg_BadArgument("test_object_converter_subclass_of", 10, (&MyType)->tp_name, args[9]); + goto exit; + } + j = args[9]; return_value = test_object_converter_subclass_of_impl(module, a, b, c, d, e, f, g, h, i, j); exit: @@ -174,7 +232,7 @@ test_object_converter_subclass_of_impl(PyObject *module, PyObject *a, PyObject *b, PyObject *c, PyObject *d, PyObject *e, PyObject *f, PyObject *g, PyObject *h, PyObject *i, PyObject 
*j) -/*[clinic end generated code: output=b26bcdc31dfc7872 input=31b06b772d5f983e]*/ +/*[clinic end generated code: output=99691bda8eeda6d6 input=31b06b772d5f983e]*/ /*[clinic input] test_PyBytesObject_converter @@ -202,7 +260,7 @@ test_PyBytesObject_converter(PyObject *module, PyObject *arg) PyBytesObject *a; if (!PyBytes_Check(arg)) { - _PyArg_BadArgument("test_PyBytesObject_converter", "bytes", arg); + _PyArg_BadArgument("test_PyBytesObject_converter", 0, "bytes", arg); goto exit; } a = (PyBytesObject *)arg; @@ -214,7 +272,7 @@ exit: static PyObject * test_PyBytesObject_converter_impl(PyObject *module, PyBytesObject *a) -/*[clinic end generated code: output=fd69d6df4d26c853 input=12b10c7cb5750400]*/ +/*[clinic end generated code: output=5d9a301c1df24eb5 input=12b10c7cb5750400]*/ /*[clinic input] test_PyByteArrayObject_converter @@ -242,7 +300,7 @@ test_PyByteArrayObject_converter(PyObject *module, PyObject *arg) PyByteArrayObject *a; if (!PyByteArray_Check(arg)) { - _PyArg_BadArgument("test_PyByteArrayObject_converter", "bytearray", arg); + _PyArg_BadArgument("test_PyByteArrayObject_converter", 0, "bytearray", arg); goto exit; } a = (PyByteArrayObject *)arg; @@ -254,7 +312,7 @@ exit: static PyObject * test_PyByteArrayObject_converter_impl(PyObject *module, PyByteArrayObject *a) -/*[clinic end generated code: output=d309c909182c4183 input=5a657da535d194ae]*/ +/*[clinic end generated code: output=9455d06f4f09637b input=5a657da535d194ae]*/ /*[clinic input] test_unicode_converter @@ -282,7 +340,7 @@ test_unicode_converter(PyObject *module, PyObject *arg) PyObject *a; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("test_unicode_converter", "str", arg); + _PyArg_BadArgument("test_unicode_converter", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -297,7 +355,7 @@ exit: static PyObject * test_unicode_converter_impl(PyObject *module, PyObject *a) -/*[clinic end generated code: output=ca603454e1f8f764 input=aa33612df92aa9c5]*/ +/*[clinic end generated code: output=9275c04fe204f4c5 input=aa33612df92aa9c5]*/ /*[clinic input] test_bool_converter @@ -328,10 +386,36 @@ test_bool_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int b = 1; int c = 1; - if (!_PyArg_ParseStack(args, nargs, "|ppi:test_bool_converter", - &a, &b, &c)) { + if (!_PyArg_CheckPositional("test_bool_converter", nargs, 0, 3)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + a = PyObject_IsTrue(args[0]); + if (a < 0) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + b = PyObject_IsTrue(args[1]); + if (b < 0) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + c = _PyLong_AsInt(args[2]); + if (c == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = test_bool_converter_impl(module, a, b, c); exit: @@ -340,7 +424,7 @@ exit: static PyObject * test_bool_converter_impl(PyObject *module, int a, int b, int c) -/*[clinic end generated code: output=f08a22d3390ab6f7 input=939854fa9f248c60]*/ +/*[clinic end generated code: output=25f20963894256a1 input=939854fa9f248c60]*/ /*[clinic input] test_char_converter @@ -397,10 +481,192 @@ test_char_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) char m = '\x00'; char n = '\xff'; - if (!_PyArg_ParseStack(args, nargs, "|cccccccccccccc:test_char_converter", - &a, &b, &c, &d, &e, &f, &g, &h, &i, &j, &k, &l, &m, &n)) { + if 
(!_PyArg_CheckPositional("test_char_converter", nargs, 0, 14)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyBytes_Check(args[0]) && PyBytes_GET_SIZE(args[0]) == 1) { + a = PyBytes_AS_STRING(args[0])[0]; + } + else if (PyByteArray_Check(args[0]) && PyByteArray_GET_SIZE(args[0]) == 1) { + a = PyByteArray_AS_STRING(args[0])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 1, "a byte string of length 1", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyBytes_Check(args[1]) && PyBytes_GET_SIZE(args[1]) == 1) { + b = PyBytes_AS_STRING(args[1])[0]; + } + else if (PyByteArray_Check(args[1]) && PyByteArray_GET_SIZE(args[1]) == 1) { + b = PyByteArray_AS_STRING(args[1])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 2, "a byte string of length 1", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyBytes_Check(args[2]) && PyBytes_GET_SIZE(args[2]) == 1) { + c = PyBytes_AS_STRING(args[2])[0]; + } + else if (PyByteArray_Check(args[2]) && PyByteArray_GET_SIZE(args[2]) == 1) { + c = PyByteArray_AS_STRING(args[2])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 3, "a byte string of length 1", args[2]); + goto exit; + } + if (nargs < 4) { + goto skip_optional; + } + if (PyBytes_Check(args[3]) && PyBytes_GET_SIZE(args[3]) == 1) { + d = PyBytes_AS_STRING(args[3])[0]; + } + else if (PyByteArray_Check(args[3]) && PyByteArray_GET_SIZE(args[3]) == 1) { + d = PyByteArray_AS_STRING(args[3])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 4, "a byte string of length 1", args[3]); + goto exit; + } + if (nargs < 5) { + goto skip_optional; + } + if (PyBytes_Check(args[4]) && PyBytes_GET_SIZE(args[4]) == 1) { + e = PyBytes_AS_STRING(args[4])[0]; + } + else if (PyByteArray_Check(args[4]) && PyByteArray_GET_SIZE(args[4]) == 1) { + e = PyByteArray_AS_STRING(args[4])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 5, "a byte string of length 1", args[4]); + goto exit; + } + if (nargs < 6) { + goto skip_optional; + } + if (PyBytes_Check(args[5]) && PyBytes_GET_SIZE(args[5]) == 1) { + f = PyBytes_AS_STRING(args[5])[0]; + } + else if (PyByteArray_Check(args[5]) && PyByteArray_GET_SIZE(args[5]) == 1) { + f = PyByteArray_AS_STRING(args[5])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 6, "a byte string of length 1", args[5]); + goto exit; + } + if (nargs < 7) { + goto skip_optional; + } + if (PyBytes_Check(args[6]) && PyBytes_GET_SIZE(args[6]) == 1) { + g = PyBytes_AS_STRING(args[6])[0]; + } + else if (PyByteArray_Check(args[6]) && PyByteArray_GET_SIZE(args[6]) == 1) { + g = PyByteArray_AS_STRING(args[6])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 7, "a byte string of length 1", args[6]); + goto exit; + } + if (nargs < 8) { + goto skip_optional; + } + if (PyBytes_Check(args[7]) && PyBytes_GET_SIZE(args[7]) == 1) { + h = PyBytes_AS_STRING(args[7])[0]; + } + else if (PyByteArray_Check(args[7]) && PyByteArray_GET_SIZE(args[7]) == 1) { + h = PyByteArray_AS_STRING(args[7])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 8, "a byte string of length 1", args[7]); + goto exit; + } + if (nargs < 9) { + goto skip_optional; + } + if (PyBytes_Check(args[8]) && PyBytes_GET_SIZE(args[8]) == 1) { + i = PyBytes_AS_STRING(args[8])[0]; + } + else if (PyByteArray_Check(args[8]) && PyByteArray_GET_SIZE(args[8]) == 1) { + i = PyByteArray_AS_STRING(args[8])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 9, "a byte string of length 1", 
args[8]); + goto exit; + } + if (nargs < 10) { + goto skip_optional; + } + if (PyBytes_Check(args[9]) && PyBytes_GET_SIZE(args[9]) == 1) { + j = PyBytes_AS_STRING(args[9])[0]; + } + else if (PyByteArray_Check(args[9]) && PyByteArray_GET_SIZE(args[9]) == 1) { + j = PyByteArray_AS_STRING(args[9])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 10, "a byte string of length 1", args[9]); + goto exit; + } + if (nargs < 11) { + goto skip_optional; + } + if (PyBytes_Check(args[10]) && PyBytes_GET_SIZE(args[10]) == 1) { + k = PyBytes_AS_STRING(args[10])[0]; + } + else if (PyByteArray_Check(args[10]) && PyByteArray_GET_SIZE(args[10]) == 1) { + k = PyByteArray_AS_STRING(args[10])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 11, "a byte string of length 1", args[10]); + goto exit; + } + if (nargs < 12) { + goto skip_optional; + } + if (PyBytes_Check(args[11]) && PyBytes_GET_SIZE(args[11]) == 1) { + l = PyBytes_AS_STRING(args[11])[0]; + } + else if (PyByteArray_Check(args[11]) && PyByteArray_GET_SIZE(args[11]) == 1) { + l = PyByteArray_AS_STRING(args[11])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 12, "a byte string of length 1", args[11]); + goto exit; + } + if (nargs < 13) { + goto skip_optional; + } + if (PyBytes_Check(args[12]) && PyBytes_GET_SIZE(args[12]) == 1) { + m = PyBytes_AS_STRING(args[12])[0]; + } + else if (PyByteArray_Check(args[12]) && PyByteArray_GET_SIZE(args[12]) == 1) { + m = PyByteArray_AS_STRING(args[12])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 13, "a byte string of length 1", args[12]); + goto exit; + } + if (nargs < 14) { + goto skip_optional; + } + if (PyBytes_Check(args[13]) && PyBytes_GET_SIZE(args[13]) == 1) { + n = PyBytes_AS_STRING(args[13])[0]; + } + else if (PyByteArray_Check(args[13]) && PyByteArray_GET_SIZE(args[13]) == 1) { + n = PyByteArray_AS_STRING(args[13])[0]; + } + else { + _PyArg_BadArgument("test_char_converter", 14, "a byte string of length 1", args[13]); goto exit; } +skip_optional: return_value = test_char_converter_impl(module, a, b, c, d, e, f, g, h, i, j, k, l, m, n); exit: @@ -411,7 +677,7 @@ static PyObject * test_char_converter_impl(PyObject *module, char a, char b, char c, char d, char e, char f, char g, char h, char i, char j, char k, char l, char m, char n) -/*[clinic end generated code: output=14c61e8ee78f3d47 input=e42330417a44feac]*/ +/*[clinic end generated code: output=e041d687555e0a5d input=e42330417a44feac]*/ /*[clinic input] test_unsigned_char_converter @@ -443,10 +709,81 @@ test_unsigned_char_converter(PyObject *module, PyObject *const *args, Py_ssize_t unsigned char b = 34; unsigned char c = 56; - if (!_PyArg_ParseStack(args, nargs, "|bbB:test_unsigned_char_converter", - &a, &b, &c)) { + if (!_PyArg_CheckPositional("test_unsigned_char_converter", nargs, 0, 3)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + long ival = PyLong_AsLong(args[0]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < 0) { + PyErr_SetString(PyExc_OverflowError, + "unsigned byte integer is less than minimum"); + goto exit; + } + else if (ival > UCHAR_MAX) { + PyErr_SetString(PyExc_OverflowError, + "unsigned byte integer is greater than maximum"); + goto exit; + } + else { + a = (unsigned char) ival; + } + } + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + 
"integer argument expected, got float" ); goto exit; } + { + long ival = PyLong_AsLong(args[1]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < 0) { + PyErr_SetString(PyExc_OverflowError, + "unsigned byte integer is less than minimum"); + goto exit; + } + else if (ival > UCHAR_MAX) { + PyErr_SetString(PyExc_OverflowError, + "unsigned byte integer is greater than maximum"); + goto exit; + } + else { + b = (unsigned char) ival; + } + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + long ival = PyLong_AsUnsignedLongMask(args[2]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else { + c = (unsigned char) ival; + } + } +skip_optional: return_value = test_unsigned_char_converter_impl(module, a, b, c); exit: @@ -456,7 +793,7 @@ exit: static PyObject * test_unsigned_char_converter_impl(PyObject *module, unsigned char a, unsigned char b, unsigned char c) -/*[clinic end generated code: output=a4fd7ad26077dcfc input=021414060993e289]*/ +/*[clinic end generated code: output=ebf905c5c9414762 input=021414060993e289]*/ /*[clinic input] test_short_converter @@ -483,10 +820,37 @@ test_short_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; short a = 12; - if (!_PyArg_ParseStack(args, nargs, "|h:test_short_converter", - &a)) { + if (!_PyArg_CheckPositional("test_short_converter", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + long ival = PyLong_AsLong(args[0]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < SHRT_MIN) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is less than minimum"); + goto exit; + } + else if (ival > SHRT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is greater than maximum"); + goto exit; + } + else { + a = (short) ival; + } + } +skip_optional: return_value = test_short_converter_impl(module, a); exit: @@ -495,7 +859,7 @@ exit: static PyObject * test_short_converter_impl(PyObject *module, short a) -/*[clinic end generated code: output=08259a92c12890b1 input=6a8a7a509a498ff4]*/ +/*[clinic end generated code: output=86fe1a1496a7ff20 input=6a8a7a509a498ff4]*/ /*[clinic input] test_unsigned_short_converter @@ -527,10 +891,34 @@ test_unsigned_short_converter(PyObject *module, PyObject *const *args, Py_ssize_ unsigned short b = 34; unsigned short c = 56; - if (!_PyArg_ParseStack(args, nargs, "|O&O&H:test_unsigned_short_converter", - _PyLong_UnsignedShort_Converter, &a, _PyLong_UnsignedShort_Converter, &b, &c)) { + if (!_PyArg_CheckPositional("test_unsigned_short_converter", nargs, 0, 3)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!_PyLong_UnsignedShort_Converter(args[0], &a)) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (!_PyLong_UnsignedShort_Converter(args[1], &b)) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + c = (unsigned short)PyLong_AsUnsignedLongMask(args[2]); + if (c == (unsigned short)-1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = test_unsigned_short_converter_impl(module, a, b, c); exit: @@ -540,7 +928,7 @@ exit: 
static PyObject * test_unsigned_short_converter_impl(PyObject *module, unsigned short a, unsigned short b, unsigned short c) -/*[clinic end generated code: output=d714bceb98c1fe84 input=cdfd8eff3d9176b4]*/ +/*[clinic end generated code: output=3779fe104319e3ae input=cdfd8eff3d9176b4]*/ /*[clinic input] test_int_converter @@ -573,10 +961,61 @@ test_int_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int c = 45; myenum d = 67; - if (!_PyArg_ParseStack(args, nargs, "|iiCi:test_int_converter", - &a, &b, &c, &d)) { + if (!_PyArg_CheckPositional("test_int_converter", nargs, 0, 4)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + a = _PyLong_AsInt(args[0]); + if (a == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + b = _PyLong_AsInt(args[1]); + if (b == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (!PyUnicode_Check(args[2])) { + _PyArg_BadArgument("test_int_converter", 3, "a unicode character", args[2]); + goto exit; + } + if (PyUnicode_READY(args[2])) { + goto exit; + } + if (PyUnicode_GET_LENGTH(args[2]) != 1) { + _PyArg_BadArgument("test_int_converter", 3, "a unicode character", args[2]); + goto exit; + } + c = PyUnicode_READ_CHAR(args[2], 0); + if (nargs < 4) { + goto skip_optional; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + d = _PyLong_AsInt(args[3]); + if (d == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = test_int_converter_impl(module, a, b, c, d); exit: @@ -585,7 +1024,7 @@ exit: static PyObject * test_int_converter_impl(PyObject *module, int a, int b, int c, myenum d) -/*[clinic end generated code: output=1995799d5264e4a5 input=d20541fc1ca0553e]*/ +/*[clinic end generated code: output=de74e24e85a669a5 input=d20541fc1ca0553e]*/ /*[clinic input] test_unsigned_int_converter @@ -617,10 +1056,34 @@ test_unsigned_int_converter(PyObject *module, PyObject *const *args, Py_ssize_t unsigned int b = 34; unsigned int c = 56; - if (!_PyArg_ParseStack(args, nargs, "|O&O&I:test_unsigned_int_converter", - _PyLong_UnsignedInt_Converter, &a, _PyLong_UnsignedInt_Converter, &b, &c)) { + if (!_PyArg_CheckPositional("test_unsigned_int_converter", nargs, 0, 3)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_PyLong_UnsignedInt_Converter(args[0], &a)) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (!_PyLong_UnsignedInt_Converter(args[1], &b)) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + c = (unsigned int)PyLong_AsUnsignedLongMask(args[2]); + if (c == (unsigned int)-1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = test_unsigned_int_converter_impl(module, a, b, c); exit: @@ -630,7 +1093,7 @@ exit: static PyObject * test_unsigned_int_converter_impl(PyObject *module, unsigned int a, unsigned int b, unsigned int c) -/*[clinic end generated code: output=3cdb7a3ef2b369a3 input=5533534828b62fc0]*/ +/*[clinic end generated code: output=189176ce67c7d2e7 input=5533534828b62fc0]*/ /*[clinic input] test_long_converter @@ 
-657,10 +1120,22 @@ test_long_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; long a = 12; - if (!_PyArg_ParseStack(args, nargs, "|l:test_long_converter", - &a)) { + if (!_PyArg_CheckPositional("test_long_converter", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + a = PyLong_AsLong(args[0]); + if (a == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = test_long_converter_impl(module, a); exit: @@ -669,7 +1144,7 @@ exit: static PyObject * test_long_converter_impl(PyObject *module, long a) -/*[clinic end generated code: output=65a8092a1dc10de6 input=d2179e3c9cdcde89]*/ +/*[clinic end generated code: output=44cd8823f59d116b input=d2179e3c9cdcde89]*/ /*[clinic input] test_unsigned_long_converter @@ -701,10 +1176,30 @@ test_unsigned_long_converter(PyObject *module, PyObject *const *args, Py_ssize_t unsigned long b = 34; unsigned long c = 56; - if (!_PyArg_ParseStack(args, nargs, "|O&O&k:test_unsigned_long_converter", - _PyLong_UnsignedLong_Converter, &a, _PyLong_UnsignedLong_Converter, &b, &c)) { + if (!_PyArg_CheckPositional("test_unsigned_long_converter", nargs, 0, 3)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!_PyLong_UnsignedLong_Converter(args[0], &a)) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (!_PyLong_UnsignedLong_Converter(args[1], &b)) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (!PyLong_Check(args[2])) { + _PyArg_BadArgument("test_unsigned_long_converter", 3, "int", args[2]); goto exit; } + c = PyLong_AsUnsignedLongMask(args[2]); +skip_optional: return_value = test_unsigned_long_converter_impl(module, a, b, c); exit: @@ -714,7 +1209,7 @@ exit: static PyObject * test_unsigned_long_converter_impl(PyObject *module, unsigned long a, unsigned long b, unsigned long c) -/*[clinic end generated code: output=81fd66d8981abeb2 input=f450d94cae1ef73b]*/ +/*[clinic end generated code: output=1c05c871c0309e08 input=f450d94cae1ef73b]*/ /*[clinic input] test_long_long_converter @@ -741,10 +1236,22 @@ test_long_long_converter(PyObject *module, PyObject *const *args, Py_ssize_t nar PyObject *return_value = NULL; long long a = 12; - if (!_PyArg_ParseStack(args, nargs, "|L:test_long_long_converter", - &a)) { + if (!_PyArg_CheckPositional("test_long_long_converter", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + a = PyLong_AsLongLong(args[0]); + if (a == (PY_LONG_LONG)-1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = test_long_long_converter_impl(module, a); exit: @@ -753,7 +1260,7 @@ exit: static PyObject * test_long_long_converter_impl(PyObject *module, long long a) -/*[clinic end generated code: output=9412f5800661b0c0 input=d5fc81577ff4dd02]*/ +/*[clinic end generated code: output=3e8083f3aee4f18a input=d5fc81577ff4dd02]*/ /*[clinic input] test_unsigned_long_long_converter @@ -787,10 +1294,30 @@ test_unsigned_long_long_converter(PyObject *module, PyObject *const *args, Py_ss unsigned long long b = 34; unsigned long long c = 56; - if (!_PyArg_ParseStack(args, nargs, "|O&O&K:test_unsigned_long_long_converter", - _PyLong_UnsignedLongLong_Converter, &a, _PyLong_UnsignedLongLong_Converter, &b, &c)) { + if 
(!_PyArg_CheckPositional("test_unsigned_long_long_converter", nargs, 0, 3)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!_PyLong_UnsignedLongLong_Converter(args[0], &a)) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (!_PyLong_UnsignedLongLong_Converter(args[1], &b)) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (!PyLong_Check(args[2])) { + _PyArg_BadArgument("test_unsigned_long_long_converter", 3, "int", args[2]); goto exit; } + c = PyLong_AsUnsignedLongLongMask(args[2]); +skip_optional: return_value = test_unsigned_long_long_converter_impl(module, a, b, c); exit: @@ -802,7 +1329,7 @@ test_unsigned_long_long_converter_impl(PyObject *module, unsigned long long a, unsigned long long b, unsigned long long c) -/*[clinic end generated code: output=f5a57e0ccc048a32 input=a15115dc41866ff4]*/ +/*[clinic end generated code: output=0a9b17fb824e28eb input=a15115dc41866ff4]*/ /*[clinic input] test_Py_ssize_t_converter @@ -834,10 +1361,56 @@ test_Py_ssize_t_converter(PyObject *module, PyObject *const *args, Py_ssize_t na Py_ssize_t b = 34; Py_ssize_t c = 56; - if (!_PyArg_ParseStack(args, nargs, "|nnO&:test_Py_ssize_t_converter", - &a, &b, _Py_convert_optional_to_ssize_t, &c)) { + if (!_PyArg_CheckPositional("test_Py_ssize_t_converter", nargs, 0, 3)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + a = ival; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[1]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + b = ival; + } + if (nargs < 3) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[2], &c)) { + goto exit; + } +skip_optional: return_value = test_Py_ssize_t_converter_impl(module, a, b, c); exit: @@ -847,7 +1420,7 @@ exit: static PyObject * test_Py_ssize_t_converter_impl(PyObject *module, Py_ssize_t a, Py_ssize_t b, Py_ssize_t c) -/*[clinic end generated code: output=50bc360c57b40e3f input=3855f184bb3f299d]*/ +/*[clinic end generated code: output=a46d2aaf40c10398 input=3855f184bb3f299d]*/ /*[clinic input] test_slice_index_converter @@ -879,10 +1452,28 @@ test_slice_index_converter(PyObject *module, PyObject *const *args, Py_ssize_t n Py_ssize_t b = 34; Py_ssize_t c = 56; - if (!_PyArg_ParseStack(args, nargs, "|O&O&O&:test_slice_index_converter", - _PyEval_SliceIndex, &a, _PyEval_SliceIndexNotNone, &b, _PyEval_SliceIndex, &c)) { + if (!_PyArg_CheckPositional("test_slice_index_converter", nargs, 0, 3)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!_PyEval_SliceIndex(args[0], &a)) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (!_PyEval_SliceIndexNotNone(args[1], &b)) { goto exit; } + if (nargs < 3) { + goto skip_optional; + } + if (!_PyEval_SliceIndex(args[2], &c)) { + goto exit; + } +skip_optional: return_value = test_slice_index_converter_impl(module, a, b, c); exit: @@ -892,7 +1483,7 @@ exit: static PyObject * 
test_slice_index_converter_impl(PyObject *module, Py_ssize_t a, Py_ssize_t b, Py_ssize_t c) -/*[clinic end generated code: output=33b9152bf3c5963a input=edeadb0ee126f531]*/ +/*[clinic end generated code: output=2148703cd3c6e941 input=edeadb0ee126f531]*/ /*[clinic input] test_size_t_converter @@ -919,10 +1510,16 @@ test_size_t_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; size_t a = 12; - if (!_PyArg_ParseStack(args, nargs, "|O&:test_size_t_converter", - _PyLong_Size_t_Converter, &a)) { + if (!_PyArg_CheckPositional("test_size_t_converter", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_PyLong_Size_t_Converter(args[0], &a)) { + goto exit; + } +skip_optional: return_value = test_size_t_converter_impl(module, a); exit: @@ -931,7 +1528,7 @@ exit: static PyObject * test_size_t_converter_impl(PyObject *module, size_t a) -/*[clinic end generated code: output=02fe6bf4c5a139d5 input=52e93a0fed0f1fb3]*/ +/*[clinic end generated code: output=8a91a9ca8a92dabb input=52e93a0fed0f1fb3]*/ /*[clinic input] test_float_converter @@ -958,10 +1555,17 @@ test_float_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; float a = 12.5; - if (!_PyArg_ParseStack(args, nargs, "|f:test_float_converter", - &a)) { + if (!_PyArg_CheckPositional("test_float_converter", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + a = (float) PyFloat_AsDouble(args[0]); + if (PyErr_Occurred()) { goto exit; } +skip_optional: return_value = test_float_converter_impl(module, a); exit: @@ -970,7 +1574,7 @@ exit: static PyObject * test_float_converter_impl(PyObject *module, float a) -/*[clinic end generated code: output=a0689b5ee76673ad input=259c0d98eca35034]*/ +/*[clinic end generated code: output=8293566b2ec1fc52 input=259c0d98eca35034]*/ /*[clinic input] test_double_converter @@ -997,10 +1601,17 @@ test_double_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; double a = 12.5; - if (!_PyArg_ParseStack(args, nargs, "|d:test_double_converter", - &a)) { + if (!_PyArg_CheckPositional("test_double_converter", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + a = PyFloat_AsDouble(args[0]); + if (PyErr_Occurred()) { goto exit; } +skip_optional: return_value = test_double_converter_impl(module, a); exit: @@ -1009,7 +1620,7 @@ exit: static PyObject * test_double_converter_impl(PyObject *module, double a) -/*[clinic end generated code: output=b78d23bd37b40681 input=c6a9945706a41c27]*/ +/*[clinic end generated code: output=487081a9b8da67ab input=c6a9945706a41c27]*/ /*[clinic input] test_Py_complex_converter diff --git a/Modules/_io/clinic/bufferedio.c.h b/Modules/_io/clinic/bufferedio.c.h index 6569e02012bf..6345b9e16e85 100644 --- a/Modules/_io/clinic/bufferedio.c.h +++ b/Modules/_io/clinic/bufferedio.c.h @@ -21,11 +21,11 @@ _io__BufferedIOBase_readinto(PyObject *self, PyObject *arg) if (PyObject_GetBuffer(arg, &buffer, PyBUF_WRITABLE) < 0) { PyErr_Clear(); - _PyArg_BadArgument("readinto", "read-write bytes-like object", arg); + _PyArg_BadArgument("readinto", 0, "read-write bytes-like object", arg); goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("readinto", "contiguous buffer", arg); + _PyArg_BadArgument("readinto", 0, "contiguous buffer", arg); goto exit; } return_value = _io__BufferedIOBase_readinto_impl(self, &buffer); @@ -58,11 +58,11 @@ 
_io__BufferedIOBase_readinto1(PyObject *self, PyObject *arg) if (PyObject_GetBuffer(arg, &buffer, PyBUF_WRITABLE) < 0) { PyErr_Clear(); - _PyArg_BadArgument("readinto1", "read-write bytes-like object", arg); + _PyArg_BadArgument("readinto1", 0, "read-write bytes-like object", arg); goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("readinto1", "contiguous buffer", arg); + _PyArg_BadArgument("readinto1", 0, "contiguous buffer", arg); goto exit; } return_value = _io__BufferedIOBase_readinto1_impl(self, &buffer); @@ -114,10 +114,30 @@ _io__Buffered_peek(buffered *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = 0; - if (!_PyArg_ParseStack(args, nargs, "|n:peek", - &size)) { + if (!_PyArg_CheckPositional("peek", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + size = ival; + } +skip_optional: return_value = _io__Buffered_peek_impl(self, size); exit: @@ -141,10 +161,16 @@ _io__Buffered_read(buffered *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t n = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:read", - _Py_convert_optional_to_ssize_t, &n)) { + if (!_PyArg_CheckPositional("read", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &n)) { goto exit; } +skip_optional: return_value = _io__Buffered_read_impl(self, n); exit: @@ -168,10 +194,30 @@ _io__Buffered_read1(buffered *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t n = -1; - if (!_PyArg_ParseStack(args, nargs, "|n:read1", - &n)) { + if (!_PyArg_CheckPositional("read1", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + n = ival; + } +skip_optional: return_value = _io__Buffered_read1_impl(self, n); exit: @@ -197,11 +243,11 @@ _io__Buffered_readinto(buffered *self, PyObject *arg) if (PyObject_GetBuffer(arg, &buffer, PyBUF_WRITABLE) < 0) { PyErr_Clear(); - _PyArg_BadArgument("readinto", "read-write bytes-like object", arg); + _PyArg_BadArgument("readinto", 0, "read-write bytes-like object", arg); goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("readinto", "contiguous buffer", arg); + _PyArg_BadArgument("readinto", 0, "contiguous buffer", arg); goto exit; } return_value = _io__Buffered_readinto_impl(self, &buffer); @@ -234,11 +280,11 @@ _io__Buffered_readinto1(buffered *self, PyObject *arg) if (PyObject_GetBuffer(arg, &buffer, PyBUF_WRITABLE) < 0) { PyErr_Clear(); - _PyArg_BadArgument("readinto1", "read-write bytes-like object", arg); + _PyArg_BadArgument("readinto1", 0, "read-write bytes-like object", arg); goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("readinto1", "contiguous buffer", arg); + _PyArg_BadArgument("readinto1", 0, "contiguous 
buffer", arg); goto exit; } return_value = _io__Buffered_readinto1_impl(self, &buffer); @@ -269,10 +315,16 @@ _io__Buffered_readline(buffered *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:readline", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("readline", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { + goto exit; + } +skip_optional: return_value = _io__Buffered_readline_impl(self, size); exit: @@ -297,10 +349,23 @@ _io__Buffered_seek(buffered *self, PyObject *const *args, Py_ssize_t nargs) PyObject *targetobj; int whence = 0; - if (!_PyArg_ParseStack(args, nargs, "O|i:seek", - &targetobj, &whence)) { + if (!_PyArg_CheckPositional("seek", nargs, 1, 2)) { + goto exit; + } + targetobj = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + whence = _PyLong_AsInt(args[1]); + if (whence == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _io__Buffered_seek_impl(self, targetobj, whence); exit: @@ -418,7 +483,7 @@ _io_BufferedWriter_write(buffered *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("write", "contiguous buffer", arg); + _PyArg_BadArgument("write", 0, "contiguous buffer", arg); goto exit; } return_value = _io_BufferedWriter_write_impl(self, &buffer); @@ -462,10 +527,32 @@ _io_BufferedRWPair___init__(PyObject *self, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("BufferedRWPair", kwargs)) { goto exit; } - if (!PyArg_ParseTuple(args, "OO|n:BufferedRWPair", - &reader, &writer, &buffer_size)) { + if (!_PyArg_CheckPositional("BufferedRWPair", PyTuple_GET_SIZE(args), 2, 3)) { goto exit; } + reader = PyTuple_GET_ITEM(args, 0); + writer = PyTuple_GET_ITEM(args, 1); + if (PyTuple_GET_SIZE(args) < 3) { + goto skip_optional; + } + if (PyFloat_Check(PyTuple_GET_ITEM(args, 2))) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(PyTuple_GET_ITEM(args, 2)); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + buffer_size = ival; + } +skip_optional: return_value = _io_BufferedRWPair___init___impl((rwpair *)self, reader, writer, buffer_size); exit: @@ -504,4 +591,4 @@ _io_BufferedRandom___init__(PyObject *self, PyObject *args, PyObject *kwargs) exit: return return_value; } -/*[clinic end generated code: output=40de95d461a20782 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=a85f61f495feff5c input=a9049054013a1b77]*/ diff --git a/Modules/_io/clinic/bytesio.c.h b/Modules/_io/clinic/bytesio.c.h index 07c8866126e5..558841697b3a 100644 --- a/Modules/_io/clinic/bytesio.c.h +++ b/Modules/_io/clinic/bytesio.c.h @@ -169,10 +169,16 @@ _io_BytesIO_read(bytesio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:read", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("read", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { + goto exit; + } +skip_optional: return_value = _io_BytesIO_read_impl(self, 
size); exit: @@ -200,10 +206,16 @@ _io_BytesIO_read1(bytesio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:read1", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("read1", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { goto exit; } +skip_optional: return_value = _io_BytesIO_read1_impl(self, size); exit: @@ -232,10 +244,16 @@ _io_BytesIO_readline(bytesio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:readline", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("readline", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { + goto exit; + } +skip_optional: return_value = _io_BytesIO_readline_impl(self, size); exit: @@ -298,11 +316,11 @@ _io_BytesIO_readinto(bytesio *self, PyObject *arg) if (PyObject_GetBuffer(arg, &buffer, PyBUF_WRITABLE) < 0) { PyErr_Clear(); - _PyArg_BadArgument("readinto", "read-write bytes-like object", arg); + _PyArg_BadArgument("readinto", 0, "read-write bytes-like object", arg); goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("readinto", "contiguous buffer", arg); + _PyArg_BadArgument("readinto", 0, "contiguous buffer", arg); goto exit; } return_value = _io_BytesIO_readinto_impl(self, &buffer); @@ -337,10 +355,16 @@ _io_BytesIO_truncate(bytesio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = self->pos; - if (!_PyArg_ParseStack(args, nargs, "|O&:truncate", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("truncate", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { goto exit; } +skip_optional: return_value = _io_BytesIO_truncate_impl(self, size); exit: @@ -372,10 +396,39 @@ _io_BytesIO_seek(bytesio *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t pos; int whence = 0; - if (!_PyArg_ParseStack(args, nargs, "n|i:seek", - &pos, &whence)) { + if (!_PyArg_CheckPositional("seek", nargs, 1, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + pos = ival; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + whence = _PyLong_AsInt(args[1]); + if (whence == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _io_BytesIO_seek_impl(self, pos, whence); exit: @@ -450,4 +503,4 @@ _io_BytesIO___init__(PyObject *self, PyObject *args, PyObject *kwargs) exit: return return_value; } -/*[clinic end generated code: output=f6e720f38fc6e3cd input=a9049054013a1b77]*/ +/*[clinic end generated code: output=5c68eb481fa960bf input=a9049054013a1b77]*/ diff --git a/Modules/_io/clinic/fileio.c.h b/Modules/_io/clinic/fileio.c.h index bf40f7fc4a63..280549e996d8 100644 --- a/Modules/_io/clinic/fileio.c.h +++ b/Modules/_io/clinic/fileio.c.h @@ -158,11 
+158,11 @@ _io_FileIO_readinto(fileio *self, PyObject *arg) if (PyObject_GetBuffer(arg, &buffer, PyBUF_WRITABLE) < 0) { PyErr_Clear(); - _PyArg_BadArgument("readinto", "read-write bytes-like object", arg); + _PyArg_BadArgument("readinto", 0, "read-write bytes-like object", arg); goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("readinto", "contiguous buffer", arg); + _PyArg_BadArgument("readinto", 0, "contiguous buffer", arg); goto exit; } return_value = _io_FileIO_readinto_impl(self, &buffer); @@ -219,10 +219,16 @@ _io_FileIO_read(fileio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:read", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("read", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { + goto exit; + } +skip_optional: return_value = _io_FileIO_read_impl(self, size); exit: @@ -255,7 +261,7 @@ _io_FileIO_write(fileio *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&b, 'C')) { - _PyArg_BadArgument("write", "contiguous buffer", arg); + _PyArg_BadArgument("write", 0, "contiguous buffer", arg); goto exit; } return_value = _io_FileIO_write_impl(self, &b); @@ -296,10 +302,23 @@ _io_FileIO_seek(fileio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *pos; int whence = 0; - if (!_PyArg_ParseStack(args, nargs, "O|i:seek", - &pos, &whence)) { + if (!_PyArg_CheckPositional("seek", nargs, 1, 2)) { + goto exit; + } + pos = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + whence = _PyLong_AsInt(args[1]); + if (whence == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _io_FileIO_seek_impl(self, pos, whence); exit: @@ -383,4 +402,4 @@ _io_FileIO_isatty(fileio *self, PyObject *Py_UNUSED(ignored)) #ifndef _IO_FILEIO_TRUNCATE_METHODDEF #define _IO_FILEIO_TRUNCATE_METHODDEF #endif /* !defined(_IO_FILEIO_TRUNCATE_METHODDEF) */ -/*[clinic end generated code: output=8be0ea9a5ac7aa43 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=4cf4e5f0cd656b11 input=a9049054013a1b77]*/ diff --git a/Modules/_io/clinic/iobase.c.h b/Modules/_io/clinic/iobase.c.h index 68eaaddce22d..a5c8eea3ec3a 100644 --- a/Modules/_io/clinic/iobase.c.h +++ b/Modules/_io/clinic/iobase.c.h @@ -185,10 +185,16 @@ _io__IOBase_readline(PyObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t limit = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:readline", - _Py_convert_optional_to_ssize_t, &limit)) { + if (!_PyArg_CheckPositional("readline", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &limit)) { + goto exit; + } +skip_optional: return_value = _io__IOBase_readline_impl(self, limit); exit: @@ -217,10 +223,16 @@ _io__IOBase_readlines(PyObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t hint = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:readlines", - _Py_convert_optional_to_ssize_t, &hint)) { + if (!_PyArg_CheckPositional("readlines", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &hint)) { goto exit; } +skip_optional: return_value = _io__IOBase_readlines_impl(self, hint); exit: 
@@ -252,10 +264,30 @@ _io__RawIOBase_read(PyObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t n = -1; - if (!_PyArg_ParseStack(args, nargs, "|n:read", - &n)) { + if (!_PyArg_CheckPositional("read", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + n = ival; + } +skip_optional: return_value = _io__RawIOBase_read_impl(self, n); exit: @@ -279,4 +311,4 @@ _io__RawIOBase_readall(PyObject *self, PyObject *Py_UNUSED(ignored)) { return _io__RawIOBase_readall_impl(self); } -/*[clinic end generated code: output=cde4b0e96a4e69e3 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=60e43a7cbd9f314e input=a9049054013a1b77]*/ diff --git a/Modules/_io/clinic/stringio.c.h b/Modules/_io/clinic/stringio.c.h index 3181688f109f..1757d83ef7e1 100644 --- a/Modules/_io/clinic/stringio.c.h +++ b/Modules/_io/clinic/stringio.c.h @@ -59,10 +59,16 @@ _io_StringIO_read(stringio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:read", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("read", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { + goto exit; + } +skip_optional: return_value = _io_StringIO_read_impl(self, size); exit: @@ -89,10 +95,16 @@ _io_StringIO_readline(stringio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:readline", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("readline", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { + goto exit; + } +skip_optional: return_value = _io_StringIO_readline_impl(self, size); exit: @@ -121,10 +133,16 @@ _io_StringIO_truncate(stringio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t size = self->pos; - if (!_PyArg_ParseStack(args, nargs, "|O&:truncate", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("truncate", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { goto exit; } +skip_optional: return_value = _io_StringIO_truncate_impl(self, size); exit: @@ -156,10 +174,39 @@ _io_StringIO_seek(stringio *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t pos; int whence = 0; - if (!_PyArg_ParseStack(args, nargs, "n|i:seek", - &pos, &whence)) { + if (!_PyArg_CheckPositional("seek", nargs, 1, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + pos = ival; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got 
float" ); + goto exit; + } + whence = _PyLong_AsInt(args[1]); + if (whence == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _io_StringIO_seek_impl(self, pos, whence); exit: @@ -286,4 +333,4 @@ _io_StringIO_seekable(stringio *self, PyObject *Py_UNUSED(ignored)) { return _io_StringIO_seekable_impl(self); } -/*[clinic end generated code: output=00c3c7a1c6ea6773 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=db5e51dcc4dae8d5 input=a9049054013a1b77]*/ diff --git a/Modules/_io/clinic/textio.c.h b/Modules/_io/clinic/textio.c.h index f056416e33a7..0ff0324f5a4f 100644 --- a/Modules/_io/clinic/textio.c.h +++ b/Modules/_io/clinic/textio.c.h @@ -251,7 +251,7 @@ _io_TextIOWrapper_write(textio *self, PyObject *arg) PyObject *text; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("write", "str", arg); + _PyArg_BadArgument("write", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -281,10 +281,16 @@ _io_TextIOWrapper_read(textio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t n = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:read", - _Py_convert_optional_to_ssize_t, &n)) { + if (!_PyArg_CheckPositional("read", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &n)) { + goto exit; + } +skip_optional: return_value = _io_TextIOWrapper_read_impl(self, n); exit: @@ -308,10 +314,30 @@ _io_TextIOWrapper_readline(textio *self, PyObject *const *args, Py_ssize_t nargs PyObject *return_value = NULL; Py_ssize_t size = -1; - if (!_PyArg_ParseStack(args, nargs, "|n:readline", - &size)) { + if (!_PyArg_CheckPositional("readline", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + size = ival; + } +skip_optional: return_value = _io_TextIOWrapper_readline_impl(self, size); exit: @@ -336,10 +362,23 @@ _io_TextIOWrapper_seek(textio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *cookieObj; int whence = 0; - if (!_PyArg_ParseStack(args, nargs, "O|i:seek", - &cookieObj, &whence)) { + if (!_PyArg_CheckPositional("seek", nargs, 1, 2)) { + goto exit; + } + cookieObj = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + whence = _PyLong_AsInt(args[1]); + if (whence == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _io_TextIOWrapper_seek_impl(self, cookieObj, whence); exit: @@ -509,4 +548,4 @@ _io_TextIOWrapper_close(textio *self, PyObject *Py_UNUSED(ignored)) { return _io_TextIOWrapper_close_impl(self); } -/*[clinic end generated code: output=b933f08c2f2d85cd input=a9049054013a1b77]*/ +/*[clinic end generated code: output=8bdd1035bf878d6f input=a9049054013a1b77]*/ diff --git a/Modules/_io/clinic/winconsoleio.c.h b/Modules/_io/clinic/winconsoleio.c.h index 65cac66c5d16..e7cd854ae25d 100644 --- a/Modules/_io/clinic/winconsoleio.c.h +++ b/Modules/_io/clinic/winconsoleio.c.h @@ -158,11 +158,11 @@ _io__WindowsConsoleIO_readinto(winconsoleio *self, PyObject *arg) if (PyObject_GetBuffer(arg, &buffer, PyBUF_WRITABLE) < 0) { PyErr_Clear(); - 
_PyArg_BadArgument("readinto", "read-write bytes-like object", arg); + _PyArg_BadArgument("readinto", 0, "read-write bytes-like object", arg); goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("readinto", "contiguous buffer", arg); + _PyArg_BadArgument("readinto", 0, "contiguous buffer", arg); goto exit; } return_value = _io__WindowsConsoleIO_readinto_impl(self, &buffer); @@ -226,10 +226,16 @@ _io__WindowsConsoleIO_read(winconsoleio *self, PyObject *const *args, Py_ssize_t PyObject *return_value = NULL; Py_ssize_t size = -1; - if (!_PyArg_ParseStack(args, nargs, "|O&:read", - _Py_convert_optional_to_ssize_t, &size)) { + if (!_PyArg_CheckPositional("read", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (!_Py_convert_optional_to_ssize_t(args[0], &size)) { + goto exit; + } +skip_optional: return_value = _io__WindowsConsoleIO_read_impl(self, size); exit: @@ -265,7 +271,7 @@ _io__WindowsConsoleIO_write(winconsoleio *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&b, 'C')) { - _PyArg_BadArgument("write", "contiguous buffer", arg); + _PyArg_BadArgument("write", 0, "contiguous buffer", arg); goto exit; } return_value = _io__WindowsConsoleIO_write_impl(self, &b); @@ -338,4 +344,4 @@ _io__WindowsConsoleIO_isatty(winconsoleio *self, PyObject *Py_UNUSED(ignored)) #ifndef _IO__WINDOWSCONSOLEIO_ISATTY_METHODDEF #define _IO__WINDOWSCONSOLEIO_ISATTY_METHODDEF #endif /* !defined(_IO__WINDOWSCONSOLEIO_ISATTY_METHODDEF) */ -/*[clinic end generated code: output=4337e8de65915a1e input=a9049054013a1b77]*/ +/*[clinic end generated code: output=ab0f0ee8062eecb3 input=a9049054013a1b77]*/ diff --git a/Modules/cjkcodecs/clinic/multibytecodec.c.h b/Modules/cjkcodecs/clinic/multibytecodec.c.h index 1fef185b513d..871bf339afc1 100644 --- a/Modules/cjkcodecs/clinic/multibytecodec.c.h +++ b/Modules/cjkcodecs/clinic/multibytecodec.c.h @@ -151,7 +151,7 @@ _multibytecodec_MultibyteIncrementalEncoder_setstate(MultibyteIncrementalEncoder PyLongObject *statelong; if (!PyLong_Check(arg)) { - _PyArg_BadArgument("setstate", "int", arg); + _PyArg_BadArgument("setstate", 0, "int", arg); goto exit; } statelong = (PyLongObject *)arg; @@ -251,7 +251,7 @@ _multibytecodec_MultibyteIncrementalDecoder_setstate(MultibyteIncrementalDecoder PyObject *state; if (!PyTuple_Check(arg)) { - _PyArg_BadArgument("setstate", "tuple", arg); + _PyArg_BadArgument("setstate", 0, "tuple", arg); goto exit; } state = arg; @@ -422,4 +422,4 @@ PyDoc_STRVAR(_multibytecodec___create_codec__doc__, #define _MULTIBYTECODEC___CREATE_CODEC_METHODDEF \ {"__create_codec", (PyCFunction)_multibytecodec___create_codec, METH_O, _multibytecodec___create_codec__doc__}, -/*[clinic end generated code: output=a94364d0965adf1d input=a9049054013a1b77]*/ +/*[clinic end generated code: output=2ed7030b28a79029 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_bz2module.c.h b/Modules/clinic/_bz2module.c.h index dce746da413e..fb6433380541 100644 --- a/Modules/clinic/_bz2module.c.h +++ b/Modules/clinic/_bz2module.c.h @@ -29,7 +29,7 @@ _bz2_BZ2Compressor_compress(BZ2Compressor *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&data, 'C')) { - _PyArg_BadArgument("compress", "contiguous buffer", arg); + _PyArg_BadArgument("compress", 0, "contiguous buffer", arg); goto exit; } return_value = _bz2_BZ2Compressor_compress_impl(self, &data); @@ -89,10 +89,22 @@ _bz2_BZ2Compressor___init__(PyObject *self, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("BZ2Compressor", kwargs)) { goto exit; } - if 
(!PyArg_ParseTuple(args, "|i:BZ2Compressor", - &compresslevel)) { + if (!_PyArg_CheckPositional("BZ2Compressor", PyTuple_GET_SIZE(args), 0, 1)) { goto exit; } + if (PyTuple_GET_SIZE(args) < 1) { + goto skip_optional; + } + if (PyFloat_Check(PyTuple_GET_ITEM(args, 0))) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + compresslevel = _PyLong_AsInt(PyTuple_GET_ITEM(args, 0)); + if (compresslevel == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _bz2_BZ2Compressor___init___impl((BZ2Compressor *)self, compresslevel); exit: @@ -178,4 +190,4 @@ _bz2_BZ2Decompressor___init__(PyObject *self, PyObject *args, PyObject *kwargs) exit: return return_value; } -/*[clinic end generated code: output=8549cccdb82f57d9 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=892c6133e97ff840 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_codecsmodule.c.h b/Modules/clinic/_codecsmodule.c.h index 360a1341701b..a8223ad0a1d3 100644 --- a/Modules/clinic/_codecsmodule.c.h +++ b/Modules/clinic/_codecsmodule.c.h @@ -34,7 +34,7 @@ _codecs_lookup(PyObject *module, PyObject *arg) const char *encoding; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("lookup", "str", arg); + _PyArg_BadArgument("lookup", 0, "str", arg); goto exit; } Py_ssize_t encoding_length; @@ -149,7 +149,7 @@ _codecs__forget_codec(PyObject *module, PyObject *arg) const char *encoding; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("_forget_codec", "str", arg); + _PyArg_BadArgument("_forget_codec", 0, "str", arg); goto exit; } Py_ssize_t encoding_length; @@ -186,10 +186,48 @@ _codecs_escape_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer data = {NULL, NULL}; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "s*|z:escape_decode", - &data, &errors)) { + if (!_PyArg_CheckPositional("escape_decode", nargs, 1, 2)) { goto exit; } + if (PyUnicode_Check(args[0])) { + Py_ssize_t len; + const char *ptr = PyUnicode_AsUTF8AndSize(args[0], &len); + if (ptr == NULL) { + goto exit; + } + PyBuffer_FillInfo(&data, args[0], (void *)ptr, len, 1, 0); + } + else { /* any bytes-like object */ + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("escape_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("escape_decode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_escape_decode_impl(module, &data, errors); exit: @@ -220,10 +258,36 @@ _codecs_escape_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *data; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "O!|z:escape_encode", - &PyBytes_Type, &data, &errors)) { + if (!_PyArg_CheckPositional("escape_encode", nargs, 1, 2)) { + goto exit; + } + if (!PyBytes_Check(args[0])) { + _PyArg_BadArgument("escape_encode", 1, "bytes", args[0]); + goto exit; + } + data = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if 
(PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("escape_encode", 2, "str or None", args[1]); goto exit; } +skip_optional: return_value = _codecs_escape_encode_impl(module, data, errors); exit: @@ -249,10 +313,32 @@ _codecs_unicode_internal_decode(PyObject *module, PyObject *const *args, Py_ssiz PyObject *obj; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "O|z:unicode_internal_decode", - &obj, &errors)) { + if (!_PyArg_CheckPositional("unicode_internal_decode", nargs, 1, 2)) { + goto exit; + } + obj = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("unicode_internal_decode", 2, "str or None", args[1]); goto exit; } +skip_optional: return_value = _codecs_unicode_internal_decode_impl(module, obj, errors); exit: @@ -279,10 +365,50 @@ _codecs_utf_7_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:utf_7_decode", - &data, &errors, &final)) { + if (!_PyArg_CheckPositional("utf_7_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { goto exit; } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_7_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_7_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_7_decode_impl(module, &data, errors, final); exit: @@ -314,10 +440,50 @@ _codecs_utf_8_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:utf_8_decode", - &data, &errors, &final)) { + if (!_PyArg_CheckPositional("utf_8_decode", nargs, 1, 3)) { goto exit; } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_8_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if 
(strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_8_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_8_decode_impl(module, &data, errors, final); exit: @@ -349,10 +515,50 @@ _codecs_utf_16_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:utf_16_decode", - &data, &errors, &final)) { + if (!_PyArg_CheckPositional("utf_16_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_16_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_16_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_16_decode_impl(module, &data, errors, final); exit: @@ -384,10 +590,50 @@ _codecs_utf_16_le_decode(PyObject *module, PyObject *const *args, Py_ssize_t nar const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:utf_16_le_decode", - &data, &errors, &final)) { + if (!_PyArg_CheckPositional("utf_16_le_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_16_le_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_16_le_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _codecs_utf_16_le_decode_impl(module, &data, errors, final); exit: @@ -419,10 +665,50 @@ _codecs_utf_16_be_decode(PyObject *module, PyObject *const *args, Py_ssize_t nar const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:utf_16_be_decode", - 
&data, &errors, &final)) { + if (!_PyArg_CheckPositional("utf_16_be_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { goto exit; } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_16_be_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_16_be_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_16_be_decode_impl(module, &data, errors, final); exit: @@ -456,10 +742,62 @@ _codecs_utf_16_ex_decode(PyObject *module, PyObject *const *args, Py_ssize_t nar int byteorder = 0; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zii:utf_16_ex_decode", - &data, &errors, &byteorder, &final)) { + if (!_PyArg_CheckPositional("utf_16_ex_decode", nargs, 1, 4)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_16_ex_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_16_ex_decode", 2, "str or None", args[1]); goto exit; } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + byteorder = _PyLong_AsInt(args[2]); + if (byteorder == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 4) { + goto skip_optional; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[3]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_16_ex_decode_impl(module, &data, errors, byteorder, final); exit: @@ -491,10 +829,50 @@ _codecs_utf_32_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:utf_32_decode", - &data, &errors, &final)) { + if (!_PyArg_CheckPositional("utf_32_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_32_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = 
PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_32_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_32_decode_impl(module, &data, errors, final); exit: @@ -526,10 +904,50 @@ _codecs_utf_32_le_decode(PyObject *module, PyObject *const *args, Py_ssize_t nar const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:utf_32_le_decode", - &data, &errors, &final)) { + if (!_PyArg_CheckPositional("utf_32_le_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_32_le_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_32_le_decode", 2, "str or None", args[1]); goto exit; } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_32_le_decode_impl(module, &data, errors, final); exit: @@ -561,10 +979,50 @@ _codecs_utf_32_be_decode(PyObject *module, PyObject *const *args, Py_ssize_t nar const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:utf_32_be_decode", - &data, &errors, &final)) { + if (!_PyArg_CheckPositional("utf_32_be_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { goto exit; } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_32_be_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_32_be_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_32_be_decode_impl(module, &data, errors, final); exit: @@ -598,10 +1056,62 @@ _codecs_utf_32_ex_decode(PyObject *module, PyObject *const *args, Py_ssize_t nar 
int byteorder = 0; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zii:utf_32_ex_decode", - &data, &errors, &byteorder, &final)) { + if (!_PyArg_CheckPositional("utf_32_ex_decode", nargs, 1, 4)) { goto exit; } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("utf_32_ex_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_32_ex_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + byteorder = _PyLong_AsInt(args[2]); + if (byteorder == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 4) { + goto skip_optional; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[3]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_32_ex_decode_impl(module, &data, errors, byteorder, final); exit: @@ -632,10 +1142,48 @@ _codecs_unicode_escape_decode(PyObject *module, PyObject *const *args, Py_ssize_ Py_buffer data = {NULL, NULL}; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "s*|z:unicode_escape_decode", - &data, &errors)) { + if (!_PyArg_CheckPositional("unicode_escape_decode", nargs, 1, 2)) { + goto exit; + } + if (PyUnicode_Check(args[0])) { + Py_ssize_t len; + const char *ptr = PyUnicode_AsUTF8AndSize(args[0], &len); + if (ptr == NULL) { + goto exit; + } + PyBuffer_FillInfo(&data, args[0], (void *)ptr, len, 1, 0); + } + else { /* any bytes-like object */ + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("unicode_escape_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("unicode_escape_decode", 2, "str or None", args[1]); goto exit; } +skip_optional: return_value = _codecs_unicode_escape_decode_impl(module, &data, errors); exit: @@ -666,10 +1214,48 @@ _codecs_raw_unicode_escape_decode(PyObject *module, PyObject *const *args, Py_ss Py_buffer data = {NULL, NULL}; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "s*|z:raw_unicode_escape_decode", - &data, &errors)) { + if (!_PyArg_CheckPositional("raw_unicode_escape_decode", nargs, 1, 2)) { + goto exit; + } + if (PyUnicode_Check(args[0])) { + Py_ssize_t len; + const char *ptr = PyUnicode_AsUTF8AndSize(args[0], &len); + if (ptr == NULL) { + goto exit; + } + PyBuffer_FillInfo(&data, args[0], (void *)ptr, len, 1, 0); + } + else { /* 
any bytes-like object */ + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("raw_unicode_escape_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("raw_unicode_escape_decode", 2, "str or None", args[1]); goto exit; } +skip_optional: return_value = _codecs_raw_unicode_escape_decode_impl(module, &data, errors); exit: @@ -700,10 +1286,38 @@ _codecs_latin_1_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs Py_buffer data = {NULL, NULL}; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "y*|z:latin_1_decode", - &data, &errors)) { + if (!_PyArg_CheckPositional("latin_1_decode", nargs, 1, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("latin_1_decode", 1, "contiguous buffer", args[0]); goto exit; } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("latin_1_decode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_latin_1_decode_impl(module, &data, errors); exit: @@ -734,10 +1348,38 @@ _codecs_ascii_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer data = {NULL, NULL}; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "y*|z:ascii_decode", - &data, &errors)) { + if (!_PyArg_CheckPositional("ascii_decode", nargs, 1, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("ascii_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("ascii_decode", 2, "str or None", args[1]); goto exit; } +skip_optional: return_value = _codecs_ascii_decode_impl(module, &data, errors); exit: @@ -769,10 +1411,42 @@ _codecs_charmap_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs const char *errors = NULL; PyObject *mapping = NULL; - if (!_PyArg_ParseStack(args, nargs, "y*|zO:charmap_decode", - &data, &errors, &mapping)) { + if (!_PyArg_CheckPositional("charmap_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { goto exit; } + if (!PyBuffer_IsContiguous(&data, 'C')) { + 
_PyArg_BadArgument("charmap_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("charmap_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + mapping = args[2]; +skip_optional: return_value = _codecs_charmap_decode_impl(module, &data, errors, mapping); exit: @@ -806,10 +1480,50 @@ _codecs_mbcs_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:mbcs_decode", - &data, &errors, &final)) { + if (!_PyArg_CheckPositional("mbcs_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("mbcs_decode", 1, "contiguous buffer", args[0]); goto exit; } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("mbcs_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_mbcs_decode_impl(module, &data, errors, final); exit: @@ -845,10 +1559,50 @@ _codecs_oem_decode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|zi:oem_decode", - &data, &errors, &final)) { + if (!_PyArg_CheckPositional("oem_decode", nargs, 1, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("oem_decode", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("oem_decode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[2]); + if (final == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _codecs_oem_decode_impl(module, &data, errors, final); exit: @@ -885,10 +1639,59 @@ _codecs_code_page_decode(PyObject *module, PyObject *const *args, 
Py_ssize_t nar const char *errors = NULL; int final = 0; - if (!_PyArg_ParseStack(args, nargs, "iy*|zi:code_page_decode", - &codepage, &data, &errors, &final)) { + if (!_PyArg_CheckPositional("code_page_decode", nargs, 2, 4)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + codepage = _PyLong_AsInt(args[0]); + if (codepage == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyObject_GetBuffer(args[1], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("code_page_decode", 2, "contiguous buffer", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (args[2] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[2])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[2], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("code_page_decode", 3, "str or None", args[2]); + goto exit; + } + if (nargs < 4) { + goto skip_optional; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + final = _PyLong_AsInt(args[3]); + if (final == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_code_page_decode_impl(module, codepage, &data, errors, final); exit: @@ -921,10 +1724,48 @@ _codecs_readbuffer_encode(PyObject *module, PyObject *const *args, Py_ssize_t na Py_buffer data = {NULL, NULL}; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "s*|z:readbuffer_encode", - &data, &errors)) { + if (!_PyArg_CheckPositional("readbuffer_encode", nargs, 1, 2)) { + goto exit; + } + if (PyUnicode_Check(args[0])) { + Py_ssize_t len; + const char *ptr = PyUnicode_AsUTF8AndSize(args[0], &len); + if (ptr == NULL) { + goto exit; + } + PyBuffer_FillInfo(&data, args[0], (void *)ptr, len, 1, 0); + } + else { /* any bytes-like object */ + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("readbuffer_encode", 1, "contiguous buffer", args[0]); + goto exit; + } + } + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("readbuffer_encode", 2, "str or None", args[1]); goto exit; } +skip_optional: return_value = _codecs_readbuffer_encode_impl(module, &data, errors); exit: @@ -955,10 +1796,32 @@ _codecs_unicode_internal_encode(PyObject *module, PyObject *const *args, Py_ssiz PyObject *obj; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "O|z:unicode_internal_encode", - &obj, &errors)) { + if (!_PyArg_CheckPositional("unicode_internal_encode", nargs, 1, 2)) { goto exit; } + obj = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) 
!= (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("unicode_internal_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_unicode_internal_encode_impl(module, obj, errors); exit: @@ -984,10 +1847,39 @@ _codecs_utf_7_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:utf_7_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("utf_7_encode", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("utf_7_encode", 1, "str", args[0]); goto exit; } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_7_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_utf_7_encode_impl(module, str, errors); exit: @@ -1013,10 +1905,39 @@ _codecs_utf_8_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:utf_8_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("utf_8_encode", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("utf_8_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_8_encode", 2, "str or None", args[1]); goto exit; } +skip_optional: return_value = _codecs_utf_8_encode_impl(module, str, errors); exit: @@ -1043,10 +1964,51 @@ _codecs_utf_16_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *errors = NULL; int byteorder = 0; - if (!_PyArg_ParseStack(args, nargs, "U|zi:utf_16_encode", - &str, &errors, &byteorder)) { + if (!_PyArg_CheckPositional("utf_16_encode", nargs, 1, 3)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("utf_16_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_16_encode", 2, "str or None", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, 
got float" ); + goto exit; + } + byteorder = _PyLong_AsInt(args[2]); + if (byteorder == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_16_encode_impl(module, str, errors, byteorder); exit: @@ -1072,10 +2034,39 @@ _codecs_utf_16_le_encode(PyObject *module, PyObject *const *args, Py_ssize_t nar PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:utf_16_le_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("utf_16_le_encode", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("utf_16_le_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { goto exit; } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_16_le_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_utf_16_le_encode_impl(module, str, errors); exit: @@ -1101,10 +2092,39 @@ _codecs_utf_16_be_encode(PyObject *module, PyObject *const *args, Py_ssize_t nar PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:utf_16_be_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("utf_16_be_encode", nargs, 1, 2)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("utf_16_be_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_16_be_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_utf_16_be_encode_impl(module, str, errors); exit: @@ -1131,10 +2151,51 @@ _codecs_utf_32_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *errors = NULL; int byteorder = 0; - if (!_PyArg_ParseStack(args, nargs, "U|zi:utf_32_encode", - &str, &errors, &byteorder)) { + if (!_PyArg_CheckPositional("utf_32_encode", nargs, 1, 3)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("utf_32_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_32_encode", 2, "str or None", args[1]); goto exit; } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + 
byteorder = _PyLong_AsInt(args[2]); + if (byteorder == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _codecs_utf_32_encode_impl(module, str, errors, byteorder); exit: @@ -1160,10 +2221,39 @@ _codecs_utf_32_le_encode(PyObject *module, PyObject *const *args, Py_ssize_t nar PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:utf_32_le_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("utf_32_le_encode", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("utf_32_le_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_32_le_encode", 2, "str or None", args[1]); goto exit; } +skip_optional: return_value = _codecs_utf_32_le_encode_impl(module, str, errors); exit: @@ -1189,10 +2279,39 @@ _codecs_utf_32_be_encode(PyObject *module, PyObject *const *args, Py_ssize_t nar PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:utf_32_be_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("utf_32_be_encode", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("utf_32_be_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { goto exit; } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("utf_32_be_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_utf_32_be_encode_impl(module, str, errors); exit: @@ -1218,10 +2337,39 @@ _codecs_unicode_escape_encode(PyObject *module, PyObject *const *args, Py_ssize_ PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:unicode_escape_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("unicode_escape_encode", nargs, 1, 2)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("unicode_escape_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("unicode_escape_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_unicode_escape_encode_impl(module, str, errors); exit: @@ -1247,10 +2395,39 @@ _codecs_raw_unicode_escape_encode(PyObject *module, PyObject *const *args, Py_ss 
PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:raw_unicode_escape_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("raw_unicode_escape_encode", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("raw_unicode_escape_encode", 1, "str", args[0]); goto exit; } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("raw_unicode_escape_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_raw_unicode_escape_encode_impl(module, str, errors); exit: @@ -1276,10 +2453,39 @@ _codecs_latin_1_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:latin_1_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("latin_1_encode", nargs, 1, 2)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("latin_1_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("latin_1_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_latin_1_encode_impl(module, str, errors); exit: @@ -1305,10 +2511,39 @@ _codecs_ascii_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:ascii_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("ascii_encode", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("ascii_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { goto exit; } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("ascii_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_ascii_encode_impl(module, str, errors); exit: @@ -1335,10 +2570,43 @@ _codecs_charmap_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs const char *errors = NULL; PyObject *mapping = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|zO:charmap_encode", - &str, &errors, &mapping)) { + if (!_PyArg_CheckPositional("charmap_encode", nargs, 1, 3)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("charmap_encode", 1, "str", 
args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("charmap_encode", 2, "str or None", args[1]); goto exit; } + if (nargs < 3) { + goto skip_optional; + } + mapping = args[2]; +skip_optional: return_value = _codecs_charmap_encode_impl(module, str, errors, mapping); exit: @@ -1363,7 +2631,7 @@ _codecs_charmap_build(PyObject *module, PyObject *arg) PyObject *map; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("charmap_build", "str", arg); + _PyArg_BadArgument("charmap_build", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -1396,10 +2664,39 @@ _codecs_mbcs_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:mbcs_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("mbcs_encode", nargs, 1, 2)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("mbcs_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("mbcs_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_mbcs_encode_impl(module, str, errors); exit: @@ -1428,10 +2725,39 @@ _codecs_oem_encode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "U|z:oem_encode", - &str, &errors)) { + if (!_PyArg_CheckPositional("oem_encode", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("oem_encode", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { goto exit; } + str = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (args[1] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[1])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[1], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("oem_encode", 2, "str or None", args[1]); + goto exit; + } +skip_optional: return_value = _codecs_oem_encode_impl(module, str, errors); exit: @@ -1462,10 +2788,48 @@ _codecs_code_page_encode(PyObject *module, PyObject *const *args, Py_ssize_t nar PyObject *str; const char *errors = NULL; - if (!_PyArg_ParseStack(args, nargs, "iU|z:code_page_encode", - &code_page, &str, &errors)) { + if (!_PyArg_CheckPositional("code_page_encode", nargs, 2, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + 
goto exit; + } + code_page = _PyLong_AsInt(args[0]); + if (code_page == -1 && PyErr_Occurred()) { + goto exit; + } + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("code_page_encode", 2, "str", args[1]); + goto exit; + } + if (PyUnicode_READY(args[1]) == -1) { goto exit; } + str = args[1]; + if (nargs < 3) { + goto skip_optional; + } + if (args[2] == Py_None) { + errors = NULL; + } + else if (PyUnicode_Check(args[2])) { + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[2], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("code_page_encode", 3, "str or None", args[2]); + goto exit; + } +skip_optional: return_value = _codecs_code_page_encode_impl(module, code_page, str, errors); exit: @@ -1498,10 +2862,23 @@ _codecs_register_error(PyObject *module, PyObject *const *args, Py_ssize_t nargs const char *errors; PyObject *handler; - if (!_PyArg_ParseStack(args, nargs, "sO:register_error", - &errors, &handler)) { + if (!_PyArg_CheckPositional("register_error", nargs, 2, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("register_error", 1, "str", args[0]); + goto exit; + } + Py_ssize_t errors_length; + errors = PyUnicode_AsUTF8AndSize(args[0], &errors_length); + if (errors == NULL) { + goto exit; + } + if (strlen(errors) != (size_t)errors_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); goto exit; } + handler = args[1]; return_value = _codecs_register_error_impl(module, errors, handler); exit: @@ -1530,7 +2907,7 @@ _codecs_lookup_error(PyObject *module, PyObject *arg) const char *name; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("lookup_error", "str", arg); + _PyArg_BadArgument("lookup_error", 0, "str", arg); goto exit; } Py_ssize_t name_length; @@ -1571,4 +2948,4 @@ _codecs_lookup_error(PyObject *module, PyObject *arg) #ifndef _CODECS_CODE_PAGE_ENCODE_METHODDEF #define _CODECS_CODE_PAGE_ENCODE_METHODDEF #endif /* !defined(_CODECS_CODE_PAGE_ENCODE_METHODDEF) */ -/*[clinic end generated code: output=c2d2b917b78a4c45 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=85ea29db163ee74c input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_collectionsmodule.c.h b/Modules/clinic/_collectionsmodule.c.h index 12626c1f6618..ed3b1b50f9b5 100644 --- a/Modules/clinic/_collectionsmodule.c.h +++ b/Modules/clinic/_collectionsmodule.c.h @@ -16,13 +16,30 @@ tuplegetter_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("_tuplegetter", kwargs)) { goto exit; } - if (!PyArg_ParseTuple(args, "nO:_tuplegetter", - &index, &doc)) { + if (!_PyArg_CheckPositional("_tuplegetter", PyTuple_GET_SIZE(args), 2, 2)) { goto exit; } + if (PyFloat_Check(PyTuple_GET_ITEM(args, 0))) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(PyTuple_GET_ITEM(args, 0)); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + index = ival; + } + doc = PyTuple_GET_ITEM(args, 1); return_value = tuplegetter_new_impl(type, index, doc); exit: return return_value; } -/*[clinic end generated code: output=83746071eacc28d3 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=51bd572577ca7111 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_cryptmodule.c.h 
b/Modules/clinic/_cryptmodule.c.h index baa31e2d2fbe..2fcb0c1bf12c 100644 --- a/Modules/clinic/_cryptmodule.c.h +++ b/Modules/clinic/_cryptmodule.c.h @@ -26,8 +26,33 @@ crypt_crypt(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *word; const char *salt; - if (!_PyArg_ParseStack(args, nargs, "ss:crypt", - &word, &salt)) { + if (!_PyArg_CheckPositional("crypt", nargs, 2, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("crypt", 1, "str", args[0]); + goto exit; + } + Py_ssize_t word_length; + word = PyUnicode_AsUTF8AndSize(args[0], &word_length); + if (word == NULL) { + goto exit; + } + if (strlen(word) != (size_t)word_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("crypt", 2, "str", args[1]); + goto exit; + } + Py_ssize_t salt_length; + salt = PyUnicode_AsUTF8AndSize(args[1], &salt_length); + if (salt == NULL) { + goto exit; + } + if (strlen(salt) != (size_t)salt_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); goto exit; } return_value = crypt_crypt_impl(module, word, salt); @@ -35,4 +60,4 @@ crypt_crypt(PyObject *module, PyObject *const *args, Py_ssize_t nargs) exit: return return_value; } -/*[clinic end generated code: output=79001dbfdd623ff9 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=3f75d4d4be4dddbb input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_curses_panel.c.h b/Modules/clinic/_curses_panel.c.h index 2e569c546bdf..ed59c3ba3984 100644 --- a/Modules/clinic/_curses_panel.c.h +++ b/Modules/clinic/_curses_panel.c.h @@ -149,8 +149,25 @@ _curses_panel_panel_move(PyCursesPanelObject *self, PyObject *const *args, Py_ss int y; int x; - if (!_PyArg_ParseStack(args, nargs, "ii:move", - &y, &x)) { + if (!_PyArg_CheckPositional("move", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + y = _PyLong_AsInt(args[0]); + if (y == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + x = _PyLong_AsInt(args[1]); + if (x == -1 && PyErr_Occurred()) { goto exit; } return_value = _curses_panel_panel_move_impl(self, y, x); @@ -197,7 +214,7 @@ _curses_panel_panel_replace(PyCursesPanelObject *self, PyObject *arg) PyCursesWindowObject *win; if (!PyObject_TypeCheck(arg, &PyCursesWindow_Type)) { - _PyArg_BadArgument("replace", (&PyCursesWindow_Type)->tp_name, arg); + _PyArg_BadArgument("replace", 0, (&PyCursesWindow_Type)->tp_name, arg); goto exit; } win = (PyCursesWindowObject *)arg; @@ -271,7 +288,7 @@ _curses_panel_new_panel(PyObject *module, PyObject *arg) PyCursesWindowObject *win; if (!PyObject_TypeCheck(arg, &PyCursesWindow_Type)) { - _PyArg_BadArgument("new_panel", (&PyCursesWindow_Type)->tp_name, arg); + _PyArg_BadArgument("new_panel", 0, (&PyCursesWindow_Type)->tp_name, arg); goto exit; } win = (PyCursesWindowObject *)arg; @@ -318,4 +335,4 @@ _curses_panel_update_panels(PyObject *module, PyObject *Py_UNUSED(ignored)) { return _curses_panel_update_panels_impl(module); } -/*[clinic end generated code: output=4b211b4015e29100 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=ac1f56e6c3d4cc57 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_cursesmodule.c.h b/Modules/clinic/_cursesmodule.c.h index 01c70976c42e..0988581363e0 100644 --- 
a/Modules/clinic/_cursesmodule.c.h +++ b/Modules/clinic/_cursesmodule.c.h @@ -245,10 +245,23 @@ _curses_window_bkgd(PyCursesWindowObject *self, PyObject *const *args, Py_ssize_ PyObject *ch; long attr = A_NORMAL; - if (!_PyArg_ParseStack(args, nargs, "O|l:bkgd", - &ch, &attr)) { + if (!_PyArg_CheckPositional("bkgd", nargs, 1, 2)) { goto exit; } + ch = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + attr = PyLong_AsLong(args[1]); + if (attr == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _curses_window_bkgd_impl(self, ch, attr); exit: @@ -379,10 +392,23 @@ _curses_window_bkgdset(PyCursesWindowObject *self, PyObject *const *args, Py_ssi PyObject *ch; long attr = A_NORMAL; - if (!_PyArg_ParseStack(args, nargs, "O|l:bkgdset", - &ch, &attr)) { + if (!_PyArg_CheckPositional("bkgdset", nargs, 1, 2)) { + goto exit; + } + ch = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + attr = PyLong_AsLong(args[1]); + if (attr == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _curses_window_bkgdset_impl(self, ch, attr); exit: @@ -623,10 +649,23 @@ _curses_window_echochar(PyCursesWindowObject *self, PyObject *const *args, Py_ss PyObject *ch; long attr = A_NORMAL; - if (!_PyArg_ParseStack(args, nargs, "O|l:echochar", - &ch, &attr)) { + if (!_PyArg_CheckPositional("echochar", nargs, 1, 2)) { + goto exit; + } + ch = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + attr = PyLong_AsLong(args[1]); + if (attr == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _curses_window_echochar_impl(self, ch, attr); exit: @@ -660,8 +699,25 @@ _curses_window_enclose(PyCursesWindowObject *self, PyObject *const *args, Py_ssi int x; long _return_value; - if (!_PyArg_ParseStack(args, nargs, "ii:enclose", - &y, &x)) { + if (!_PyArg_CheckPositional("enclose", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + y = _PyLong_AsInt(args[0]); + if (y == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + x = _PyLong_AsInt(args[1]); + if (x == -1 && PyErr_Occurred()) { goto exit; } _return_value = _curses_window_enclose_impl(self, y, x); @@ -1462,8 +1518,25 @@ _curses_window_redrawln(PyCursesWindowObject *self, PyObject *const *args, Py_ss int beg; int num; - if (!_PyArg_ParseStack(args, nargs, "ii:redrawln", - &beg, &num)) { + if (!_PyArg_CheckPositional("redrawln", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + beg = _PyLong_AsInt(args[0]); + if (beg == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + num = _PyLong_AsInt(args[1]); + if (num == -1 && PyErr_Occurred()) { goto exit; } return_value = _curses_window_redrawln_impl(self, beg, num); @@ -1554,8 +1627,25 @@ 
_curses_window_setscrreg(PyCursesWindowObject *self, PyObject *const *args, Py_s int top; int bottom; - if (!_PyArg_ParseStack(args, nargs, "ii:setscrreg", - &top, &bottom)) { + if (!_PyArg_CheckPositional("setscrreg", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + top = _PyLong_AsInt(args[0]); + if (top == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + bottom = _PyLong_AsInt(args[1]); + if (bottom == -1 && PyErr_Occurred()) { goto exit; } return_value = _curses_window_setscrreg_impl(self, top, bottom); @@ -1878,10 +1968,22 @@ _curses_cbreak(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; int flag = 1; - if (!_PyArg_ParseStack(args, nargs, "|i:cbreak", - &flag)) { + if (!_PyArg_CheckPositional("cbreak", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flag = _PyLong_AsInt(args[0]); + if (flag == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _curses_cbreak_impl(module, flag); exit: @@ -2158,10 +2260,22 @@ _curses_echo(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; int flag = 1; - if (!_PyArg_ParseStack(args, nargs, "|i:echo", - &flag)) { + if (!_PyArg_CheckPositional("echo", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flag = _PyLong_AsInt(args[0]); + if (flag == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _curses_echo_impl(module, flag); exit: @@ -2321,10 +2435,65 @@ _curses_ungetmouse(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int z; unsigned long bstate; - if (!_PyArg_ParseStack(args, nargs, "hiiik:ungetmouse", - &id, &x, &y, &z, &bstate)) { + if (!_PyArg_CheckPositional("ungetmouse", nargs, 5, 5)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + long ival = PyLong_AsLong(args[0]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < SHRT_MIN) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is less than minimum"); + goto exit; + } + else if (ival > SHRT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is greater than maximum"); + goto exit; + } + else { + id = (short) ival; + } + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + x = _PyLong_AsInt(args[1]); + if (x == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + y = _PyLong_AsInt(args[2]); + if (y == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + z = _PyLong_AsInt(args[3]); + if (z == -1 && PyErr_Occurred()) { + goto exit; + } + if (!PyLong_Check(args[4])) { + _PyArg_BadArgument("ungetmouse", 5, "int", args[4]); + goto exit; + } + bstate = 
PyLong_AsUnsignedLongMask(args[4]); return_value = _curses_ungetmouse_impl(module, id, x, y, z, bstate); exit: @@ -2527,10 +2696,105 @@ _curses_init_color(PyObject *module, PyObject *const *args, Py_ssize_t nargs) short g; short b; - if (!_PyArg_ParseStack(args, nargs, "hhhh:init_color", - &color_number, &r, &g, &b)) { + if (!_PyArg_CheckPositional("init_color", nargs, 4, 4)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + long ival = PyLong_AsLong(args[0]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < SHRT_MIN) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is less than minimum"); + goto exit; + } + else if (ival > SHRT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is greater than maximum"); + goto exit; + } + else { + color_number = (short) ival; + } + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + long ival = PyLong_AsLong(args[1]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < SHRT_MIN) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is less than minimum"); + goto exit; + } + else if (ival > SHRT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is greater than maximum"); + goto exit; + } + else { + r = (short) ival; + } + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + long ival = PyLong_AsLong(args[2]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < SHRT_MIN) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is less than minimum"); + goto exit; + } + else if (ival > SHRT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is greater than maximum"); + goto exit; + } + else { + g = (short) ival; + } + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + long ival = PyLong_AsLong(args[3]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < SHRT_MIN) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is less than minimum"); + goto exit; + } + else if (ival > SHRT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is greater than maximum"); + goto exit; + } + else { + b = (short) ival; + } + } return_value = _curses_init_color_impl(module, color_number, r, g, b); exit: @@ -2568,10 +2832,81 @@ _curses_init_pair(PyObject *module, PyObject *const *args, Py_ssize_t nargs) short fg; short bg; - if (!_PyArg_ParseStack(args, nargs, "hhh:init_pair", - &pair_number, &fg, &bg)) { + if (!_PyArg_CheckPositional("init_pair", nargs, 3, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + long ival = PyLong_AsLong(args[0]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < SHRT_MIN) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is less than minimum"); + goto exit; + } + else if (ival > SHRT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is greater than maximum"); + goto exit; + } + else { + pair_number = (short) ival; + } + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got 
float" ); + goto exit; + } + { + long ival = PyLong_AsLong(args[1]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < SHRT_MIN) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is less than minimum"); + goto exit; + } + else if (ival > SHRT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is greater than maximum"); + goto exit; + } + else { + fg = (short) ival; + } + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + long ival = PyLong_AsLong(args[2]); + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + else if (ival < SHRT_MIN) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is less than minimum"); + goto exit; + } + else if (ival > SHRT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "signed short integer is greater than maximum"); + goto exit; + } + else { + bg = (short) ival; + } + } return_value = _curses_init_pair_impl(module, pair_number, fg, bg); exit: @@ -2712,8 +3047,25 @@ _curses_is_term_resized(PyObject *module, PyObject *const *args, Py_ssize_t narg int nlines; int ncols; - if (!_PyArg_ParseStack(args, nargs, "ii:is_term_resized", - &nlines, &ncols)) { + if (!_PyArg_CheckPositional("is_term_resized", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + nlines = _PyLong_AsInt(args[0]); + if (nlines == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + ncols = _PyLong_AsInt(args[1]); + if (ncols == -1 && PyErr_Occurred()) { goto exit; } return_value = _curses_is_term_resized_impl(module, nlines, ncols); @@ -2905,7 +3257,7 @@ _curses_mousemask(PyObject *module, PyObject *arg) unsigned long newmask; if (!PyLong_Check(arg)) { - _PyArg_BadArgument("mousemask", "int", arg); + _PyArg_BadArgument("mousemask", 0, "int", arg); goto exit; } newmask = PyLong_AsUnsignedLongMask(arg); @@ -2977,8 +3329,25 @@ _curses_newpad(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int nlines; int ncols; - if (!_PyArg_ParseStack(args, nargs, "ii:newpad", - &nlines, &ncols)) { + if (!_PyArg_CheckPositional("newpad", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + nlines = _PyLong_AsInt(args[0]); + if (nlines == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + ncols = _PyLong_AsInt(args[1]); + if (ncols == -1 && PyErr_Occurred()) { goto exit; } return_value = _curses_newpad_impl(module, nlines, ncols); @@ -3066,10 +3435,22 @@ _curses_nl(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; int flag = 1; - if (!_PyArg_ParseStack(args, nargs, "|i:nl", - &flag)) { + if (!_PyArg_CheckPositional("nl", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flag = _PyLong_AsInt(args[0]); + if (flag == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = _curses_nl_impl(module, flag); exit: @@ -3317,10 +3698,22 @@ _curses_qiflush(PyObject *module, PyObject *const *args, 
Py_ssize_t nargs) PyObject *return_value = NULL; int flag = 1; - if (!_PyArg_ParseStack(args, nargs, "|i:qiflush", - &flag)) { + if (!_PyArg_CheckPositional("qiflush", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flag = _PyLong_AsInt(args[0]); + if (flag == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _curses_qiflush_impl(module, flag); exit: @@ -3383,10 +3776,22 @@ _curses_raw(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; int flag = 1; - if (!_PyArg_ParseStack(args, nargs, "|i:raw", - &flag)) { + if (!_PyArg_CheckPositional("raw", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flag = _PyLong_AsInt(args[0]); + if (flag == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _curses_raw_impl(module, flag); exit: @@ -3476,8 +3881,25 @@ _curses_resizeterm(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int nlines; int ncols; - if (!_PyArg_ParseStack(args, nargs, "ii:resizeterm", - &nlines, &ncols)) { + if (!_PyArg_CheckPositional("resizeterm", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + nlines = _PyLong_AsInt(args[0]); + if (nlines == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + ncols = _PyLong_AsInt(args[1]); + if (ncols == -1 && PyErr_Occurred()) { goto exit; } return_value = _curses_resizeterm_impl(module, nlines, ncols); @@ -3520,8 +3942,25 @@ _curses_resize_term(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int nlines; int ncols; - if (!_PyArg_ParseStack(args, nargs, "ii:resize_term", - &nlines, &ncols)) { + if (!_PyArg_CheckPositional("resize_term", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + nlines = _PyLong_AsInt(args[0]); + if (nlines == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + ncols = _PyLong_AsInt(args[1]); + if (ncols == -1 && PyErr_Occurred()) { goto exit; } return_value = _curses_resize_term_impl(module, nlines, ncols); @@ -3578,8 +4017,25 @@ _curses_setsyx(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int y; int x; - if (!_PyArg_ParseStack(args, nargs, "ii:setsyx", - &y, &x)) { + if (!_PyArg_CheckPositional("setsyx", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + y = _PyLong_AsInt(args[0]); + if (y == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + x = _PyLong_AsInt(args[1]); + if (x == -1 && PyErr_Occurred()) { goto exit; } return_value = _curses_setsyx_impl(module, y, x); @@ -3676,7 +4132,7 @@ _curses_tigetflag(PyObject *module, PyObject *arg) const char *capname; if (!PyUnicode_Check(arg)) { - 
_PyArg_BadArgument("tigetflag", "str", arg); + _PyArg_BadArgument("tigetflag", 0, "str", arg); goto exit; } Py_ssize_t capname_length; @@ -3719,7 +4175,7 @@ _curses_tigetnum(PyObject *module, PyObject *arg) const char *capname; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("tigetnum", "str", arg); + _PyArg_BadArgument("tigetnum", 0, "str", arg); goto exit; } Py_ssize_t capname_length; @@ -3762,7 +4218,7 @@ _curses_tigetstr(PyObject *module, PyObject *arg) const char *capname; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("tigetstr", "str", arg); + _PyArg_BadArgument("tigetstr", 0, "str", arg); goto exit; } Py_ssize_t capname_length; @@ -4044,4 +4500,4 @@ _curses_use_default_colors(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef _CURSES_USE_DEFAULT_COLORS_METHODDEF #define _CURSES_USE_DEFAULT_COLORS_METHODDEF #endif /* !defined(_CURSES_USE_DEFAULT_COLORS_METHODDEF) */ -/*[clinic end generated code: output=a2bbced3c5d29d64 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=ceb2e32ee1370033 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_dbmmodule.c.h b/Modules/clinic/_dbmmodule.c.h index c0908623c973..e54c69cf7082 100644 --- a/Modules/clinic/_dbmmodule.c.h +++ b/Modules/clinic/_dbmmodule.c.h @@ -132,13 +132,49 @@ dbmopen(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *flags = "r"; int mode = 438; - if (!_PyArg_ParseStack(args, nargs, "U|si:open", - &filename, &flags, &mode)) { + if (!_PyArg_CheckPositional("open", nargs, 1, 3)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("open", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + filename = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("open", 2, "str", args[1]); + goto exit; + } + Py_ssize_t flags_length; + flags = PyUnicode_AsUTF8AndSize(args[1], &flags_length); + if (flags == NULL) { + goto exit; + } + if (strlen(flags) != (size_t)flags_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + mode = _PyLong_AsInt(args[2]); + if (mode == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = dbmopen_impl(module, filename, flags, mode); exit: return return_value; } -/*[clinic end generated code: output=e4585e78f5821b5b input=a9049054013a1b77]*/ +/*[clinic end generated code: output=7f5d30ef5d820b8a input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_elementtree.c.h b/Modules/clinic/_elementtree.c.h index e5107eb24fee..896898396403 100644 --- a/Modules/clinic/_elementtree.c.h +++ b/Modules/clinic/_elementtree.c.h @@ -20,7 +20,7 @@ _elementtree_Element_append(ElementObject *self, PyObject *arg) PyObject *subelement; if (!PyObject_TypeCheck(arg, &Element_Type)) { - _PyArg_BadArgument("append", (&Element_Type)->tp_name, arg); + _PyArg_BadArgument("append", 0, (&Element_Type)->tp_name, arg); goto exit; } subelement = arg; @@ -82,7 +82,7 @@ _elementtree_Element___deepcopy__(ElementObject *self, PyObject *arg) PyObject *memo; if (!PyDict_Check(arg)) { - _PyArg_BadArgument("__deepcopy__", "dict", arg); + _PyArg_BadArgument("__deepcopy__", 0, "dict", arg); goto exit; } memo = arg; @@ -420,10 +420,31 @@ _elementtree_Element_insert(ElementObject *self, PyObject *const *args, Py_ssize Py_ssize_t index; PyObject *subelement; - if 
(!_PyArg_ParseStack(args, nargs, "nO!:insert", - &index, &Element_Type, &subelement)) { + if (!_PyArg_CheckPositional("insert", nargs, 2, 2)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + index = ival; + } + if (!PyObject_TypeCheck(args[1], &Element_Type)) { + _PyArg_BadArgument("insert", 2, (&Element_Type)->tp_name, args[1]); + goto exit; + } + subelement = args[1]; return_value = _elementtree_Element_insert_impl(self, index, subelement); exit: @@ -512,7 +533,7 @@ _elementtree_Element_remove(ElementObject *self, PyObject *arg) PyObject *subelement; if (!PyObject_TypeCheck(arg, &Element_Type)) { - _PyArg_BadArgument("remove", (&Element_Type)->tp_name, arg); + _PyArg_BadArgument("remove", 0, (&Element_Type)->tp_name, arg); goto exit; } subelement = arg; @@ -723,4 +744,4 @@ _elementtree_XMLParser__setevents(XMLParserObject *self, PyObject *const *args, exit: return return_value; } -/*[clinic end generated code: output=398640585689c5ed input=a9049054013a1b77]*/ +/*[clinic end generated code: output=6bbedd24b709dc00 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_gdbmmodule.c.h b/Modules/clinic/_gdbmmodule.c.h index 5edc440b181e..9475264ef9d9 100644 --- a/Modules/clinic/_gdbmmodule.c.h +++ b/Modules/clinic/_gdbmmodule.c.h @@ -245,13 +245,49 @@ dbmopen(PyObject *module, PyObject *const *args, Py_ssize_t nargs) const char *flags = "r"; int mode = 438; - if (!_PyArg_ParseStack(args, nargs, "U|si:open", - &filename, &flags, &mode)) { + if (!_PyArg_CheckPositional("open", nargs, 1, 3)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("open", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + filename = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("open", 2, "str", args[1]); + goto exit; + } + Py_ssize_t flags_length; + flags = PyUnicode_AsUTF8AndSize(args[1], &flags_length); + if (flags == NULL) { + goto exit; + } + if (strlen(flags) != (size_t)flags_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + mode = _PyLong_AsInt(args[2]); + if (mode == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = dbmopen_impl(module, filename, flags, mode); exit: return return_value; } -/*[clinic end generated code: output=5ca4361417bf96cb input=a9049054013a1b77]*/ +/*[clinic end generated code: output=05f06065d2dc1f9e input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_lzmamodule.c.h b/Modules/clinic/_lzmamodule.c.h index 46d2988ad3e1..aeb98a66039e 100644 --- a/Modules/clinic/_lzmamodule.c.h +++ b/Modules/clinic/_lzmamodule.c.h @@ -29,7 +29,7 @@ _lzma_LZMACompressor_compress(Compressor *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&data, 'C')) { - _PyArg_BadArgument("compress", "contiguous buffer", arg); + _PyArg_BadArgument("compress", 0, "contiguous buffer", arg); goto exit; } return_value = _lzma_LZMACompressor_compress_impl(self, &data); @@ -252,8 +252,17 @@ _lzma__decode_filter_properties(PyObject *module, PyObject *const 
*args, Py_ssiz lzma_vli filter_id; Py_buffer encoded_props = {NULL, NULL}; - if (!_PyArg_ParseStack(args, nargs, "O&y*:_decode_filter_properties", - lzma_vli_converter, &filter_id, &encoded_props)) { + if (!_PyArg_CheckPositional("_decode_filter_properties", nargs, 2, 2)) { + goto exit; + } + if (!lzma_vli_converter(args[0], &filter_id)) { + goto exit; + } + if (PyObject_GetBuffer(args[1], &encoded_props, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&encoded_props, 'C')) { + _PyArg_BadArgument("_decode_filter_properties", 2, "contiguous buffer", args[1]); goto exit; } return_value = _lzma__decode_filter_properties_impl(module, filter_id, &encoded_props); @@ -266,4 +275,4 @@ _lzma__decode_filter_properties(PyObject *module, PyObject *const *args, Py_ssiz return return_value; } -/*[clinic end generated code: output=df061bfc2067a90a input=a9049054013a1b77]*/ +/*[clinic end generated code: output=47e4732df79509ad input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_operator.c.h b/Modules/clinic/_operator.c.h index 9d7b20c1cc0e..72c0e57ea2bb 100644 --- a/Modules/clinic/_operator.c.h +++ b/Modules/clinic/_operator.c.h @@ -1416,10 +1416,31 @@ _operator_length_hint(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t default_value = 0; Py_ssize_t _return_value; - if (!_PyArg_ParseStack(args, nargs, "O|n:length_hint", - &obj, &default_value)) { + if (!_PyArg_CheckPositional("length_hint", nargs, 1, 2)) { goto exit; } + obj = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[1]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + default_value = ival; + } +skip_optional: _return_value = _operator_length_hint_impl(module, obj, default_value); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -1469,4 +1490,4 @@ _operator__compare_digest(PyObject *module, PyObject *const *args, Py_ssize_t na exit: return return_value; } -/*[clinic end generated code: output=424b884884ab20b7 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=b382bece80a5a254 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_ssl.c.h b/Modules/clinic/_ssl.c.h index e149077edee6..3ec30c3c9850 100644 --- a/Modules/clinic/_ssl.c.h +++ b/Modules/clinic/_ssl.c.h @@ -71,10 +71,17 @@ _ssl__SSLSocket_getpeercert(PySSLSocket *self, PyObject *const *args, Py_ssize_t PyObject *return_value = NULL; int binary_mode = 0; - if (!_PyArg_ParseStack(args, nargs, "|p:getpeercert", - &binary_mode)) { + if (!_PyArg_CheckPositional("getpeercert", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + binary_mode = PyObject_IsTrue(args[0]); + if (binary_mode < 0) { + goto exit; + } +skip_optional: return_value = _ssl__SSLSocket_getpeercert_impl(self, binary_mode); exit: @@ -215,7 +222,7 @@ _ssl__SSLSocket_write(PySSLSocket *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&b, 'C')) { - _PyArg_BadArgument("write", "contiguous buffer", arg); + _PyArg_BadArgument("write", 0, "contiguous buffer", arg); goto exit; } return_value = _ssl__SSLSocket_write_impl(self, &b); @@ -377,8 +384,16 @@ _ssl__SSLContext(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("_SSLContext", kwargs)) { goto exit; } - if (!PyArg_ParseTuple(args, "i:_SSLContext", - &proto_version)) { 
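/*
 * [Editorial sketch, not part of the commit above.]  Every regenerated
 * hunk in this mail follows the same shape: the single
 * _PyArg_ParseStack(..., "format", ...) call driven by a format string is
 * replaced by _PyArg_CheckPositional() plus per-argument converters that
 * Argument Clinic now inlines.  A minimal illustration of that shape for
 * a hypothetical demo(width: int, obj: object) -- the function and names
 * are invented here, and the private _PyArg_ and _PyLong_ helpers are
 * assumed to be in scope as they are in these generated files:
 */
static PyObject *
demo(PyObject *module, PyObject *const *args, Py_ssize_t nargs)
{
    PyObject *return_value = NULL;
    int width;
    PyObject *obj;

    /* old form: _PyArg_ParseStack(args, nargs, "iO:demo", &width, &obj) */
    if (!_PyArg_CheckPositional("demo", nargs, 2, 2)) {
        goto exit;
    }
    /* "i" becomes an explicit float rejection plus _PyLong_AsInt() */
    if (PyFloat_Check(args[0])) {
        PyErr_SetString(PyExc_TypeError,
                        "integer argument expected, got float");
        goto exit;
    }
    width = _PyLong_AsInt(args[0]);
    if (width == -1 && PyErr_Occurred()) {
        goto exit;
    }
    /* "O" is simply a borrowed reference to the stack slot */
    obj = args[1];
    /* a real clinic wrapper would call demo_impl(module, width, obj) here */
    return_value = Py_BuildValue("iO", width, obj);

exit:
    return return_value;
}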
+ if (!_PyArg_CheckPositional("_SSLContext", PyTuple_GET_SIZE(args), 1, 1)) { + goto exit; + } + if (PyFloat_Check(PyTuple_GET_ITEM(args, 0))) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + proto_version = _PyLong_AsInt(PyTuple_GET_ITEM(args, 0)); + if (proto_version == -1 && PyErr_Occurred()) { goto exit; } return_value = _ssl__SSLContext_impl(type, proto_version); @@ -405,7 +420,7 @@ _ssl__SSLContext_set_ciphers(PySSLContext *self, PyObject *arg) const char *cipherlist; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("set_ciphers", "str", arg); + _PyArg_BadArgument("set_ciphers", 0, "str", arg); goto exit; } Py_ssize_t cipherlist_length; @@ -466,7 +481,7 @@ _ssl__SSLContext__set_npn_protocols(PySSLContext *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&protos, 'C')) { - _PyArg_BadArgument("_set_npn_protocols", "contiguous buffer", arg); + _PyArg_BadArgument("_set_npn_protocols", 0, "contiguous buffer", arg); goto exit; } return_value = _ssl__SSLContext__set_npn_protocols_impl(self, &protos); @@ -502,7 +517,7 @@ _ssl__SSLContext__set_alpn_protocols(PySSLContext *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&protos, 'C')) { - _PyArg_BadArgument("_set_alpn_protocols", "contiguous buffer", arg); + _PyArg_BadArgument("_set_alpn_protocols", 0, "contiguous buffer", arg); goto exit; } return_value = _ssl__SSLContext__set_alpn_protocols_impl(self, &protos); @@ -815,10 +830,22 @@ _ssl_MemoryBIO_read(PySSLMemoryBIO *self, PyObject *const *args, Py_ssize_t narg PyObject *return_value = NULL; int len = -1; - if (!_PyArg_ParseStack(args, nargs, "|i:read", - &len)) { + if (!_PyArg_CheckPositional("read", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + len = _PyLong_AsInt(args[0]); + if (len == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _ssl_MemoryBIO_read_impl(self, len); exit: @@ -849,7 +876,7 @@ _ssl_MemoryBIO_write(PySSLMemoryBIO *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&b, 'C')) { - _PyArg_BadArgument("write", "contiguous buffer", arg); + _PyArg_BadArgument("write", 0, "contiguous buffer", arg); goto exit; } return_value = _ssl_MemoryBIO_write_impl(self, &b); @@ -905,8 +932,28 @@ _ssl_RAND_add(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer view = {NULL, NULL}; double entropy; - if (!_PyArg_ParseStack(args, nargs, "s*d:RAND_add", - &view, &entropy)) { + if (!_PyArg_CheckPositional("RAND_add", nargs, 2, 2)) { + goto exit; + } + if (PyUnicode_Check(args[0])) { + Py_ssize_t len; + const char *ptr = PyUnicode_AsUTF8AndSize(args[0], &len); + if (ptr == NULL) { + goto exit; + } + PyBuffer_FillInfo(&view, args[0], (void *)ptr, len, 1, 0); + } + else { /* any bytes-like object */ + if (PyObject_GetBuffer(args[0], &view, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&view, 'C')) { + _PyArg_BadArgument("RAND_add", 1, "contiguous buffer", args[0]); + goto exit; + } + } + entropy = PyFloat_AsDouble(args[1]); + if (PyErr_Occurred()) { goto exit; } return_value = _ssl_RAND_add_impl(module, &view, entropy); @@ -1237,4 +1284,4 @@ _ssl_enum_crls(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObje #ifndef _SSL_ENUM_CRLS_METHODDEF #define _SSL_ENUM_CRLS_METHODDEF #endif /* !defined(_SSL_ENUM_CRLS_METHODDEF) */ -/*[clinic end generated code: output=c2dca2ef4cbef4e2 
input=a9049054013a1b77]*/ +/*[clinic end generated code: output=ac3fb15ca27500f2 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_struct.c.h b/Modules/clinic/_struct.c.h index 3a186e4c98f9..2cc2d216f8b4 100644 --- a/Modules/clinic/_struct.c.h +++ b/Modules/clinic/_struct.c.h @@ -61,7 +61,7 @@ Struct_unpack(PyStructObject *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("unpack", "contiguous buffer", arg); + _PyArg_BadArgument("unpack", 0, "contiguous buffer", arg); goto exit; } return_value = Struct_unpack_impl(self, &buffer); @@ -209,8 +209,17 @@ unpack(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyStructObject *s_object = NULL; Py_buffer buffer = {NULL, NULL}; - if (!_PyArg_ParseStack(args, nargs, "O&y*:unpack", - cache_struct_converter, &s_object, &buffer)) { + if (!_PyArg_CheckPositional("unpack", nargs, 2, 2)) { + goto exit; + } + if (!cache_struct_converter(args[0], &s_object)) { + goto exit; + } + if (PyObject_GetBuffer(args[1], &buffer, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&buffer, 'C')) { + _PyArg_BadArgument("unpack", 2, "contiguous buffer", args[1]); goto exit; } return_value = unpack_impl(module, s_object, &buffer); @@ -295,10 +304,13 @@ iter_unpack(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyStructObject *s_object = NULL; PyObject *buffer; - if (!_PyArg_ParseStack(args, nargs, "O&O:iter_unpack", - cache_struct_converter, &s_object, &buffer)) { + if (!_PyArg_CheckPositional("iter_unpack", nargs, 2, 2)) { + goto exit; + } + if (!cache_struct_converter(args[0], &s_object)) { goto exit; } + buffer = args[1]; return_value = iter_unpack_impl(module, s_object, buffer); exit: @@ -307,4 +319,4 @@ iter_unpack(PyObject *module, PyObject *const *args, Py_ssize_t nargs) return return_value; } -/*[clinic end generated code: output=01516bea2641fe01 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=ac595db9d2b271aa input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_tkinter.c.h b/Modules/clinic/_tkinter.c.h index 1eedd2bd3bc8..0a3e7ff5fc96 100644 --- a/Modules/clinic/_tkinter.c.h +++ b/Modules/clinic/_tkinter.c.h @@ -20,7 +20,7 @@ _tkinter_tkapp_eval(TkappObject *self, PyObject *arg) const char *script; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("eval", "str", arg); + _PyArg_BadArgument("eval", 0, "str", arg); goto exit; } Py_ssize_t script_length; @@ -56,7 +56,7 @@ _tkinter_tkapp_evalfile(TkappObject *self, PyObject *arg) const char *fileName; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("evalfile", "str", arg); + _PyArg_BadArgument("evalfile", 0, "str", arg); goto exit; } Py_ssize_t fileName_length; @@ -92,7 +92,7 @@ _tkinter_tkapp_record(TkappObject *self, PyObject *arg) const char *script; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("record", "str", arg); + _PyArg_BadArgument("record", 0, "str", arg); goto exit; } Py_ssize_t script_length; @@ -128,7 +128,7 @@ _tkinter_tkapp_adderrorinfo(TkappObject *self, PyObject *arg) const char *msg; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("adderrorinfo", "str", arg); + _PyArg_BadArgument("adderrorinfo", 0, "str", arg); goto exit; } Py_ssize_t msg_length; @@ -188,7 +188,7 @@ _tkinter_tkapp_exprstring(TkappObject *self, PyObject *arg) const char *s; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("exprstring", "str", arg); + _PyArg_BadArgument("exprstring", 0, "str", arg); goto exit; } Py_ssize_t s_length; @@ -224,7 +224,7 @@ _tkinter_tkapp_exprlong(TkappObject *self, PyObject *arg) const 
char *s; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("exprlong", "str", arg); + _PyArg_BadArgument("exprlong", 0, "str", arg); goto exit; } Py_ssize_t s_length; @@ -260,7 +260,7 @@ _tkinter_tkapp_exprdouble(TkappObject *self, PyObject *arg) const char *s; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("exprdouble", "str", arg); + _PyArg_BadArgument("exprdouble", 0, "str", arg); goto exit; } Py_ssize_t s_length; @@ -296,7 +296,7 @@ _tkinter_tkapp_exprboolean(TkappObject *self, PyObject *arg) const char *s; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("exprboolean", "str", arg); + _PyArg_BadArgument("exprboolean", 0, "str", arg); goto exit; } Py_ssize_t s_length; @@ -349,10 +349,23 @@ _tkinter_tkapp_createcommand(TkappObject *self, PyObject *const *args, Py_ssize_ const char *name; PyObject *func; - if (!_PyArg_ParseStack(args, nargs, "sO:createcommand", - &name, &func)) { + if (!_PyArg_CheckPositional("createcommand", nargs, 2, 2)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("createcommand", 1, "str", args[0]); + goto exit; + } + Py_ssize_t name_length; + name = PyUnicode_AsUTF8AndSize(args[0], &name_length); + if (name == NULL) { + goto exit; + } + if (strlen(name) != (size_t)name_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + func = args[1]; return_value = _tkinter_tkapp_createcommand_impl(self, name, func); exit: @@ -377,7 +390,7 @@ _tkinter_tkapp_deletecommand(TkappObject *self, PyObject *arg) const char *name; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("deletecommand", "str", arg); + _PyArg_BadArgument("deletecommand", 0, "str", arg); goto exit; } Py_ssize_t name_length; @@ -417,10 +430,20 @@ _tkinter_tkapp_createfilehandler(TkappObject *self, PyObject *const *args, Py_ss int mask; PyObject *func; - if (!_PyArg_ParseStack(args, nargs, "OiO:createfilehandler", - &file, &mask, &func)) { + if (!_PyArg_CheckPositional("createfilehandler", nargs, 3, 3)) { goto exit; } + file = args[0]; + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + mask = _PyLong_AsInt(args[1]); + if (mask == -1 && PyErr_Occurred()) { + goto exit; + } + func = args[2]; return_value = _tkinter_tkapp_createfilehandler_impl(self, file, mask, func); exit: @@ -477,10 +500,19 @@ _tkinter_tkapp_createtimerhandler(TkappObject *self, PyObject *const *args, Py_s int milliseconds; PyObject *func; - if (!_PyArg_ParseStack(args, nargs, "iO:createtimerhandler", - &milliseconds, &func)) { + if (!_PyArg_CheckPositional("createtimerhandler", nargs, 2, 2)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + milliseconds = _PyLong_AsInt(args[0]); + if (milliseconds == -1 && PyErr_Occurred()) { + goto exit; + } + func = args[1]; return_value = _tkinter_tkapp_createtimerhandler_impl(self, milliseconds, func); exit: @@ -504,10 +536,22 @@ _tkinter_tkapp_mainloop(TkappObject *self, PyObject *const *args, Py_ssize_t nar PyObject *return_value = NULL; int threshold = 0; - if (!_PyArg_ParseStack(args, nargs, "|i:mainloop", - &threshold)) { + if (!_PyArg_CheckPositional("mainloop", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + threshold = _PyLong_AsInt(args[0]); + if (threshold == -1 && PyErr_Occurred()) { + goto exit; + } 
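/*
 * [Editorial sketch, not part of the commit above.]  Optional positional
 * parameters, such as the mainloop(threshold=0) wrapper being rewritten
 * here, lose the "|" marker of the old format string: the generated code
 * now counts nargs and jumps over the converters for omitted arguments.
 * A minimal hypothetical example with one optional int (names invented):
 */
static PyObject *
demo_opt(PyObject *module, PyObject *const *args, Py_ssize_t nargs)
{
    PyObject *return_value = NULL;
    int flags = 0;                  /* default used when the argument is omitted */

    if (!_PyArg_CheckPositional("demo_opt", nargs, 0, 1)) {
        goto exit;
    }
    if (nargs < 1) {
        goto skip_optional;         /* keep the default value */
    }
    if (PyFloat_Check(args[0])) {
        PyErr_SetString(PyExc_TypeError,
                        "integer argument expected, got float");
        goto exit;
    }
    flags = _PyLong_AsInt(args[0]);
    if (flags == -1 && PyErr_Occurred()) {
        goto exit;
    }
skip_optional:
    /* a real wrapper would pass flags on to its impl function here */
    return_value = PyLong_FromLong(flags);

exit:
    return return_value;
}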
+skip_optional: return_value = _tkinter_tkapp_mainloop_impl(self, threshold); exit: @@ -531,10 +575,22 @@ _tkinter_tkapp_dooneevent(TkappObject *self, PyObject *const *args, Py_ssize_t n PyObject *return_value = NULL; int flags = 0; - if (!_PyArg_ParseStack(args, nargs, "|i:dooneevent", - &flags)) { + if (!_PyArg_CheckPositional("dooneevent", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + flags = _PyLong_AsInt(args[0]); + if (flags == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _tkinter_tkapp_dooneevent_impl(self, flags); exit: @@ -654,10 +710,132 @@ _tkinter_create(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int sync = 0; const char *use = NULL; - if (!_PyArg_ParseStack(args, nargs, "|zssiiiiz:create", - &screenName, &baseName, &className, &interactive, &wantobjects, &wantTk, &sync, &use)) { + if (!_PyArg_CheckPositional("create", nargs, 0, 8)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (args[0] == Py_None) { + screenName = NULL; + } + else if (PyUnicode_Check(args[0])) { + Py_ssize_t screenName_length; + screenName = PyUnicode_AsUTF8AndSize(args[0], &screenName_length); + if (screenName == NULL) { + goto exit; + } + if (strlen(screenName) != (size_t)screenName_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("create", 1, "str or None", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("create", 2, "str", args[1]); + goto exit; + } + Py_ssize_t baseName_length; + baseName = PyUnicode_AsUTF8AndSize(args[1], &baseName_length); + if (baseName == NULL) { + goto exit; + } + if (strlen(baseName) != (size_t)baseName_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (!PyUnicode_Check(args[2])) { + _PyArg_BadArgument("create", 3, "str", args[2]); + goto exit; + } + Py_ssize_t className_length; + className = PyUnicode_AsUTF8AndSize(args[2], &className_length); + if (className == NULL) { + goto exit; + } + if (strlen(className) != (size_t)className_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + if (nargs < 4) { + goto skip_optional; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + interactive = _PyLong_AsInt(args[3]); + if (interactive == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 5) { + goto skip_optional; + } + if (PyFloat_Check(args[4])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + wantobjects = _PyLong_AsInt(args[4]); + if (wantobjects == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 6) { + goto skip_optional; + } + if (PyFloat_Check(args[5])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + wantTk = _PyLong_AsInt(args[5]); + if (wantTk == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 7) { + goto skip_optional; + } + if (PyFloat_Check(args[6])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + sync = _PyLong_AsInt(args[6]); + if (sync == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 8) { + goto 
skip_optional; + } + if (args[7] == Py_None) { + use = NULL; + } + else if (PyUnicode_Check(args[7])) { + Py_ssize_t use_length; + use = PyUnicode_AsUTF8AndSize(args[7], &use_length); + if (use == NULL) { + goto exit; + } + if (strlen(use) != (size_t)use_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("create", 8, "str or None", args[7]); goto exit; } +skip_optional: return_value = _tkinter_create_impl(module, screenName, baseName, className, interactive, wantobjects, wantTk, sync, use); exit: @@ -734,4 +912,4 @@ _tkinter_getbusywaitinterval(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef _TKINTER_TKAPP_DELETEFILEHANDLER_METHODDEF #define _TKINTER_TKAPP_DELETEFILEHANDLER_METHODDEF #endif /* !defined(_TKINTER_TKAPP_DELETEFILEHANDLER_METHODDEF) */ -/*[clinic end generated code: output=d84b0e794824c511 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=2cf95f0101f3dbca input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_tracemalloc.c.h b/Modules/clinic/_tracemalloc.c.h index aa736ec2fd0c..68fafdc3833d 100644 --- a/Modules/clinic/_tracemalloc.c.h +++ b/Modules/clinic/_tracemalloc.c.h @@ -95,10 +95,22 @@ _tracemalloc_start(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; int nframe = 1; - if (!_PyArg_ParseStack(args, nargs, "|i:start", - &nframe)) { + if (!_PyArg_CheckPositional("start", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + nframe = _PyLong_AsInt(args[0]); + if (nframe == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = _tracemalloc_start_impl(module, nframe); exit: @@ -185,4 +197,4 @@ _tracemalloc_get_traced_memory(PyObject *module, PyObject *Py_UNUSED(ignored)) { return _tracemalloc_get_traced_memory_impl(module); } -/*[clinic end generated code: output=d4a2dd3eaba9f72d input=a9049054013a1b77]*/ +/*[clinic end generated code: output=1bc96dc569706afa input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_weakref.c.h b/Modules/clinic/_weakref.c.h index 21f00ffd205c..aa0d77f7180e 100644 --- a/Modules/clinic/_weakref.c.h +++ b/Modules/clinic/_weakref.c.h @@ -50,13 +50,18 @@ _weakref__remove_dead_weakref(PyObject *module, PyObject *const *args, Py_ssize_ PyObject *dct; PyObject *key; - if (!_PyArg_ParseStack(args, nargs, "O!O:_remove_dead_weakref", - &PyDict_Type, &dct, &key)) { + if (!_PyArg_CheckPositional("_remove_dead_weakref", nargs, 2, 2)) { goto exit; } + if (!PyDict_Check(args[0])) { + _PyArg_BadArgument("_remove_dead_weakref", 1, "dict", args[0]); + goto exit; + } + dct = args[0]; + key = args[1]; return_value = _weakref__remove_dead_weakref_impl(module, dct, key); exit: return return_value; } -/*[clinic end generated code: output=927e889feb8a7dc4 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=eae22e2d2e43120e input=a9049054013a1b77]*/ diff --git a/Modules/clinic/arraymodule.c.h b/Modules/clinic/arraymodule.c.h index eb081ad0718e..5f45b7cf6734 100644 --- a/Modules/clinic/arraymodule.c.h +++ b/Modules/clinic/arraymodule.c.h @@ -76,10 +76,30 @@ array_array_pop(arrayobject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t i = -1; - if (!_PyArg_ParseStack(args, nargs, "|n:pop", - &i)) { + if (!_PyArg_CheckPositional("pop", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if 
(PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + i = ival; + } +skip_optional: return_value = array_array_pop_impl(self, i); exit: @@ -114,10 +134,27 @@ array_array_insert(arrayobject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t i; PyObject *v; - if (!_PyArg_ParseStack(args, nargs, "nO:insert", - &i, &v)) { + if (!_PyArg_CheckPositional("insert", nargs, 2, 2)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + i = ival; + } + v = args[1]; return_value = array_array_insert_impl(self, i, v); exit: @@ -212,10 +249,27 @@ array_array_fromfile(arrayobject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *f; Py_ssize_t n; - if (!_PyArg_ParseStack(args, nargs, "On:fromfile", - &f, &n)) { + if (!_PyArg_CheckPositional("fromfile", nargs, 2, 2)) { + goto exit; + } + f = args[0]; + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[1]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + n = ival; + } return_value = array_array_fromfile_impl(self, f, n); exit: @@ -291,7 +345,7 @@ array_array_fromstring(arrayobject *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("fromstring", "contiguous buffer", arg); + _PyArg_BadArgument("fromstring", 0, "contiguous buffer", arg); goto exit; } } @@ -328,7 +382,7 @@ array_array_frombytes(arrayobject *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&buffer, 'C')) { - _PyArg_BadArgument("frombytes", "contiguous buffer", arg); + _PyArg_BadArgument("frombytes", 0, "contiguous buffer", arg); goto exit; } return_value = array_array_frombytes_impl(self, &buffer); @@ -478,10 +532,32 @@ array__array_reconstructor(PyObject *module, PyObject *const *args, Py_ssize_t n enum machine_format_code mformat_code; PyObject *items; - if (!_PyArg_ParseStack(args, nargs, "OCiO:_array_reconstructor", - &arraytype, &typecode, &mformat_code, &items)) { + if (!_PyArg_CheckPositional("_array_reconstructor", nargs, 4, 4)) { + goto exit; + } + arraytype = (PyTypeObject *)args[0]; + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("_array_reconstructor", 2, "a unicode character", args[1]); + goto exit; + } + if (PyUnicode_READY(args[1])) { + goto exit; + } + if (PyUnicode_GET_LENGTH(args[1]) != 1) { + _PyArg_BadArgument("_array_reconstructor", 2, "a unicode character", args[1]); + goto exit; + } + typecode = PyUnicode_READ_CHAR(args[1], 0); + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + mformat_code = _PyLong_AsInt(args[2]); + if (mformat_code == -1 && PyErr_Occurred()) { goto exit; } + items = args[3]; return_value = array__array_reconstructor_impl(module, arraytype, typecode, mformat_code, items); exit: @@ -523,4 +599,4 @@ 
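/*
 * [Editorial sketch, not part of the commit above.]  Where the old format
 * string used "n" (a Py_ssize_t, e.g. array.pop() and array.insert() in
 * the hunks above), the regenerated code inlines a conversion through
 * PyNumber_Index(), so objects implementing __index__ are still accepted
 * while floats are rejected.  A hypothetical helper showing just that
 * step (the name convert_ssize_arg is invented):
 */
static int
convert_ssize_arg(PyObject *arg, Py_ssize_t *result)
{
    if (PyFloat_Check(arg)) {
        PyErr_SetString(PyExc_TypeError,
                        "integer argument expected, got float");
        return 0;
    }
    {
        Py_ssize_t ival = -1;
        PyObject *iobj = PyNumber_Index(arg);   /* calls __index__ if needed */
        if (iobj != NULL) {
            ival = PyLong_AsSsize_t(iobj);
            Py_DECREF(iobj);
        }
        if (ival == -1 && PyErr_Occurred()) {
            return 0;       /* overflow, or not an integer at all */
        }
        *result = ival;
    }
    return 1;               /* success: *result holds the converted value */
}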
PyDoc_STRVAR(array_arrayiterator___setstate____doc__, #define ARRAY_ARRAYITERATOR___SETSTATE___METHODDEF \ {"__setstate__", (PyCFunction)array_arrayiterator___setstate__, METH_O, array_arrayiterator___setstate____doc__}, -/*[clinic end generated code: output=15da19d2ece09d22 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=c9a40f11f1a866fb input=a9049054013a1b77]*/ diff --git a/Modules/clinic/audioop.c.h b/Modules/clinic/audioop.c.h index 1eae9aaf5d80..4ea7155373be 100644 --- a/Modules/clinic/audioop.c.h +++ b/Modules/clinic/audioop.c.h @@ -23,10 +23,42 @@ audioop_getsample(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int width; Py_ssize_t index; - if (!_PyArg_ParseStack(args, nargs, "y*in:getsample", - &fragment, &width, &index)) { + if (!_PyArg_CheckPositional("getsample", nargs, 3, 3)) { goto exit; } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("getsample", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[2]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + index = ival; + } return_value = audioop_getsample_impl(module, &fragment, width, index); exit: @@ -57,8 +89,23 @@ audioop_max(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:max", - &fragment, &width)) { + if (!_PyArg_CheckPositional("max", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("max", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_max_impl(module, &fragment, width); @@ -91,8 +138,23 @@ audioop_minmax(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:minmax", - &fragment, &width)) { + if (!_PyArg_CheckPositional("minmax", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("minmax", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_minmax_impl(module, &fragment, width); @@ -125,8 +187,23 @@ audioop_avg(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:avg", - &fragment, &width)) { + if (!_PyArg_CheckPositional("avg", nargs, 2, 2)) { + goto exit; + } + if 
(PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("avg", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_avg_impl(module, &fragment, width); @@ -159,8 +236,23 @@ audioop_rms(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:rms", - &fragment, &width)) { + if (!_PyArg_CheckPositional("rms", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("rms", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_rms_impl(module, &fragment, width); @@ -194,8 +286,21 @@ audioop_findfit(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; Py_buffer reference = {NULL, NULL}; - if (!_PyArg_ParseStack(args, nargs, "y*y*:findfit", - &fragment, &reference)) { + if (!_PyArg_CheckPositional("findfit", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("findfit", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyObject_GetBuffer(args[1], &reference, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&reference, 'C')) { + _PyArg_BadArgument("findfit", 2, "contiguous buffer", args[1]); goto exit; } return_value = audioop_findfit_impl(module, &fragment, &reference); @@ -233,8 +338,21 @@ audioop_findfactor(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; Py_buffer reference = {NULL, NULL}; - if (!_PyArg_ParseStack(args, nargs, "y*y*:findfactor", - &fragment, &reference)) { + if (!_PyArg_CheckPositional("findfactor", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("findfactor", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyObject_GetBuffer(args[1], &reference, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&reference, 'C')) { + _PyArg_BadArgument("findfactor", 2, "contiguous buffer", args[1]); goto exit; } return_value = audioop_findfactor_impl(module, &fragment, &reference); @@ -272,10 +390,33 @@ audioop_findmax(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; Py_ssize_t length; - if (!_PyArg_ParseStack(args, nargs, "y*n:findmax", - &fragment, &length)) { + if (!_PyArg_CheckPositional("findmax", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("findmax", 1, "contiguous buffer", args[0]); goto exit; } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + 
Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[1]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + length = ival; + } return_value = audioop_findmax_impl(module, &fragment, length); exit: @@ -306,8 +447,23 @@ audioop_avgpp(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:avgpp", - &fragment, &width)) { + if (!_PyArg_CheckPositional("avgpp", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("avgpp", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_avgpp_impl(module, &fragment, width); @@ -340,8 +496,23 @@ audioop_maxpp(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:maxpp", - &fragment, &width)) { + if (!_PyArg_CheckPositional("maxpp", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("maxpp", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_maxpp_impl(module, &fragment, width); @@ -374,8 +545,23 @@ audioop_cross(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:cross", - &fragment, &width)) { + if (!_PyArg_CheckPositional("cross", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("cross", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_cross_impl(module, &fragment, width); @@ -410,8 +596,27 @@ audioop_mul(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int width; double factor; - if (!_PyArg_ParseStack(args, nargs, "y*id:mul", - &fragment, &width, &factor)) { + if (!_PyArg_CheckPositional("mul", nargs, 3, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("mul", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { + goto exit; + } + factor = PyFloat_AsDouble(args[2]); + if (PyErr_Occurred()) { goto exit; } return_value = audioop_mul_impl(module, &fragment, width, factor); @@ -447,8 +652,31 @@ audioop_tomono(PyObject *module, PyObject *const *args, 
Py_ssize_t nargs) double lfactor; double rfactor; - if (!_PyArg_ParseStack(args, nargs, "y*idd:tomono", - &fragment, &width, &lfactor, &rfactor)) { + if (!_PyArg_CheckPositional("tomono", nargs, 4, 4)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("tomono", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { + goto exit; + } + lfactor = PyFloat_AsDouble(args[2]); + if (PyErr_Occurred()) { + goto exit; + } + rfactor = PyFloat_AsDouble(args[3]); + if (PyErr_Occurred()) { goto exit; } return_value = audioop_tomono_impl(module, &fragment, width, lfactor, rfactor); @@ -484,8 +712,31 @@ audioop_tostereo(PyObject *module, PyObject *const *args, Py_ssize_t nargs) double lfactor; double rfactor; - if (!_PyArg_ParseStack(args, nargs, "y*idd:tostereo", - &fragment, &width, &lfactor, &rfactor)) { + if (!_PyArg_CheckPositional("tostereo", nargs, 4, 4)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("tostereo", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { + goto exit; + } + lfactor = PyFloat_AsDouble(args[2]); + if (PyErr_Occurred()) { + goto exit; + } + rfactor = PyFloat_AsDouble(args[3]); + if (PyErr_Occurred()) { goto exit; } return_value = audioop_tostereo_impl(module, &fragment, width, lfactor, rfactor); @@ -520,8 +771,30 @@ audioop_add(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment2 = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*y*i:add", - &fragment1, &fragment2, &width)) { + if (!_PyArg_CheckPositional("add", nargs, 3, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment1, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment1, 'C')) { + _PyArg_BadArgument("add", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyObject_GetBuffer(args[1], &fragment2, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment2, 'C')) { + _PyArg_BadArgument("add", 2, "contiguous buffer", args[1]); + goto exit; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[2]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_add_impl(module, &fragment1, &fragment2, width); @@ -559,8 +832,32 @@ audioop_bias(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int width; int bias; - if (!_PyArg_ParseStack(args, nargs, "y*ii:bias", - &fragment, &width, &bias)) { + if (!_PyArg_CheckPositional("bias", nargs, 3, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("bias", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && 
PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + bias = _PyLong_AsInt(args[2]); + if (bias == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_bias_impl(module, &fragment, width, bias); @@ -593,8 +890,23 @@ audioop_reverse(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:reverse", - &fragment, &width)) { + if (!_PyArg_CheckPositional("reverse", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("reverse", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_reverse_impl(module, &fragment, width); @@ -627,8 +939,23 @@ audioop_byteswap(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:byteswap", - &fragment, &width)) { + if (!_PyArg_CheckPositional("byteswap", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("byteswap", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_byteswap_impl(module, &fragment, width); @@ -663,8 +990,32 @@ audioop_lin2lin(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int width; int newwidth; - if (!_PyArg_ParseStack(args, nargs, "y*ii:lin2lin", - &fragment, &width, &newwidth)) { + if (!_PyArg_CheckPositional("lin2lin", nargs, 3, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("lin2lin", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + newwidth = _PyLong_AsInt(args[2]); + if (newwidth == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_lin2lin_impl(module, &fragment, width, newwidth); @@ -706,10 +1057,78 @@ audioop_ratecv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int weightA = 1; int weightB = 0; - if (!_PyArg_ParseStack(args, nargs, "y*iiiiO|ii:ratecv", - &fragment, &width, &nchannels, &inrate, &outrate, &state, &weightA, &weightB)) { + if (!_PyArg_CheckPositional("ratecv", nargs, 6, 8)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("ratecv", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer 
argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + nchannels = _PyLong_AsInt(args[2]); + if (nchannels == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + inrate = _PyLong_AsInt(args[3]); + if (inrate == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[4])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + outrate = _PyLong_AsInt(args[4]); + if (outrate == -1 && PyErr_Occurred()) { + goto exit; + } + state = args[5]; + if (nargs < 7) { + goto skip_optional; + } + if (PyFloat_Check(args[6])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + weightA = _PyLong_AsInt(args[6]); + if (weightA == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 8) { + goto skip_optional; + } + if (PyFloat_Check(args[7])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + weightB = _PyLong_AsInt(args[7]); + if (weightB == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = audioop_ratecv_impl(module, &fragment, width, nchannels, inrate, outrate, state, weightA, weightB); exit: @@ -740,8 +1159,23 @@ audioop_lin2ulaw(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:lin2ulaw", - &fragment, &width)) { + if (!_PyArg_CheckPositional("lin2ulaw", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("lin2ulaw", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_lin2ulaw_impl(module, &fragment, width); @@ -774,8 +1208,23 @@ audioop_ulaw2lin(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:ulaw2lin", - &fragment, &width)) { + if (!_PyArg_CheckPositional("ulaw2lin", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("ulaw2lin", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_ulaw2lin_impl(module, &fragment, width); @@ -808,8 +1257,23 @@ audioop_lin2alaw(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:lin2alaw", - &fragment, &width)) { + if (!_PyArg_CheckPositional("lin2alaw", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("lin2alaw", 
1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_lin2alaw_impl(module, &fragment, width); @@ -842,8 +1306,23 @@ audioop_alaw2lin(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer fragment = {NULL, NULL}; int width; - if (!_PyArg_ParseStack(args, nargs, "y*i:alaw2lin", - &fragment, &width)) { + if (!_PyArg_CheckPositional("alaw2lin", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("alaw2lin", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } return_value = audioop_alaw2lin_impl(module, &fragment, width); @@ -878,10 +1357,26 @@ audioop_lin2adpcm(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int width; PyObject *state; - if (!_PyArg_ParseStack(args, nargs, "y*iO:lin2adpcm", - &fragment, &width, &state)) { + if (!_PyArg_CheckPositional("lin2adpcm", nargs, 3, 3)) { goto exit; } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("lin2adpcm", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { + goto exit; + } + state = args[2]; return_value = audioop_lin2adpcm_impl(module, &fragment, width, state); exit: @@ -914,10 +1409,26 @@ audioop_adpcm2lin(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int width; PyObject *state; - if (!_PyArg_ParseStack(args, nargs, "y*iO:adpcm2lin", - &fragment, &width, &state)) { + if (!_PyArg_CheckPositional("adpcm2lin", nargs, 3, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &fragment, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&fragment, 'C')) { + _PyArg_BadArgument("adpcm2lin", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + width = _PyLong_AsInt(args[1]); + if (width == -1 && PyErr_Occurred()) { goto exit; } + state = args[2]; return_value = audioop_adpcm2lin_impl(module, &fragment, width, state); exit: @@ -928,4 +1439,4 @@ audioop_adpcm2lin(PyObject *module, PyObject *const *args, Py_ssize_t nargs) return return_value; } -/*[clinic end generated code: output=d197b1559196a48a input=a9049054013a1b77]*/ +/*[clinic end generated code: output=2b173a25726252e9 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/binascii.c.h b/Modules/clinic/binascii.c.h index 91f261cb300c..3295833e87a9 100644 --- a/Modules/clinic/binascii.c.h +++ b/Modules/clinic/binascii.c.h @@ -189,7 +189,7 @@ binascii_rlecode_hqx(PyObject *module, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&data, 'C')) { - _PyArg_BadArgument("rlecode_hqx", "contiguous buffer", arg); + _PyArg_BadArgument("rlecode_hqx", 0, "contiguous buffer", arg); goto exit; } return_value = binascii_rlecode_hqx_impl(module, 
&data); @@ -225,7 +225,7 @@ binascii_b2a_hqx(PyObject *module, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&data, 'C')) { - _PyArg_BadArgument("b2a_hqx", "contiguous buffer", arg); + _PyArg_BadArgument("b2a_hqx", 0, "contiguous buffer", arg); goto exit; } return_value = binascii_b2a_hqx_impl(module, &data); @@ -261,7 +261,7 @@ binascii_rledecode_hqx(PyObject *module, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&data, 'C')) { - _PyArg_BadArgument("rledecode_hqx", "contiguous buffer", arg); + _PyArg_BadArgument("rledecode_hqx", 0, "contiguous buffer", arg); goto exit; } return_value = binascii_rledecode_hqx_impl(module, &data); @@ -295,8 +295,23 @@ binascii_crc_hqx(PyObject *module, PyObject *const *args, Py_ssize_t nargs) unsigned int crc; unsigned int _return_value; - if (!_PyArg_ParseStack(args, nargs, "y*I:crc_hqx", - &data, &crc)) { + if (!_PyArg_CheckPositional("crc_hqx", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("crc_hqx", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + crc = (unsigned int)PyLong_AsUnsignedLongMask(args[1]); + if (crc == (unsigned int)-1 && PyErr_Occurred()) { goto exit; } _return_value = binascii_crc_hqx_impl(module, &data, crc); @@ -334,10 +349,29 @@ binascii_crc32(PyObject *module, PyObject *const *args, Py_ssize_t nargs) unsigned int crc = 0; unsigned int _return_value; - if (!_PyArg_ParseStack(args, nargs, "y*|I:crc32", - &data, &crc)) { + if (!_PyArg_CheckPositional("crc32", nargs, 1, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("crc32", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + crc = (unsigned int)PyLong_AsUnsignedLongMask(args[1]); + if (crc == (unsigned int)-1 && PyErr_Occurred()) { goto exit; } +skip_optional: _return_value = binascii_crc32_impl(module, &data, crc); if ((_return_value == (unsigned int)-1) && PyErr_Occurred()) { goto exit; @@ -378,7 +412,7 @@ binascii_b2a_hex(PyObject *module, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&data, 'C')) { - _PyArg_BadArgument("b2a_hex", "contiguous buffer", arg); + _PyArg_BadArgument("b2a_hex", 0, "contiguous buffer", arg); goto exit; } return_value = binascii_b2a_hex_impl(module, &data); @@ -416,7 +450,7 @@ binascii_hexlify(PyObject *module, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&data, 'C')) { - _PyArg_BadArgument("hexlify", "contiguous buffer", arg); + _PyArg_BadArgument("hexlify", 0, "contiguous buffer", arg); goto exit; } return_value = binascii_hexlify_impl(module, &data); @@ -574,4 +608,4 @@ binascii_b2a_qp(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObj return return_value; } -/*[clinic end generated code: output=8ff0cb5717b15d1b input=a9049054013a1b77]*/ +/*[clinic end generated code: output=7210a01a718da4a0 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/cmathmodule.c.h b/Modules/clinic/cmathmodule.c.h index 30378c7e841b..cc1b8f2e21c0 100644 --- a/Modules/clinic/cmathmodule.c.h +++ b/Modules/clinic/cmathmodule.c.h @@ -668,10 +668,18 @@ cmath_log(PyObject *module, 
PyObject *const *args, Py_ssize_t nargs) Py_complex x; PyObject *y_obj = NULL; - if (!_PyArg_ParseStack(args, nargs, "D|O:log", - &x, &y_obj)) { + if (!_PyArg_CheckPositional("log", nargs, 1, 2)) { goto exit; } + x = PyComplex_AsCComplex(args[0]); + if (PyErr_Occurred()) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + y_obj = args[1]; +skip_optional: return_value = cmath_log_impl(module, x, y_obj); exit: @@ -755,8 +763,15 @@ cmath_rect(PyObject *module, PyObject *const *args, Py_ssize_t nargs) double r; double phi; - if (!_PyArg_ParseStack(args, nargs, "dd:rect", - &r, &phi)) { + if (!_PyArg_CheckPositional("rect", nargs, 2, 2)) { + goto exit; + } + r = PyFloat_AsDouble(args[0]); + if (PyErr_Occurred()) { + goto exit; + } + phi = PyFloat_AsDouble(args[1]); + if (PyErr_Occurred()) { goto exit; } return_value = cmath_rect_impl(module, r, phi); @@ -902,4 +917,4 @@ cmath_isclose(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObjec exit: return return_value; } -/*[clinic end generated code: output=50a105aa2bc5308f input=a9049054013a1b77]*/ +/*[clinic end generated code: output=86a365d23f34aaff input=a9049054013a1b77]*/ diff --git a/Modules/clinic/fcntlmodule.c.h b/Modules/clinic/fcntlmodule.c.h index 7b5a280045cc..024a44cfbf8b 100644 --- a/Modules/clinic/fcntlmodule.c.h +++ b/Modules/clinic/fcntlmodule.c.h @@ -32,10 +32,26 @@ fcntl_fcntl(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int code; PyObject *arg = NULL; - if (!_PyArg_ParseStack(args, nargs, "O&i|O:fcntl", - conv_descriptor, &fd, &code, &arg)) { + if (!_PyArg_CheckPositional("fcntl", nargs, 2, 3)) { goto exit; } + if (!conv_descriptor(args[0], &fd)) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + code = _PyLong_AsInt(args[1]); + if (code == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + arg = args[2]; +skip_optional: return_value = fcntl_fcntl_impl(module, fd, code, arg); exit: @@ -91,10 +107,33 @@ fcntl_ioctl(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *ob_arg = NULL; int mutate_arg = 1; - if (!_PyArg_ParseStack(args, nargs, "O&I|Op:ioctl", - conv_descriptor, &fd, &code, &ob_arg, &mutate_arg)) { + if (!_PyArg_CheckPositional("ioctl", nargs, 2, 4)) { + goto exit; + } + if (!conv_descriptor(args[0], &fd)) { goto exit; } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + code = (unsigned int)PyLong_AsUnsignedLongMask(args[1]); + if (code == (unsigned int)-1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + ob_arg = args[2]; + if (nargs < 4) { + goto skip_optional; + } + mutate_arg = PyObject_IsTrue(args[3]); + if (mutate_arg < 0) { + goto exit; + } +skip_optional: return_value = fcntl_ioctl_impl(module, fd, code, ob_arg, mutate_arg); exit: @@ -123,8 +162,19 @@ fcntl_flock(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int fd; int code; - if (!_PyArg_ParseStack(args, nargs, "O&i:flock", - conv_descriptor, &fd, &code)) { + if (!_PyArg_CheckPositional("flock", nargs, 2, 2)) { + goto exit; + } + if (!conv_descriptor(args[0], &fd)) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + code = _PyLong_AsInt(args[1]); + if (code == -1 && PyErr_Occurred()) { goto exit; } return_value = fcntl_flock_impl(module, fd, 
code); @@ -177,13 +227,45 @@ fcntl_lockf(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *startobj = NULL; int whence = 0; - if (!_PyArg_ParseStack(args, nargs, "O&i|OOi:lockf", - conv_descriptor, &fd, &code, &lenobj, &startobj, &whence)) { + if (!_PyArg_CheckPositional("lockf", nargs, 2, 5)) { + goto exit; + } + if (!conv_descriptor(args[0], &fd)) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + code = _PyLong_AsInt(args[1]); + if (code == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + lenobj = args[2]; + if (nargs < 4) { + goto skip_optional; + } + startobj = args[3]; + if (nargs < 5) { + goto skip_optional; + } + if (PyFloat_Check(args[4])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + whence = _PyLong_AsInt(args[4]); + if (whence == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = fcntl_lockf_impl(module, fd, code, lenobj, startobj, whence); exit: return return_value; } -/*[clinic end generated code: output=fc1a781750750a14 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=e912d25e28362c52 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/itertoolsmodule.c.h b/Modules/clinic/itertoolsmodule.c.h index b7d70a3492d3..9fc429aa3c36 100644 --- a/Modules/clinic/itertoolsmodule.c.h +++ b/Modules/clinic/itertoolsmodule.c.h @@ -52,10 +52,15 @@ itertools__grouper(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("_grouper", kwargs)) { goto exit; } - if (!PyArg_ParseTuple(args, "O!O:_grouper", - &groupby_type, &parent, &tgtkey)) { + if (!_PyArg_CheckPositional("_grouper", PyTuple_GET_SIZE(args), 2, 2)) { goto exit; } + if (!PyObject_TypeCheck(PyTuple_GET_ITEM(args, 0), &groupby_type)) { + _PyArg_BadArgument("_grouper", 1, (&groupby_type)->tp_name, PyTuple_GET_ITEM(args, 0)); + goto exit; + } + parent = PyTuple_GET_ITEM(args, 0); + tgtkey = PyTuple_GET_ITEM(args, 1); return_value = itertools__grouper_impl(type, parent, tgtkey); exit: @@ -84,10 +89,16 @@ itertools_teedataobject(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("teedataobject", kwargs)) { goto exit; } - if (!PyArg_ParseTuple(args, "OO!O:teedataobject", - &it, &PyList_Type, &values, &next)) { + if (!_PyArg_CheckPositional("teedataobject", PyTuple_GET_SIZE(args), 3, 3)) { + goto exit; + } + it = PyTuple_GET_ITEM(args, 0); + if (!PyList_Check(PyTuple_GET_ITEM(args, 1))) { + _PyArg_BadArgument("teedataobject", 2, "list", PyTuple_GET_ITEM(args, 1)); goto exit; } + values = PyTuple_GET_ITEM(args, 1); + next = PyTuple_GET_ITEM(args, 2); return_value = itertools_teedataobject_impl(type, it, values, next); exit: @@ -143,10 +154,31 @@ itertools_tee(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *iterable; Py_ssize_t n = 2; - if (!_PyArg_ParseStack(args, nargs, "O|n:tee", - &iterable, &n)) { + if (!_PyArg_CheckPositional("tee", nargs, 1, 2)) { goto exit; } + iterable = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[1]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + n = ival; + } +skip_optional: return_value = itertools_tee_impl(module, 
iterable, n); exit: @@ -510,4 +542,4 @@ itertools_count(PyTypeObject *type, PyObject *args, PyObject *kwargs) exit: return return_value; } -/*[clinic end generated code: output=916251f891fa84b9 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=f289354f54e04c13 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/mathmodule.c.h b/Modules/clinic/mathmodule.c.h index e436f5900c6b..116578968759 100644 --- a/Modules/clinic/mathmodule.c.h +++ b/Modules/clinic/mathmodule.c.h @@ -139,10 +139,14 @@ math_ldexp(PyObject *module, PyObject *const *args, Py_ssize_t nargs) double x; PyObject *i; - if (!_PyArg_ParseStack(args, nargs, "dO:ldexp", - &x, &i)) { + if (!_PyArg_CheckPositional("ldexp", nargs, 2, 2)) { goto exit; } + x = PyFloat_AsDouble(args[0]); + if (PyErr_Occurred()) { + goto exit; + } + i = args[1]; return_value = math_ldexp_impl(module, x, i); exit: @@ -261,8 +265,15 @@ math_fmod(PyObject *module, PyObject *const *args, Py_ssize_t nargs) double x; double y; - if (!_PyArg_ParseStack(args, nargs, "dd:fmod", - &x, &y)) { + if (!_PyArg_CheckPositional("fmod", nargs, 2, 2)) { + goto exit; + } + x = PyFloat_AsDouble(args[0]); + if (PyErr_Occurred()) { + goto exit; + } + y = PyFloat_AsDouble(args[1]); + if (PyErr_Occurred()) { goto exit; } return_value = math_fmod_impl(module, x, y); @@ -326,8 +337,15 @@ math_pow(PyObject *module, PyObject *const *args, Py_ssize_t nargs) double x; double y; - if (!_PyArg_ParseStack(args, nargs, "dd:pow", - &x, &y)) { + if (!_PyArg_CheckPositional("pow", nargs, 2, 2)) { + goto exit; + } + x = PyFloat_AsDouble(args[0]); + if (PyErr_Occurred()) { + goto exit; + } + y = PyFloat_AsDouble(args[1]); + if (PyErr_Occurred()) { goto exit; } return_value = math_pow_impl(module, x, y); @@ -530,4 +548,4 @@ math_isclose(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject exit: return return_value; } -/*[clinic end generated code: output=da4b9940a5cb0188 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=2fe4fecd85585313 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/posixmodule.c.h b/Modules/clinic/posixmodule.c.h index e310d99cf468..2c1ee97faf71 100644 --- a/Modules/clinic/posixmodule.c.h +++ b/Modules/clinic/posixmodule.c.h @@ -1658,10 +1658,13 @@ os_execv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) path_t path = PATH_T_INITIALIZE("execv", "path", 0, 0); PyObject *argv; - if (!_PyArg_ParseStack(args, nargs, "O&O:execv", - path_converter, &path, &argv)) { + if (!_PyArg_CheckPositional("execv", nargs, 2, 2)) { goto exit; } + if (!path_converter(args[0], &path)) { + goto exit; + } + argv = args[1]; return_value = os_execv_impl(module, &path, argv); exit: @@ -1817,10 +1820,22 @@ os_spawnv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) path_t path = PATH_T_INITIALIZE("spawnv", "path", 0, 0); PyObject *argv; - if (!_PyArg_ParseStack(args, nargs, "iO&O:spawnv", - &mode, path_converter, &path, &argv)) { + if (!_PyArg_CheckPositional("spawnv", nargs, 3, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + mode = _PyLong_AsInt(args[0]); + if (mode == -1 && PyErr_Occurred()) { + goto exit; + } + if (!path_converter(args[1], &path)) { goto exit; } + argv = args[2]; return_value = os_spawnv_impl(module, mode, &path, argv); exit: @@ -1865,10 +1880,23 @@ os_spawnve(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *argv; PyObject *env; - if (!_PyArg_ParseStack(args, nargs, "iO&OO:spawnve", - 
&mode, path_converter, &path, &argv, &env)) { + if (!_PyArg_CheckPositional("spawnve", nargs, 4, 4)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + mode = _PyLong_AsInt(args[0]); + if (mode == -1 && PyErr_Occurred()) { + goto exit; + } + if (!path_converter(args[1], &path)) { + goto exit; + } + argv = args[2]; + env = args[3]; return_value = os_spawnve_impl(module, mode, &path, argv, env); exit: @@ -2874,8 +2902,13 @@ os_setreuid(PyObject *module, PyObject *const *args, Py_ssize_t nargs) uid_t ruid; uid_t euid; - if (!_PyArg_ParseStack(args, nargs, "O&O&:setreuid", - _Py_Uid_Converter, &ruid, _Py_Uid_Converter, &euid)) { + if (!_PyArg_CheckPositional("setreuid", nargs, 2, 2)) { + goto exit; + } + if (!_Py_Uid_Converter(args[0], &ruid)) { + goto exit; + } + if (!_Py_Uid_Converter(args[1], &euid)) { goto exit; } return_value = os_setreuid_impl(module, ruid, euid); @@ -2907,8 +2940,13 @@ os_setregid(PyObject *module, PyObject *const *args, Py_ssize_t nargs) gid_t rgid; gid_t egid; - if (!_PyArg_ParseStack(args, nargs, "O&O&:setregid", - _Py_Gid_Converter, &rgid, _Py_Gid_Converter, &egid)) { + if (!_PyArg_CheckPositional("setregid", nargs, 2, 2)) { + goto exit; + } + if (!_Py_Gid_Converter(args[0], &rgid)) { + goto exit; + } + if (!_Py_Gid_Converter(args[1], &egid)) { goto exit; } return_value = os_setregid_impl(module, rgid, egid); @@ -3558,8 +3596,25 @@ os_closerange(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int fd_low; int fd_high; - if (!_PyArg_ParseStack(args, nargs, "ii:closerange", - &fd_low, &fd_high)) { + if (!_PyArg_CheckPositional("closerange", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd_low = _PyLong_AsInt(args[0]); + if (fd_low == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd_high = _PyLong_AsInt(args[1]); + if (fd_high == -1 && PyErr_Occurred()) { goto exit; } return_value = os_closerange_impl(module, fd_low, fd_high); @@ -3672,8 +3727,28 @@ os_lockf(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int command; Py_off_t length; - if (!_PyArg_ParseStack(args, nargs, "iiO&:lockf", - &fd, &command, Py_off_t_converter, &length)) { + if (!_PyArg_CheckPositional("lockf", nargs, 3, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + command = _PyLong_AsInt(args[1]); + if (command == -1 && PyErr_Occurred()) { + goto exit; + } + if (!Py_off_t_converter(args[2], &length)) { goto exit; } return_value = os_lockf_impl(module, fd, command, length); @@ -3708,8 +3783,28 @@ os_lseek(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int how; Py_off_t _return_value; - if (!_PyArg_ParseStack(args, nargs, "iO&i:lseek", - &fd, Py_off_t_converter, &position, &how)) { + if (!_PyArg_CheckPositional("lseek", nargs, 3, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && 
PyErr_Occurred()) { + goto exit; + } + if (!Py_off_t_converter(args[1], &position)) { + goto exit; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + how = _PyLong_AsInt(args[2]); + if (how == -1 && PyErr_Occurred()) { goto exit; } _return_value = os_lseek_impl(module, fd, position, how); @@ -3741,10 +3836,35 @@ os_read(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int fd; Py_ssize_t length; - if (!_PyArg_ParseStack(args, nargs, "in:read", - &fd, &length)) { + if (!_PyArg_CheckPositional("read", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[1]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + length = ival; + } return_value = os_read_impl(module, fd, length); exit: @@ -3781,10 +3901,19 @@ os_readv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *buffers; Py_ssize_t _return_value; - if (!_PyArg_ParseStack(args, nargs, "iO:readv", - &fd, &buffers)) { + if (!_PyArg_CheckPositional("readv", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { goto exit; } + buffers = args[1]; _return_value = os_readv_impl(module, fd, buffers); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -3822,8 +3951,28 @@ os_pread(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int length; Py_off_t offset; - if (!_PyArg_ParseStack(args, nargs, "iiO&:pread", - &fd, &length, Py_off_t_converter, &offset)) { + if (!_PyArg_CheckPositional("pread", nargs, 3, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + length = _PyLong_AsInt(args[1]); + if (length == -1 && PyErr_Occurred()) { + goto exit; + } + if (!Py_off_t_converter(args[2], &offset)) { goto exit; } return_value = os_pread_impl(module, fd, length, offset); @@ -3873,10 +4022,35 @@ os_preadv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int flags = 0; Py_ssize_t _return_value; - if (!_PyArg_ParseStack(args, nargs, "iOO&|i:preadv", - &fd, &buffers, Py_off_t_converter, &offset, &flags)) { + if (!_PyArg_CheckPositional("preadv", nargs, 3, 4)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + buffers = args[1]; + if (!Py_off_t_converter(args[2], &offset)) { goto exit; } + if (nargs < 4) { + goto skip_optional; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flags = _PyLong_AsInt(args[3]); + if (flags == 
-1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: _return_value = os_preadv_impl(module, fd, buffers, offset, flags); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -3909,8 +4083,23 @@ os_write(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer data = {NULL, NULL}; Py_ssize_t _return_value; - if (!_PyArg_ParseStack(args, nargs, "iy*:write", - &fd, &data)) { + if (!_PyArg_CheckPositional("write", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyObject_GetBuffer(args[1], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("write", 2, "contiguous buffer", args[1]); goto exit; } _return_value = os_write_impl(module, fd, &data); @@ -3950,8 +4139,34 @@ os__fcopyfile(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int outfd; int flags; - if (!_PyArg_ParseStack(args, nargs, "iii:_fcopyfile", - &infd, &outfd, &flags)) { + if (!_PyArg_CheckPositional("_fcopyfile", nargs, 3, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + infd = _PyLong_AsInt(args[0]); + if (infd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + outfd = _PyLong_AsInt(args[1]); + if (outfd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flags = _PyLong_AsInt(args[2]); + if (flags == -1 && PyErr_Occurred()) { goto exit; } return_value = os__fcopyfile_impl(module, infd, outfd, flags); @@ -4129,10 +4344,19 @@ os_writev(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *buffers; Py_ssize_t _return_value; - if (!_PyArg_ParseStack(args, nargs, "iO:writev", - &fd, &buffers)) { + if (!_PyArg_CheckPositional("writev", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + buffers = args[1]; _return_value = os_writev_impl(module, fd, buffers); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -4172,8 +4396,26 @@ os_pwrite(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_off_t offset; Py_ssize_t _return_value; - if (!_PyArg_ParseStack(args, nargs, "iy*O&:pwrite", - &fd, &buffer, Py_off_t_converter, &offset)) { + if (!_PyArg_CheckPositional("pwrite", nargs, 3, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyObject_GetBuffer(args[1], &buffer, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&buffer, 'C')) { + _PyArg_BadArgument("pwrite", 2, "contiguous buffer", args[1]); + goto exit; + } + if (!Py_off_t_converter(args[2], &offset)) { goto exit; } _return_value = os_pwrite_impl(module, fd, &buffer, offset); @@ -4232,10 +4474,35 @@ os_pwritev(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int flags = 0; Py_ssize_t _return_value; - 
if (!_PyArg_ParseStack(args, nargs, "iOO&|i:pwritev", - &fd, &buffers, Py_off_t_converter, &offset, &flags)) { + if (!_PyArg_CheckPositional("pwritev", nargs, 3, 4)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + buffers = args[1]; + if (!Py_off_t_converter(args[2], &offset)) { + goto exit; + } + if (nargs < 4) { + goto skip_optional; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flags = _PyLong_AsInt(args[3]); + if (flags == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: _return_value = os_pwritev_impl(module, fd, buffers, offset, flags); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -4439,8 +4706,25 @@ os_makedev(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int minor; dev_t _return_value; - if (!_PyArg_ParseStack(args, nargs, "ii:makedev", - &major, &minor)) { + if (!_PyArg_CheckPositional("makedev", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + major = _PyLong_AsInt(args[0]); + if (major == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + minor = _PyLong_AsInt(args[1]); + if (minor == -1 && PyErr_Occurred()) { goto exit; } _return_value = os_makedev_impl(module, major, minor); @@ -4476,8 +4760,19 @@ os_ftruncate(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int fd; Py_off_t length; - if (!_PyArg_ParseStack(args, nargs, "iO&:ftruncate", - &fd, Py_off_t_converter, &length)) { + if (!_PyArg_CheckPositional("ftruncate", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (!Py_off_t_converter(args[1], &length)) { goto exit; } return_value = os_ftruncate_impl(module, fd, length); @@ -4555,8 +4850,22 @@ os_posix_fallocate(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_off_t offset; Py_off_t length; - if (!_PyArg_ParseStack(args, nargs, "iO&O&:posix_fallocate", - &fd, Py_off_t_converter, &offset, Py_off_t_converter, &length)) { + if (!_PyArg_CheckPositional("posix_fallocate", nargs, 3, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (!Py_off_t_converter(args[1], &offset)) { + goto exit; + } + if (!Py_off_t_converter(args[2], &length)) { goto exit; } return_value = os_posix_fallocate_impl(module, fd, offset, length); @@ -4599,8 +4908,31 @@ os_posix_fadvise(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_off_t length; int advice; - if (!_PyArg_ParseStack(args, nargs, "iO&O&i:posix_fadvise", - &fd, Py_off_t_converter, &offset, Py_off_t_converter, &length, &advice)) { + if (!_PyArg_CheckPositional("posix_fadvise", nargs, 4, 4)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && 
PyErr_Occurred()) { + goto exit; + } + if (!Py_off_t_converter(args[1], &offset)) { + goto exit; + } + if (!Py_off_t_converter(args[2], &length)) { + goto exit; + } + if (PyFloat_Check(args[3])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + advice = _PyLong_AsInt(args[3]); + if (advice == -1 && PyErr_Occurred()) { goto exit; } return_value = os_posix_fadvise_impl(module, fd, offset, length, advice); @@ -4632,10 +4964,25 @@ os_putenv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *name; PyObject *value; - if (!_PyArg_ParseStack(args, nargs, "UU:putenv", - &name, &value)) { + if (!_PyArg_CheckPositional("putenv", nargs, 2, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("putenv", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + name = args[0]; + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("putenv", 2, "str", args[1]); + goto exit; + } + if (PyUnicode_READY(args[1]) == -1) { goto exit; } + value = args[1]; return_value = os_putenv_impl(module, name, value); exit: @@ -4665,8 +5012,13 @@ os_putenv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *name = NULL; PyObject *value = NULL; - if (!_PyArg_ParseStack(args, nargs, "O&O&:putenv", - PyUnicode_FSConverter, &name, PyUnicode_FSConverter, &value)) { + if (!_PyArg_CheckPositional("putenv", nargs, 2, 2)) { + goto exit; + } + if (!PyUnicode_FSConverter(args[0], &name)) { + goto exit; + } + if (!PyUnicode_FSConverter(args[1], &value)) { goto exit; } return_value = os_putenv_impl(module, name, value); @@ -5208,8 +5560,19 @@ os_fpathconf(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int name; long _return_value; - if (!_PyArg_ParseStack(args, nargs, "iO&:fpathconf", - &fd, conv_path_confname, &name)) { + if (!_PyArg_CheckPositional("fpathconf", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (!conv_path_confname(args[1], &name)) { goto exit; } _return_value = os_fpathconf_impl(module, fd, name); @@ -5496,8 +5859,16 @@ os_setresuid(PyObject *module, PyObject *const *args, Py_ssize_t nargs) uid_t euid; uid_t suid; - if (!_PyArg_ParseStack(args, nargs, "O&O&O&:setresuid", - _Py_Uid_Converter, &ruid, _Py_Uid_Converter, &euid, _Py_Uid_Converter, &suid)) { + if (!_PyArg_CheckPositional("setresuid", nargs, 3, 3)) { + goto exit; + } + if (!_Py_Uid_Converter(args[0], &ruid)) { + goto exit; + } + if (!_Py_Uid_Converter(args[1], &euid)) { + goto exit; + } + if (!_Py_Uid_Converter(args[2], &suid)) { goto exit; } return_value = os_setresuid_impl(module, ruid, euid, suid); @@ -5530,8 +5901,16 @@ os_setresgid(PyObject *module, PyObject *const *args, Py_ssize_t nargs) gid_t egid; gid_t sgid; - if (!_PyArg_ParseStack(args, nargs, "O&O&O&:setresgid", - _Py_Gid_Converter, &rgid, _Py_Gid_Converter, &egid, _Py_Gid_Converter, &sgid)) { + if (!_PyArg_CheckPositional("setresgid", nargs, 3, 3)) { + goto exit; + } + if (!_Py_Gid_Converter(args[0], &rgid)) { + goto exit; + } + if (!_Py_Gid_Converter(args[1], &egid)) { + goto exit; + } + if (!_Py_Gid_Converter(args[2], &sgid)) { goto exit; } return_value = os_setresgid_impl(module, rgid, egid, sgid); @@ -5898,8 +6277,25 @@ os_set_inheritable(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int fd; int inheritable; - if 
(!_PyArg_ParseStack(args, nargs, "ii:set_inheritable", - &fd, &inheritable)) { + if (!_PyArg_CheckPositional("set_inheritable", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + inheritable = _PyLong_AsInt(args[1]); + if (inheritable == -1 && PyErr_Occurred()) { goto exit; } return_value = os_set_inheritable_impl(module, fd, inheritable); @@ -6046,8 +6442,25 @@ os_set_blocking(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int fd; int blocking; - if (!_PyArg_ParseStack(args, nargs, "ii:set_blocking", - &fd, &blocking)) { + if (!_PyArg_CheckPositional("set_blocking", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + blocking = _PyLong_AsInt(args[1]); + if (blocking == -1 && PyErr_Occurred()) { goto exit; } return_value = os_set_blocking_impl(module, fd, blocking); @@ -6845,4 +7258,4 @@ os_getrandom(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject #ifndef OS_GETRANDOM_METHODDEF #define OS_GETRANDOM_METHODDEF #endif /* !defined(OS_GETRANDOM_METHODDEF) */ -/*[clinic end generated code: output=b02036b2a269b1db input=a9049054013a1b77]*/ +/*[clinic end generated code: output=febc1e16c9024e40 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/pwdmodule.c.h b/Modules/clinic/pwdmodule.c.h index 9270be0f9bf8..cf84ec959d58 100644 --- a/Modules/clinic/pwdmodule.c.h +++ b/Modules/clinic/pwdmodule.c.h @@ -34,7 +34,7 @@ pwd_getpwnam(PyObject *module, PyObject *arg) PyObject *name; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("getpwnam", "str", arg); + _PyArg_BadArgument("getpwnam", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -74,4 +74,4 @@ pwd_getpwall(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef PWD_GETPWALL_METHODDEF #define PWD_GETPWALL_METHODDEF #endif /* !defined(PWD_GETPWALL_METHODDEF) */ -/*[clinic end generated code: output=9e86e23d6ad9cd08 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=f9412bdedc69706c input=a9049054013a1b77]*/ diff --git a/Modules/clinic/pyexpat.c.h b/Modules/clinic/pyexpat.c.h index 1545485e9c5b..3fe7b64e6f39 100644 --- a/Modules/clinic/pyexpat.c.h +++ b/Modules/clinic/pyexpat.c.h @@ -24,10 +24,23 @@ pyexpat_xmlparser_Parse(xmlparseobject *self, PyObject *const *args, Py_ssize_t PyObject *data; int isfinal = 0; - if (!_PyArg_ParseStack(args, nargs, "O|i:Parse", - &data, &isfinal)) { + if (!_PyArg_CheckPositional("Parse", nargs, 1, 2)) { goto exit; } + data = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + isfinal = _PyLong_AsInt(args[1]); + if (isfinal == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = pyexpat_xmlparser_Parse_impl(self, data, isfinal); exit: @@ -62,7 +75,7 @@ pyexpat_xmlparser_SetBase(xmlparseobject *self, PyObject *arg) const char *base; if (!PyUnicode_Check(arg)) { - 
_PyArg_BadArgument("SetBase", "str", arg); + _PyArg_BadArgument("SetBase", 0, "str", arg); goto exit; } Py_ssize_t base_length; @@ -140,10 +153,44 @@ pyexpat_xmlparser_ExternalEntityParserCreate(xmlparseobject *self, PyObject *con const char *context; const char *encoding = NULL; - if (!_PyArg_ParseStack(args, nargs, "z|s:ExternalEntityParserCreate", - &context, &encoding)) { + if (!_PyArg_CheckPositional("ExternalEntityParserCreate", nargs, 1, 2)) { + goto exit; + } + if (args[0] == Py_None) { + context = NULL; + } + else if (PyUnicode_Check(args[0])) { + Py_ssize_t context_length; + context = PyUnicode_AsUTF8AndSize(args[0], &context_length); + if (context == NULL) { + goto exit; + } + if (strlen(context) != (size_t)context_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + } + else { + _PyArg_BadArgument("ExternalEntityParserCreate", 1, "str or None", args[0]); goto exit; } + if (nargs < 2) { + goto skip_optional; + } + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("ExternalEntityParserCreate", 2, "str", args[1]); + goto exit; + } + Py_ssize_t encoding_length; + encoding = PyUnicode_AsUTF8AndSize(args[1], &encoding_length); + if (encoding == NULL) { + goto exit; + } + if (strlen(encoding) != (size_t)encoding_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } +skip_optional: return_value = pyexpat_xmlparser_ExternalEntityParserCreate_impl(self, context, encoding); exit: @@ -212,10 +259,17 @@ pyexpat_xmlparser_UseForeignDTD(xmlparseobject *self, PyObject *const *args, Py_ PyObject *return_value = NULL; int flag = 1; - if (!_PyArg_ParseStack(args, nargs, "|p:UseForeignDTD", - &flag)) { + if (!_PyArg_CheckPositional("UseForeignDTD", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + flag = PyObject_IsTrue(args[0]); + if (flag < 0) { goto exit; } +skip_optional: return_value = pyexpat_xmlparser_UseForeignDTD_impl(self, flag); exit: @@ -294,4 +348,4 @@ pyexpat_ErrorString(PyObject *module, PyObject *arg) #ifndef PYEXPAT_XMLPARSER_USEFOREIGNDTD_METHODDEF #define PYEXPAT_XMLPARSER_USEFOREIGNDTD_METHODDEF #endif /* !defined(PYEXPAT_XMLPARSER_USEFOREIGNDTD_METHODDEF) */ -/*[clinic end generated code: output=d3750256eb0da1cb input=a9049054013a1b77]*/ +/*[clinic end generated code: output=0f18b756d82b78a5 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/resource.c.h b/Modules/clinic/resource.c.h index 0a66f8fd66fb..80efb714bb6b 100644 --- a/Modules/clinic/resource.c.h +++ b/Modules/clinic/resource.c.h @@ -84,10 +84,19 @@ resource_setrlimit(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int resource; PyObject *limits; - if (!_PyArg_ParseStack(args, nargs, "iO:setrlimit", - &resource, &limits)) { + if (!_PyArg_CheckPositional("setrlimit", nargs, 2, 2)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + resource = _PyLong_AsInt(args[0]); + if (resource == -1 && PyErr_Occurred()) { + goto exit; + } + limits = args[1]; return_value = resource_setrlimit_impl(module, resource, limits); exit: @@ -169,4 +178,4 @@ resource_getpagesize(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef RESOURCE_PRLIMIT_METHODDEF #define RESOURCE_PRLIMIT_METHODDEF #endif /* !defined(RESOURCE_PRLIMIT_METHODDEF) */ -/*[clinic end generated code: output=b16a9149639081fd input=a9049054013a1b77]*/ +/*[clinic end generated code: output=ef3034f291156a34 input=a9049054013a1b77]*/ diff --git 
a/Modules/clinic/selectmodule.c.h b/Modules/clinic/selectmodule.c.h index bb69d957c697..655e24c3533f 100644 --- a/Modules/clinic/selectmodule.c.h +++ b/Modules/clinic/selectmodule.c.h @@ -82,10 +82,19 @@ select_poll_register(pollObject *self, PyObject *const *args, Py_ssize_t nargs) int fd; unsigned short eventmask = POLLIN | POLLPRI | POLLOUT; - if (!_PyArg_ParseStack(args, nargs, "O&|O&:register", - fildes_converter, &fd, _PyLong_UnsignedShort_Converter, &eventmask)) { + if (!_PyArg_CheckPositional("register", nargs, 1, 2)) { goto exit; } + if (!fildes_converter(args[0], &fd)) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (!_PyLong_UnsignedShort_Converter(args[1], &eventmask)) { + goto exit; + } +skip_optional: return_value = select_poll_register_impl(self, fd, eventmask); exit: @@ -121,8 +130,13 @@ select_poll_modify(pollObject *self, PyObject *const *args, Py_ssize_t nargs) int fd; unsigned short eventmask; - if (!_PyArg_ParseStack(args, nargs, "O&O&:modify", - fildes_converter, &fd, _PyLong_UnsignedShort_Converter, &eventmask)) { + if (!_PyArg_CheckPositional("modify", nargs, 2, 2)) { + goto exit; + } + if (!fildes_converter(args[0], &fd)) { + goto exit; + } + if (!_PyLong_UnsignedShort_Converter(args[1], &eventmask)) { goto exit; } return_value = select_poll_modify_impl(self, fd, eventmask); @@ -228,10 +242,19 @@ select_devpoll_register(devpollObject *self, PyObject *const *args, Py_ssize_t n int fd; unsigned short eventmask = POLLIN | POLLPRI | POLLOUT; - if (!_PyArg_ParseStack(args, nargs, "O&|O&:register", - fildes_converter, &fd, _PyLong_UnsignedShort_Converter, &eventmask)) { + if (!_PyArg_CheckPositional("register", nargs, 1, 2)) { + goto exit; + } + if (!fildes_converter(args[0], &fd)) { + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (!_PyLong_UnsignedShort_Converter(args[1], &eventmask)) { goto exit; } +skip_optional: return_value = select_devpoll_register_impl(self, fd, eventmask); exit: @@ -268,10 +291,19 @@ select_devpoll_modify(devpollObject *self, PyObject *const *args, Py_ssize_t nar int fd; unsigned short eventmask = POLLIN | POLLPRI | POLLOUT; - if (!_PyArg_ParseStack(args, nargs, "O&|O&:modify", - fildes_converter, &fd, _PyLong_UnsignedShort_Converter, &eventmask)) { + if (!_PyArg_CheckPositional("modify", nargs, 1, 2)) { + goto exit; + } + if (!fildes_converter(args[0], &fd)) { goto exit; } + if (nargs < 2) { + goto skip_optional; + } + if (!_PyLong_UnsignedShort_Converter(args[1], &eventmask)) { + goto exit; + } +skip_optional: return_value = select_devpoll_modify_impl(self, fd, eventmask); exit: @@ -948,10 +980,24 @@ select_kqueue_control(kqueue_queue_Object *self, PyObject *const *args, Py_ssize int maxevents; PyObject *otimeout = Py_None; - if (!_PyArg_ParseStack(args, nargs, "Oi|O:control", - &changelist, &maxevents, &otimeout)) { + if (!_PyArg_CheckPositional("control", nargs, 2, 3)) { goto exit; } + changelist = args[0]; + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + maxevents = _PyLong_AsInt(args[1]); + if (maxevents == -1 && PyErr_Occurred()) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + otimeout = args[2]; +skip_optional: return_value = select_kqueue_control_impl(self, changelist, maxevents, otimeout); exit: @@ -1059,4 +1105,4 @@ select_kqueue_control(kqueue_queue_Object *self, PyObject *const *args, Py_ssize #ifndef SELECT_KQUEUE_CONTROL_METHODDEF #define SELECT_KQUEUE_CONTROL_METHODDEF #endif /* 
!defined(SELECT_KQUEUE_CONTROL_METHODDEF) */ -/*[clinic end generated code: output=122a49f131cdd9d9 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=20da8f9c050e1b65 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/signalmodule.c.h b/Modules/clinic/signalmodule.c.h index f3742262d867..bc46515cb436 100644 --- a/Modules/clinic/signalmodule.c.h +++ b/Modules/clinic/signalmodule.c.h @@ -125,10 +125,19 @@ signal_signal(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int signalnum; PyObject *handler; - if (!_PyArg_ParseStack(args, nargs, "iO:signal", - &signalnum, &handler)) { + if (!_PyArg_CheckPositional("signal", nargs, 2, 2)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + signalnum = _PyLong_AsInt(args[0]); + if (signalnum == -1 && PyErr_Occurred()) { + goto exit; + } + handler = args[1]; return_value = signal_signal_impl(module, signalnum, handler); exit: @@ -234,8 +243,25 @@ signal_siginterrupt(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int signalnum; int flag; - if (!_PyArg_ParseStack(args, nargs, "ii:siginterrupt", - &signalnum, &flag)) { + if (!_PyArg_CheckPositional("siginterrupt", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + signalnum = _PyLong_AsInt(args[0]); + if (signalnum == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flag = _PyLong_AsInt(args[1]); + if (flag == -1 && PyErr_Occurred()) { goto exit; } return_value = signal_siginterrupt_impl(module, signalnum, flag); @@ -274,10 +300,24 @@ signal_setitimer(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *seconds; PyObject *interval = NULL; - if (!_PyArg_ParseStack(args, nargs, "iO|O:setitimer", - &which, &seconds, &interval)) { + if (!_PyArg_CheckPositional("setitimer", nargs, 2, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + which = _PyLong_AsInt(args[0]); + if (which == -1 && PyErr_Occurred()) { + goto exit; + } + seconds = args[1]; + if (nargs < 3) { + goto skip_optional; + } + interval = args[2]; +skip_optional: return_value = signal_setitimer_impl(module, which, seconds, interval); exit: @@ -344,8 +384,19 @@ signal_pthread_sigmask(PyObject *module, PyObject *const *args, Py_ssize_t nargs int how; sigset_t mask; - if (!_PyArg_ParseStack(args, nargs, "iO&:pthread_sigmask", - &how, _Py_Sigset_Converter, &mask)) { + if (!_PyArg_CheckPositional("pthread_sigmask", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + how = _PyLong_AsInt(args[0]); + if (how == -1 && PyErr_Occurred()) { + goto exit; + } + if (!_Py_Sigset_Converter(args[1], &mask)) { goto exit; } return_value = signal_pthread_sigmask_impl(module, how, mask); @@ -498,10 +549,13 @@ signal_sigtimedwait(PyObject *module, PyObject *const *args, Py_ssize_t nargs) sigset_t sigset; PyObject *timeout_obj; - if (!_PyArg_ParseStack(args, nargs, "O&O:sigtimedwait", - _Py_Sigset_Converter, &sigset, &timeout_obj)) { + if (!_PyArg_CheckPositional("sigtimedwait", nargs, 2, 2)) { goto exit; } + if (!_Py_Sigset_Converter(args[0], &sigset)) { + goto exit; + } + timeout_obj = 
args[1]; return_value = signal_sigtimedwait_impl(module, sigset, timeout_obj); exit: @@ -532,8 +586,21 @@ signal_pthread_kill(PyObject *module, PyObject *const *args, Py_ssize_t nargs) unsigned long thread_id; int signalnum; - if (!_PyArg_ParseStack(args, nargs, "ki:pthread_kill", - &thread_id, &signalnum)) { + if (!_PyArg_CheckPositional("pthread_kill", nargs, 2, 2)) { + goto exit; + } + if (!PyLong_Check(args[0])) { + _PyArg_BadArgument("pthread_kill", 1, "int", args[0]); + goto exit; + } + thread_id = PyLong_AsUnsignedLongMask(args[0]); + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + signalnum = _PyLong_AsInt(args[1]); + if (signalnum == -1 && PyErr_Occurred()) { goto exit; } return_value = signal_pthread_kill_impl(module, thread_id, signalnum); @@ -591,4 +658,4 @@ signal_pthread_kill(PyObject *module, PyObject *const *args, Py_ssize_t nargs) #ifndef SIGNAL_PTHREAD_KILL_METHODDEF #define SIGNAL_PTHREAD_KILL_METHODDEF #endif /* !defined(SIGNAL_PTHREAD_KILL_METHODDEF) */ -/*[clinic end generated code: output=365db4e807c26d4e input=a9049054013a1b77]*/ +/*[clinic end generated code: output=f0d3a5703581da76 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/spwdmodule.c.h b/Modules/clinic/spwdmodule.c.h index a0a3e2e5ef8a..e051e6eb6589 100644 --- a/Modules/clinic/spwdmodule.c.h +++ b/Modules/clinic/spwdmodule.c.h @@ -25,7 +25,7 @@ spwd_getspnam(PyObject *module, PyObject *arg_) PyObject *arg; if (!PyUnicode_Check(arg_)) { - _PyArg_BadArgument("getspnam", "str", arg_); + _PyArg_BadArgument("getspnam", 0, "str", arg_); goto exit; } if (PyUnicode_READY(arg_) == -1) { @@ -71,4 +71,4 @@ spwd_getspall(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef SPWD_GETSPALL_METHODDEF #define SPWD_GETSPALL_METHODDEF #endif /* !defined(SPWD_GETSPALL_METHODDEF) */ -/*[clinic end generated code: output=44a7c196d4b48f4e input=a9049054013a1b77]*/ +/*[clinic end generated code: output=2bbaa6bab1d9116e input=a9049054013a1b77]*/ diff --git a/Modules/clinic/symtablemodule.c.h b/Modules/clinic/symtablemodule.c.h index b5e64e0313f8..73e340bd462a 100644 --- a/Modules/clinic/symtablemodule.c.h +++ b/Modules/clinic/symtablemodule.c.h @@ -23,8 +23,36 @@ _symtable_symtable(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *filename; const char *startstr; - if (!_PyArg_ParseStack(args, nargs, "sO&s:symtable", - &str, PyUnicode_FSDecoder, &filename, &startstr)) { + if (!_PyArg_CheckPositional("symtable", nargs, 3, 3)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("symtable", 1, "str", args[0]); + goto exit; + } + Py_ssize_t str_length; + str = PyUnicode_AsUTF8AndSize(args[0], &str_length); + if (str == NULL) { + goto exit; + } + if (strlen(str) != (size_t)str_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + if (!PyUnicode_FSDecoder(args[1], &filename)) { + goto exit; + } + if (!PyUnicode_Check(args[2])) { + _PyArg_BadArgument("symtable", 3, "str", args[2]); + goto exit; + } + Py_ssize_t startstr_length; + startstr = PyUnicode_AsUTF8AndSize(args[2], &startstr_length); + if (startstr == NULL) { + goto exit; + } + if (strlen(startstr) != (size_t)startstr_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); goto exit; } return_value = _symtable_symtable_impl(module, str, filename, startstr); @@ -32,4 +60,4 @@ _symtable_symtable(PyObject *module, PyObject *const *args, Py_ssize_t nargs) exit: return return_value; } -/*[clinic end 
generated code: output=52ece07dd0e7a113 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=be1cca59de019984 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/unicodedata.c.h b/Modules/clinic/unicodedata.c.h index 9e8d261bda55..8ca0881efcf8 100644 --- a/Modules/clinic/unicodedata.c.h +++ b/Modules/clinic/unicodedata.c.h @@ -26,10 +26,26 @@ unicodedata_UCD_decimal(PyObject *self, PyObject *const *args, Py_ssize_t nargs) int chr; PyObject *default_value = NULL; - if (!_PyArg_ParseStack(args, nargs, "C|O:decimal", - &chr, &default_value)) { + if (!_PyArg_CheckPositional("decimal", nargs, 1, 2)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("decimal", 1, "a unicode character", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0])) { + goto exit; + } + if (PyUnicode_GET_LENGTH(args[0]) != 1) { + _PyArg_BadArgument("decimal", 1, "a unicode character", args[0]); + goto exit; + } + chr = PyUnicode_READ_CHAR(args[0], 0); + if (nargs < 2) { + goto skip_optional; + } + default_value = args[1]; +skip_optional: return_value = unicodedata_UCD_decimal_impl(self, chr, default_value); exit: @@ -59,10 +75,26 @@ unicodedata_UCD_digit(PyObject *self, PyObject *const *args, Py_ssize_t nargs) int chr; PyObject *default_value = NULL; - if (!_PyArg_ParseStack(args, nargs, "C|O:digit", - &chr, &default_value)) { + if (!_PyArg_CheckPositional("digit", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("digit", 1, "a unicode character", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0])) { goto exit; } + if (PyUnicode_GET_LENGTH(args[0]) != 1) { + _PyArg_BadArgument("digit", 1, "a unicode character", args[0]); + goto exit; + } + chr = PyUnicode_READ_CHAR(args[0], 0); + if (nargs < 2) { + goto skip_optional; + } + default_value = args[1]; +skip_optional: return_value = unicodedata_UCD_digit_impl(self, chr, default_value); exit: @@ -93,10 +125,26 @@ unicodedata_UCD_numeric(PyObject *self, PyObject *const *args, Py_ssize_t nargs) int chr; PyObject *default_value = NULL; - if (!_PyArg_ParseStack(args, nargs, "C|O:numeric", - &chr, &default_value)) { + if (!_PyArg_CheckPositional("numeric", nargs, 1, 2)) { goto exit; } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("numeric", 1, "a unicode character", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0])) { + goto exit; + } + if (PyUnicode_GET_LENGTH(args[0]) != 1) { + _PyArg_BadArgument("numeric", 1, "a unicode character", args[0]); + goto exit; + } + chr = PyUnicode_READ_CHAR(args[0], 0); + if (nargs < 2) { + goto skip_optional; + } + default_value = args[1]; +skip_optional: return_value = unicodedata_UCD_numeric_impl(self, chr, default_value); exit: @@ -122,14 +170,14 @@ unicodedata_UCD_category(PyObject *self, PyObject *arg) int chr; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("category", "a unicode character", arg); + _PyArg_BadArgument("category", 0, "a unicode character", arg); goto exit; } if (PyUnicode_READY(arg)) { goto exit; } if (PyUnicode_GET_LENGTH(arg) != 1) { - _PyArg_BadArgument("category", "a unicode character", arg); + _PyArg_BadArgument("category", 0, "a unicode character", arg); goto exit; } chr = PyUnicode_READ_CHAR(arg, 0); @@ -160,14 +208,14 @@ unicodedata_UCD_bidirectional(PyObject *self, PyObject *arg) int chr; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("bidirectional", "a unicode character", arg); + _PyArg_BadArgument("bidirectional", 0, "a unicode character", arg); goto exit; } if (PyUnicode_READY(arg)) { goto exit; } if 
(PyUnicode_GET_LENGTH(arg) != 1) { - _PyArg_BadArgument("bidirectional", "a unicode character", arg); + _PyArg_BadArgument("bidirectional", 0, "a unicode character", arg); goto exit; } chr = PyUnicode_READ_CHAR(arg, 0); @@ -199,14 +247,14 @@ unicodedata_UCD_combining(PyObject *self, PyObject *arg) int _return_value; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("combining", "a unicode character", arg); + _PyArg_BadArgument("combining", 0, "a unicode character", arg); goto exit; } if (PyUnicode_READY(arg)) { goto exit; } if (PyUnicode_GET_LENGTH(arg) != 1) { - _PyArg_BadArgument("combining", "a unicode character", arg); + _PyArg_BadArgument("combining", 0, "a unicode character", arg); goto exit; } chr = PyUnicode_READ_CHAR(arg, 0); @@ -243,14 +291,14 @@ unicodedata_UCD_mirrored(PyObject *self, PyObject *arg) int _return_value; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("mirrored", "a unicode character", arg); + _PyArg_BadArgument("mirrored", 0, "a unicode character", arg); goto exit; } if (PyUnicode_READY(arg)) { goto exit; } if (PyUnicode_GET_LENGTH(arg) != 1) { - _PyArg_BadArgument("mirrored", "a unicode character", arg); + _PyArg_BadArgument("mirrored", 0, "a unicode character", arg); goto exit; } chr = PyUnicode_READ_CHAR(arg, 0); @@ -283,14 +331,14 @@ unicodedata_UCD_east_asian_width(PyObject *self, PyObject *arg) int chr; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("east_asian_width", "a unicode character", arg); + _PyArg_BadArgument("east_asian_width", 0, "a unicode character", arg); goto exit; } if (PyUnicode_READY(arg)) { goto exit; } if (PyUnicode_GET_LENGTH(arg) != 1) { - _PyArg_BadArgument("east_asian_width", "a unicode character", arg); + _PyArg_BadArgument("east_asian_width", 0, "a unicode character", arg); goto exit; } chr = PyUnicode_READ_CHAR(arg, 0); @@ -321,14 +369,14 @@ unicodedata_UCD_decomposition(PyObject *self, PyObject *arg) int chr; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("decomposition", "a unicode character", arg); + _PyArg_BadArgument("decomposition", 0, "a unicode character", arg); goto exit; } if (PyUnicode_READY(arg)) { goto exit; } if (PyUnicode_GET_LENGTH(arg) != 1) { - _PyArg_BadArgument("decomposition", "a unicode character", arg); + _PyArg_BadArgument("decomposition", 0, "a unicode character", arg); goto exit; } chr = PyUnicode_READ_CHAR(arg, 0); @@ -360,10 +408,25 @@ unicodedata_UCD_is_normalized(PyObject *self, PyObject *const *args, Py_ssize_t PyObject *form; PyObject *input; - if (!_PyArg_ParseStack(args, nargs, "UU:is_normalized", - &form, &input)) { + if (!_PyArg_CheckPositional("is_normalized", nargs, 2, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("is_normalized", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { goto exit; } + form = args[0]; + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("is_normalized", 2, "str", args[1]); + goto exit; + } + if (PyUnicode_READY(args[1]) == -1) { + goto exit; + } + input = args[1]; return_value = unicodedata_UCD_is_normalized_impl(self, form, input); exit: @@ -392,10 +455,25 @@ unicodedata_UCD_normalize(PyObject *self, PyObject *const *args, Py_ssize_t narg PyObject *form; PyObject *input; - if (!_PyArg_ParseStack(args, nargs, "UU:normalize", - &form, &input)) { + if (!_PyArg_CheckPositional("normalize", nargs, 2, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("normalize", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { goto exit; } + form = args[0]; + if 
(!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("normalize", 2, "str", args[1]); + goto exit; + } + if (PyUnicode_READY(args[1]) == -1) { + goto exit; + } + input = args[1]; return_value = unicodedata_UCD_normalize_impl(self, form, input); exit: @@ -424,10 +502,26 @@ unicodedata_UCD_name(PyObject *self, PyObject *const *args, Py_ssize_t nargs) int chr; PyObject *default_value = NULL; - if (!_PyArg_ParseStack(args, nargs, "C|O:name", - &chr, &default_value)) { + if (!_PyArg_CheckPositional("name", nargs, 1, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("name", 1, "a unicode character", args[0]); goto exit; } + if (PyUnicode_READY(args[0])) { + goto exit; + } + if (PyUnicode_GET_LENGTH(args[0]) != 1) { + _PyArg_BadArgument("name", 1, "a unicode character", args[0]); + goto exit; + } + chr = PyUnicode_READ_CHAR(args[0], 0); + if (nargs < 2) { + goto skip_optional; + } + default_value = args[1]; +skip_optional: return_value = unicodedata_UCD_name_impl(self, chr, default_value); exit: @@ -465,4 +559,4 @@ unicodedata_UCD_lookup(PyObject *self, PyObject *arg) exit: return return_value; } -/*[clinic end generated code: output=709241b99d010896 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=0fc850fe5b6b312c input=a9049054013a1b77]*/ diff --git a/Modules/clinic/zlibmodule.c.h b/Modules/clinic/zlibmodule.c.h index 87ad1ea00299..8e5f96aa143d 100644 --- a/Modules/clinic/zlibmodule.c.h +++ b/Modules/clinic/zlibmodule.c.h @@ -219,7 +219,7 @@ zlib_Compress_compress(compobject *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&data, 'C')) { - _PyArg_BadArgument("compress", "contiguous buffer", arg); + _PyArg_BadArgument("compress", 0, "contiguous buffer", arg); goto exit; } return_value = zlib_Compress_compress_impl(self, &data); @@ -305,10 +305,22 @@ zlib_Compress_flush(compobject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; int mode = Z_FINISH; - if (!_PyArg_ParseStack(args, nargs, "|i:flush", - &mode)) { + if (!_PyArg_CheckPositional("flush", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + mode = _PyLong_AsInt(args[0]); + if (mode == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = zlib_Compress_flush_impl(self, mode); exit: @@ -446,10 +458,16 @@ zlib_Decompress_flush(compobject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t length = DEF_BUF_SIZE; - if (!_PyArg_ParseStack(args, nargs, "|O&:flush", - ssize_t_converter, &length)) { + if (!_PyArg_CheckPositional("flush", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (!ssize_t_converter(args[0], &length)) { goto exit; } +skip_optional: return_value = zlib_Decompress_flush_impl(self, length); exit: @@ -480,10 +498,29 @@ zlib_adler32(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer data = {NULL, NULL}; unsigned int value = 1; - if (!_PyArg_ParseStack(args, nargs, "y*|I:adler32", - &data, &value)) { + if (!_PyArg_CheckPositional("adler32", nargs, 1, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("adler32", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + 
"integer argument expected, got float" ); + goto exit; + } + value = (unsigned int)PyLong_AsUnsignedLongMask(args[1]); + if (value == (unsigned int)-1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = zlib_adler32_impl(module, &data, value); exit: @@ -519,10 +556,29 @@ zlib_crc32(PyObject *module, PyObject *const *args, Py_ssize_t nargs) Py_buffer data = {NULL, NULL}; unsigned int value = 0; - if (!_PyArg_ParseStack(args, nargs, "y*|I:crc32", - &data, &value)) { + if (!_PyArg_CheckPositional("crc32", nargs, 1, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &data, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&data, 'C')) { + _PyArg_BadArgument("crc32", 1, "contiguous buffer", args[0]); + goto exit; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + value = (unsigned int)PyLong_AsUnsignedLongMask(args[1]); + if (value == (unsigned int)-1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = zlib_crc32_impl(module, &data, value); exit: @@ -557,4 +613,4 @@ zlib_crc32(PyObject *module, PyObject *const *args, Py_ssize_t nargs) #ifndef ZLIB_DECOMPRESS___DEEPCOPY___METHODDEF #define ZLIB_DECOMPRESS___DEEPCOPY___METHODDEF #endif /* !defined(ZLIB_DECOMPRESS___DEEPCOPY___METHODDEF) */ -/*[clinic end generated code: output=bea1e3c64573d9fd input=a9049054013a1b77]*/ +/*[clinic end generated code: output=b3acec2384f18782 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/bytearrayobject.c.h b/Objects/clinic/bytearrayobject.c.h index ec35eefa774f..2d7c74200775 100644 --- a/Objects/clinic/bytearrayobject.c.h +++ b/Objects/clinic/bytearrayobject.c.h @@ -100,8 +100,21 @@ bytearray_maketrans(void *null, PyObject *const *args, Py_ssize_t nargs) Py_buffer frm = {NULL, NULL}; Py_buffer to = {NULL, NULL}; - if (!_PyArg_ParseStack(args, nargs, "y*y*:maketrans", - &frm, &to)) { + if (!_PyArg_CheckPositional("maketrans", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &frm, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&frm, 'C')) { + _PyArg_BadArgument("maketrans", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyObject_GetBuffer(args[1], &to, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&to, 'C')) { + _PyArg_BadArgument("maketrans", 2, "contiguous buffer", args[1]); goto exit; } return_value = bytearray_maketrans_impl(&frm, &to); @@ -147,10 +160,44 @@ bytearray_replace(PyByteArrayObject *self, PyObject *const *args, Py_ssize_t nar Py_buffer new = {NULL, NULL}; Py_ssize_t count = -1; - if (!_PyArg_ParseStack(args, nargs, "y*y*|n:replace", - &old, &new, &count)) { + if (!_PyArg_CheckPositional("replace", nargs, 2, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &old, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&old, 'C')) { + _PyArg_BadArgument("replace", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyObject_GetBuffer(args[1], &new, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&new, 'C')) { + _PyArg_BadArgument("replace", 2, "contiguous buffer", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[2]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } 
+ if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + count = ival; + } +skip_optional: return_value = bytearray_replace_impl(self, &old, &new, count); exit: @@ -323,8 +370,27 @@ bytearray_insert(PyByteArrayObject *self, PyObject *const *args, Py_ssize_t narg Py_ssize_t index; int item; - if (!_PyArg_ParseStack(args, nargs, "nO&:insert", - &index, _getbytevalue, &item)) { + if (!_PyArg_CheckPositional("insert", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + index = ival; + } + if (!_getbytevalue(args[1], &item)) { goto exit; } return_value = bytearray_insert_impl(self, index, item); @@ -399,10 +465,30 @@ bytearray_pop(PyByteArrayObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t index = -1; - if (!_PyArg_ParseStack(args, nargs, "|n:pop", - &index)) { + if (!_PyArg_CheckPositional("pop", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + index = ival; + } +skip_optional: return_value = bytearray_pop_impl(self, index); exit: @@ -641,7 +727,7 @@ bytearray_fromhex(PyTypeObject *type, PyObject *arg) PyObject *string; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("fromhex", "str", arg); + _PyArg_BadArgument("fromhex", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -690,10 +776,22 @@ bytearray_reduce_ex(PyByteArrayObject *self, PyObject *const *args, Py_ssize_t n PyObject *return_value = NULL; int proto = 0; - if (!_PyArg_ParseStack(args, nargs, "|i:__reduce_ex__", - &proto)) { + if (!_PyArg_CheckPositional("__reduce_ex__", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + proto = _PyLong_AsInt(args[0]); + if (proto == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = bytearray_reduce_ex_impl(self, proto); exit: @@ -717,4 +815,4 @@ bytearray_sizeof(PyByteArrayObject *self, PyObject *Py_UNUSED(ignored)) { return bytearray_sizeof_impl(self); } -/*[clinic end generated code: output=cd3e13a1905a473c input=a9049054013a1b77]*/ +/*[clinic end generated code: output=010e281b823d7df1 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/bytesobject.c.h b/Objects/clinic/bytesobject.c.h index 1345b644d176..4e754d91af39 100644 --- a/Objects/clinic/bytesobject.c.h +++ b/Objects/clinic/bytesobject.c.h @@ -70,7 +70,7 @@ bytes_partition(PyBytesObject *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&sep, 'C')) { - _PyArg_BadArgument("partition", "contiguous buffer", arg); + _PyArg_BadArgument("partition", 0, "contiguous buffer", arg); goto exit; } return_value = bytes_partition_impl(self, &sep); @@ -113,7 +113,7 @@ bytes_rpartition(PyBytesObject *self, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&sep, 'C')) { - _PyArg_BadArgument("rpartition", "contiguous buffer", arg); + 
_PyArg_BadArgument("rpartition", 0, "contiguous buffer", arg); goto exit; } return_value = bytes_rpartition_impl(self, &sep); @@ -338,8 +338,21 @@ bytes_maketrans(void *null, PyObject *const *args, Py_ssize_t nargs) Py_buffer frm = {NULL, NULL}; Py_buffer to = {NULL, NULL}; - if (!_PyArg_ParseStack(args, nargs, "y*y*:maketrans", - &frm, &to)) { + if (!_PyArg_CheckPositional("maketrans", nargs, 2, 2)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &frm, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&frm, 'C')) { + _PyArg_BadArgument("maketrans", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyObject_GetBuffer(args[1], &to, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&to, 'C')) { + _PyArg_BadArgument("maketrans", 2, "contiguous buffer", args[1]); goto exit; } return_value = bytes_maketrans_impl(&frm, &to); @@ -385,10 +398,44 @@ bytes_replace(PyBytesObject *self, PyObject *const *args, Py_ssize_t nargs) Py_buffer new = {NULL, NULL}; Py_ssize_t count = -1; - if (!_PyArg_ParseStack(args, nargs, "y*y*|n:replace", - &old, &new, &count)) { + if (!_PyArg_CheckPositional("replace", nargs, 2, 3)) { + goto exit; + } + if (PyObject_GetBuffer(args[0], &old, PyBUF_SIMPLE) != 0) { goto exit; } + if (!PyBuffer_IsContiguous(&old, 'C')) { + _PyArg_BadArgument("replace", 1, "contiguous buffer", args[0]); + goto exit; + } + if (PyObject_GetBuffer(args[1], &new, PyBUF_SIMPLE) != 0) { + goto exit; + } + if (!PyBuffer_IsContiguous(&new, 'C')) { + _PyArg_BadArgument("replace", 2, "contiguous buffer", args[1]); + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[2]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + count = ival; + } +skip_optional: return_value = bytes_replace_impl(self, &old, &new, count); exit: @@ -500,7 +547,7 @@ bytes_fromhex(PyTypeObject *type, PyObject *arg) PyObject *string; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("fromhex", "str", arg); + _PyArg_BadArgument("fromhex", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -512,4 +559,4 @@ bytes_fromhex(PyTypeObject *type, PyObject *arg) exit: return return_value; } -/*[clinic end generated code: output=dc9aa04f0007ab11 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=810c8dfc72520ca4 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/floatobject.c.h b/Objects/clinic/floatobject.c.h index 28b24d0a093d..741ca3bac5b5 100644 --- a/Objects/clinic/floatobject.c.h +++ b/Objects/clinic/floatobject.c.h @@ -229,7 +229,7 @@ float___getformat__(PyTypeObject *type, PyObject *arg) const char *typestr; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("__getformat__", "str", arg); + _PyArg_BadArgument("__getformat__", 0, "str", arg); goto exit; } Py_ssize_t typestr_length; @@ -279,8 +279,33 @@ float___set_format__(PyTypeObject *type, PyObject *const *args, Py_ssize_t nargs const char *typestr; const char *fmt; - if (!_PyArg_ParseStack(args, nargs, "ss:__set_format__", - &typestr, &fmt)) { + if (!_PyArg_CheckPositional("__set_format__", nargs, 2, 2)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("__set_format__", 1, "str", args[0]); + goto exit; + } + Py_ssize_t typestr_length; + typestr = PyUnicode_AsUTF8AndSize(args[0], &typestr_length); 
+ if (typestr == NULL) { + goto exit; + } + if (strlen(typestr) != (size_t)typestr_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + } + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("__set_format__", 2, "str", args[1]); + goto exit; + } + Py_ssize_t fmt_length; + fmt = PyUnicode_AsUTF8AndSize(args[1], &fmt_length); + if (fmt == NULL) { + goto exit; + } + if (strlen(fmt) != (size_t)fmt_length) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); goto exit; } return_value = float___set_format___impl(type, typestr, fmt); @@ -308,7 +333,7 @@ float___format__(PyObject *self, PyObject *arg) PyObject *format_spec; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("__format__", "str", arg); + _PyArg_BadArgument("__format__", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -320,4 +345,4 @@ float___format__(PyObject *self, PyObject *arg) exit: return return_value; } -/*[clinic end generated code: output=e8f8be828462d58b input=a9049054013a1b77]*/ +/*[clinic end generated code: output=2631a60701a8f7d4 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/listobject.c.h b/Objects/clinic/listobject.c.h index 0097481dbaed..36174586141b 100644 --- a/Objects/clinic/listobject.c.h +++ b/Objects/clinic/listobject.c.h @@ -21,10 +21,27 @@ list_insert(PyListObject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t index; PyObject *object; - if (!_PyArg_ParseStack(args, nargs, "nO:insert", - &index, &object)) { + if (!_PyArg_CheckPositional("insert", nargs, 2, 2)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + index = ival; + } + object = args[1]; return_value = list_insert_impl(self, index, object); exit: @@ -105,10 +122,30 @@ list_pop(PyListObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; Py_ssize_t index = -1; - if (!_PyArg_ParseStack(args, nargs, "|n:pop", - &index)) { + if (!_PyArg_CheckPositional("pop", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + index = ival; + } +skip_optional: return_value = list_pop_impl(self, index); exit: @@ -187,10 +224,23 @@ list_index(PyListObject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t start = 0; Py_ssize_t stop = PY_SSIZE_T_MAX; - if (!_PyArg_ParseStack(args, nargs, "O|O&O&:index", - &value, _PyEval_SliceIndexNotNone, &start, _PyEval_SliceIndexNotNone, &stop)) { + if (!_PyArg_CheckPositional("index", nargs, 1, 3)) { + goto exit; + } + value = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (!_PyEval_SliceIndexNotNone(args[1], &start)) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (!_PyEval_SliceIndexNotNone(args[2], &stop)) { goto exit; } +skip_optional: return_value = list_index_impl(self, value, start, stop); exit: @@ -285,4 +335,4 @@ list___reversed__(PyListObject *self, PyObject *Py_UNUSED(ignored)) { return list___reversed___impl(self); } 
-/*[clinic end generated code: output=652ae4ee63a9de71 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=1f641f5aef3f886f input=a9049054013a1b77]*/ diff --git a/Objects/clinic/longobject.c.h b/Objects/clinic/longobject.c.h index 27cdf9e60509..67700239ff8d 100644 --- a/Objects/clinic/longobject.c.h +++ b/Objects/clinic/longobject.c.h @@ -59,7 +59,7 @@ int___format__(PyObject *self, PyObject *arg) PyObject *format_spec; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("__format__", "str", arg); + _PyArg_BadArgument("__format__", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -244,4 +244,4 @@ int_from_bytes(PyTypeObject *type, PyObject *const *args, Py_ssize_t nargs, PyOb exit: return return_value; } -/*[clinic end generated code: output=7436b5f4decdcf9d input=a9049054013a1b77]*/ +/*[clinic end generated code: output=3b91cda9d83abaa2 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/tupleobject.c.h b/Objects/clinic/tupleobject.c.h index 67d9f340e083..0096f0cd0f69 100644 --- a/Objects/clinic/tupleobject.c.h +++ b/Objects/clinic/tupleobject.c.h @@ -25,10 +25,23 @@ tuple_index(PyTupleObject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t start = 0; Py_ssize_t stop = PY_SSIZE_T_MAX; - if (!_PyArg_ParseStack(args, nargs, "O|O&O&:index", - &value, _PyEval_SliceIndexNotNone, &start, _PyEval_SliceIndexNotNone, &stop)) { + if (!_PyArg_CheckPositional("index", nargs, 1, 3)) { goto exit; } + value = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (!_PyEval_SliceIndexNotNone(args[1], &start)) { + goto exit; + } + if (nargs < 3) { + goto skip_optional; + } + if (!_PyEval_SliceIndexNotNone(args[2], &stop)) { + goto exit; + } +skip_optional: return_value = tuple_index_impl(self, value, start, stop); exit: @@ -95,4 +108,4 @@ tuple___getnewargs__(PyTupleObject *self, PyObject *Py_UNUSED(ignored)) { return tuple___getnewargs___impl(self); } -/*[clinic end generated code: output=0a6ebd2d16b09c5d input=a9049054013a1b77]*/ +/*[clinic end generated code: output=5312868473a41cfe input=a9049054013a1b77]*/ diff --git a/Objects/clinic/typeobject.c.h b/Objects/clinic/typeobject.c.h index 115a21807bef..fbe1261fb5a0 100644 --- a/Objects/clinic/typeobject.c.h +++ b/Objects/clinic/typeobject.c.h @@ -200,7 +200,7 @@ object___format__(PyObject *self, PyObject *arg) PyObject *format_spec; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("__format__", "str", arg); + _PyArg_BadArgument("__format__", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -248,4 +248,4 @@ object___dir__(PyObject *self, PyObject *Py_UNUSED(ignored)) { return object___dir___impl(self); } -/*[clinic end generated code: output=09f3453839e60136 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=ea5734413064fa7e input=a9049054013a1b77]*/ diff --git a/Objects/clinic/unicodeobject.c.h b/Objects/clinic/unicodeobject.c.h index 744a6ebd23fc..21e54f1011cf 100644 --- a/Objects/clinic/unicodeobject.c.h +++ b/Objects/clinic/unicodeobject.c.h @@ -83,10 +83,33 @@ unicode_center(PyObject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t width; Py_UCS4 fillchar = ' '; - if (!_PyArg_ParseStack(args, nargs, "n|O&:center", - &width, convert_uc, &fillchar)) { + if (!_PyArg_CheckPositional("center", nargs, 1, 2)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); 
+ Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + width = ival; + } + if (nargs < 2) { + goto skip_optional; + } + if (!convert_uc(args[1], &fillchar)) { + goto exit; + } +skip_optional: return_value = unicode_center_impl(self, width, fillchar); exit: @@ -452,10 +475,33 @@ unicode_ljust(PyObject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t width; Py_UCS4 fillchar = ' '; - if (!_PyArg_ParseStack(args, nargs, "n|O&:ljust", - &width, convert_uc, &fillchar)) { + if (!_PyArg_CheckPositional("ljust", nargs, 1, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); goto exit; } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + width = ival; + } + if (nargs < 2) { + goto skip_optional; + } + if (!convert_uc(args[1], &fillchar)) { + goto exit; + } +skip_optional: return_value = unicode_ljust_impl(self, width, fillchar); exit: @@ -601,10 +647,46 @@ unicode_replace(PyObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *new; Py_ssize_t count = -1; - if (!_PyArg_ParseStack(args, nargs, "UU|n:replace", - &old, &new, &count)) { + if (!_PyArg_CheckPositional("replace", nargs, 2, 3)) { + goto exit; + } + if (!PyUnicode_Check(args[0])) { + _PyArg_BadArgument("replace", 1, "str", args[0]); + goto exit; + } + if (PyUnicode_READY(args[0]) == -1) { + goto exit; + } + old = args[0]; + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("replace", 2, "str", args[1]); + goto exit; + } + if (PyUnicode_READY(args[1]) == -1) { goto exit; } + new = args[1]; + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[2]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + count = ival; + } +skip_optional: return_value = unicode_replace_impl(self, old, new, count); exit: @@ -632,10 +714,33 @@ unicode_rjust(PyObject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t width; Py_UCS4 fillchar = ' '; - if (!_PyArg_ParseStack(args, nargs, "n|O&:rjust", - &width, convert_uc, &fillchar)) { + if (!_PyArg_CheckPositional("rjust", nargs, 1, 2)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + width = ival; + } + if (nargs < 2) { + goto skip_optional; + } + if (!convert_uc(args[1], &fillchar)) { + goto exit; + } +skip_optional: return_value = unicode_rjust_impl(self, width, fillchar); exit: @@ -833,10 +938,33 @@ unicode_maketrans(void *null, PyObject *const *args, Py_ssize_t nargs) PyObject *y = NULL; PyObject *z = NULL; - if (!_PyArg_ParseStack(args, nargs, "O|UU:maketrans", - &x, &y, &z)) { + if (!_PyArg_CheckPositional("maketrans", nargs, 1, 3)) { + goto exit; + } + x = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("maketrans", 2, "str", args[1]); + goto exit; + } + if (PyUnicode_READY(args[1]) == -1) { + 
goto exit; + } + y = args[1]; + if (nargs < 3) { + goto skip_optional; + } + if (!PyUnicode_Check(args[2])) { + _PyArg_BadArgument("maketrans", 3, "str", args[2]); + goto exit; + } + if (PyUnicode_READY(args[2]) == -1) { goto exit; } + z = args[2]; +skip_optional: return_value = unicode_maketrans_impl(x, y, z); exit: @@ -940,7 +1068,7 @@ unicode___format__(PyObject *self, PyObject *arg) PyObject *format_spec; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("__format__", "str", arg); + _PyArg_BadArgument("__format__", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -970,4 +1098,4 @@ unicode_sizeof(PyObject *self, PyObject *Py_UNUSED(ignored)) { return unicode_sizeof_impl(self); } -/*[clinic end generated code: output=ff6acd5abd1998eb input=a9049054013a1b77]*/ +/*[clinic end generated code: output=73ad9670e00a2490 input=a9049054013a1b77]*/ diff --git a/Objects/stringlib/clinic/transmogrify.h.h b/Objects/stringlib/clinic/transmogrify.h.h index 7b7fd58c7e7a..15c43af61a28 100644 --- a/Objects/stringlib/clinic/transmogrify.h.h +++ b/Objects/stringlib/clinic/transmogrify.h.h @@ -55,10 +55,40 @@ stringlib_ljust(PyObject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t width; char fillchar = ' '; - if (!_PyArg_ParseStack(args, nargs, "n|c:ljust", - &width, &fillchar)) { + if (!_PyArg_CheckPositional("ljust", nargs, 1, 2)) { goto exit; } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + width = ival; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyBytes_Check(args[1]) && PyBytes_GET_SIZE(args[1]) == 1) { + fillchar = PyBytes_AS_STRING(args[1])[0]; + } + else if (PyByteArray_Check(args[1]) && PyByteArray_GET_SIZE(args[1]) == 1) { + fillchar = PyByteArray_AS_STRING(args[1])[0]; + } + else { + _PyArg_BadArgument("ljust", 2, "a byte string of length 1", args[1]); + goto exit; + } +skip_optional: return_value = stringlib_ljust_impl(self, width, fillchar); exit: @@ -86,10 +116,40 @@ stringlib_rjust(PyObject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t width; char fillchar = ' '; - if (!_PyArg_ParseStack(args, nargs, "n|c:rjust", - &width, &fillchar)) { + if (!_PyArg_CheckPositional("rjust", nargs, 1, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + width = ival; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyBytes_Check(args[1]) && PyBytes_GET_SIZE(args[1]) == 1) { + fillchar = PyBytes_AS_STRING(args[1])[0]; + } + else if (PyByteArray_Check(args[1]) && PyByteArray_GET_SIZE(args[1]) == 1) { + fillchar = PyByteArray_AS_STRING(args[1])[0]; + } + else { + _PyArg_BadArgument("rjust", 2, "a byte string of length 1", args[1]); goto exit; } +skip_optional: return_value = stringlib_rjust_impl(self, width, fillchar); exit: @@ -117,10 +177,40 @@ stringlib_center(PyObject *self, PyObject *const *args, Py_ssize_t nargs) Py_ssize_t width; char fillchar = ' '; - if (!_PyArg_ParseStack(args, nargs, "n|c:center", - &width, &fillchar)) { + if 
(!_PyArg_CheckPositional("center", nargs, 1, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + { + Py_ssize_t ival = -1; + PyObject *iobj = PyNumber_Index(args[0]); + if (iobj != NULL) { + ival = PyLong_AsSsize_t(iobj); + Py_DECREF(iobj); + } + if (ival == -1 && PyErr_Occurred()) { + goto exit; + } + width = ival; + } + if (nargs < 2) { + goto skip_optional; + } + if (PyBytes_Check(args[1]) && PyBytes_GET_SIZE(args[1]) == 1) { + fillchar = PyBytes_AS_STRING(args[1])[0]; + } + else if (PyByteArray_Check(args[1]) && PyByteArray_GET_SIZE(args[1]) == 1) { + fillchar = PyByteArray_AS_STRING(args[1])[0]; + } + else { + _PyArg_BadArgument("center", 2, "a byte string of length 1", args[1]); goto exit; } +skip_optional: return_value = stringlib_center_impl(self, width, fillchar); exit: @@ -169,4 +259,4 @@ stringlib_zfill(PyObject *self, PyObject *arg) exit: return return_value; } -/*[clinic end generated code: output=bf2ef501639e1190 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=787248a980f6a00e input=a9049054013a1b77]*/ diff --git a/PC/clinic/msvcrtmodule.c.h b/PC/clinic/msvcrtmodule.c.h index b1c588b0642a..d8e77d5f328d 100644 --- a/PC/clinic/msvcrtmodule.c.h +++ b/PC/clinic/msvcrtmodule.c.h @@ -50,8 +50,34 @@ msvcrt_locking(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int mode; long nbytes; - if (!_PyArg_ParseStack(args, nargs, "iil:locking", - &fd, &mode, &nbytes)) { + if (!_PyArg_CheckPositional("locking", nargs, 3, 3)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + mode = _PyLong_AsInt(args[1]); + if (mode == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + nbytes = PyLong_AsLong(args[2]); + if (nbytes == -1 && PyErr_Occurred()) { goto exit; } return_value = msvcrt_locking_impl(module, fd, mode, nbytes); @@ -85,8 +111,25 @@ msvcrt_setmode(PyObject *module, PyObject *const *args, Py_ssize_t nargs) int flags; long _return_value; - if (!_PyArg_ParseStack(args, nargs, "ii:setmode", - &fd, &flags)) { + if (!_PyArg_CheckPositional("setmode", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + fd = _PyLong_AsInt(args[0]); + if (fd == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + flags = _PyLong_AsInt(args[1]); + if (flags == -1 && PyErr_Occurred()) { goto exit; } _return_value = msvcrt_setmode_impl(module, fd, flags); @@ -332,7 +375,7 @@ msvcrt_putch(PyObject *module, PyObject *arg) char_value = PyByteArray_AS_STRING(arg)[0]; } else { - _PyArg_BadArgument("putch", "a byte string of length 1", arg); + _PyArg_BadArgument("putch", 0, "a byte string of length 1", arg); goto exit; } return_value = msvcrt_putch_impl(module, char_value); @@ -360,14 +403,14 @@ msvcrt_putwch(PyObject *module, PyObject *arg) int unicode_char; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("putwch", "a unicode character", arg); 
+ _PyArg_BadArgument("putwch", 0, "a unicode character", arg); goto exit; } if (PyUnicode_READY(arg)) { goto exit; } if (PyUnicode_GET_LENGTH(arg) != 1) { - _PyArg_BadArgument("putwch", "a unicode character", arg); + _PyArg_BadArgument("putwch", 0, "a unicode character", arg); goto exit; } unicode_char = PyUnicode_READ_CHAR(arg, 0); @@ -406,7 +449,7 @@ msvcrt_ungetch(PyObject *module, PyObject *arg) char_value = PyByteArray_AS_STRING(arg)[0]; } else { - _PyArg_BadArgument("ungetch", "a byte string of length 1", arg); + _PyArg_BadArgument("ungetch", 0, "a byte string of length 1", arg); goto exit; } return_value = msvcrt_ungetch_impl(module, char_value); @@ -434,14 +477,14 @@ msvcrt_ungetwch(PyObject *module, PyObject *arg) int unicode_char; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("ungetwch", "a unicode character", arg); + _PyArg_BadArgument("ungetwch", 0, "a unicode character", arg); goto exit; } if (PyUnicode_READY(arg)) { goto exit; } if (PyUnicode_GET_LENGTH(arg) != 1) { - _PyArg_BadArgument("ungetwch", "a unicode character", arg); + _PyArg_BadArgument("ungetwch", 0, "a unicode character", arg); goto exit; } unicode_char = PyUnicode_READ_CHAR(arg, 0); @@ -515,8 +558,25 @@ msvcrt_CrtSetReportMode(PyObject *module, PyObject *const *args, Py_ssize_t narg int mode; long _return_value; - if (!_PyArg_ParseStack(args, nargs, "ii:CrtSetReportMode", - &type, &mode)) { + if (!_PyArg_CheckPositional("CrtSetReportMode", nargs, 2, 2)) { + goto exit; + } + if (PyFloat_Check(args[0])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + type = _PyLong_AsInt(args[0]); + if (type == -1 && PyErr_Occurred()) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + mode = _PyLong_AsInt(args[1]); + if (mode == -1 && PyErr_Occurred()) { goto exit; } _return_value = msvcrt_CrtSetReportMode_impl(module, type, mode); @@ -619,4 +679,4 @@ msvcrt_SetErrorMode(PyObject *module, PyObject *arg) #ifndef MSVCRT_SET_ERROR_MODE_METHODDEF #define MSVCRT_SET_ERROR_MODE_METHODDEF #endif /* !defined(MSVCRT_SET_ERROR_MODE_METHODDEF) */ -/*[clinic end generated code: output=2530b4ff248563b4 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=816bc4f993893cea input=a9049054013a1b77]*/ diff --git a/PC/clinic/winreg.c.h b/PC/clinic/winreg.c.h index 3b100f9bf08a..21a9302d0af7 100644 --- a/PC/clinic/winreg.c.h +++ b/PC/clinic/winreg.c.h @@ -425,8 +425,19 @@ winreg_EnumKey(PyObject *module, PyObject *const *args, Py_ssize_t nargs) HKEY key; int index; - if (!_PyArg_ParseStack(args, nargs, "O&i:EnumKey", - clinic_HKEY_converter, &key, &index)) { + if (!_PyArg_CheckPositional("EnumKey", nargs, 2, 2)) { + goto exit; + } + if (!clinic_HKEY_converter(args[0], &key)) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + index = _PyLong_AsInt(args[1]); + if (index == -1 && PyErr_Occurred()) { goto exit; } return_value = winreg_EnumKey_impl(module, key, index); @@ -472,8 +483,19 @@ winreg_EnumValue(PyObject *module, PyObject *const *args, Py_ssize_t nargs) HKEY key; int index; - if (!_PyArg_ParseStack(args, nargs, "O&i:EnumValue", - clinic_HKEY_converter, &key, &index)) { + if (!_PyArg_CheckPositional("EnumValue", nargs, 2, 2)) { + goto exit; + } + if (!clinic_HKEY_converter(args[0], &key)) { + goto exit; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer 
argument expected, got float" ); + goto exit; + } + index = _PyLong_AsInt(args[1]); + if (index == -1 && PyErr_Occurred()) { goto exit; } return_value = winreg_EnumValue_impl(module, key, index); @@ -1095,4 +1117,4 @@ winreg_QueryReflectionKey(PyObject *module, PyObject *arg) exit: return return_value; } -/*[clinic end generated code: output=82bd56c524c6c3dd input=a9049054013a1b77]*/ +/*[clinic end generated code: output=bd491131d343ae7a input=a9049054013a1b77]*/ diff --git a/Python/clinic/bltinmodule.c.h b/Python/clinic/bltinmodule.c.h index 7f043dac4f38..68d8dccea651 100644 --- a/Python/clinic/bltinmodule.c.h +++ b/Python/clinic/bltinmodule.c.h @@ -94,10 +94,22 @@ builtin_format(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *value; PyObject *format_spec = NULL; - if (!_PyArg_ParseStack(args, nargs, "O|U:format", - &value, &format_spec)) { + if (!_PyArg_CheckPositional("format", nargs, 1, 2)) { goto exit; } + value = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("format", 2, "str", args[1]); + goto exit; + } + if (PyUnicode_READY(args[1]) == -1) { + goto exit; + } + format_spec = args[1]; +skip_optional: return_value = builtin_format_impl(module, value, format_spec); exit: @@ -717,4 +729,4 @@ builtin_issubclass(PyObject *module, PyObject *const *args, Py_ssize_t nargs) exit: return return_value; } -/*[clinic end generated code: output=ed300ebf3f6db530 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=11b5cd918bd7eb18 input=a9049054013a1b77]*/ diff --git a/Python/clinic/import.c.h b/Python/clinic/import.c.h index d34d68fed554..783ed4ebb96a 100644 --- a/Python/clinic/import.c.h +++ b/Python/clinic/import.c.h @@ -88,10 +88,22 @@ _imp__fix_co_filename(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyCodeObject *code; PyObject *path; - if (!_PyArg_ParseStack(args, nargs, "O!U:_fix_co_filename", - &PyCode_Type, &code, &path)) { + if (!_PyArg_CheckPositional("_fix_co_filename", nargs, 2, 2)) { goto exit; } + if (!PyObject_TypeCheck(args[0], &PyCode_Type)) { + _PyArg_BadArgument("_fix_co_filename", 1, (&PyCode_Type)->tp_name, args[0]); + goto exit; + } + code = (PyCodeObject *)args[0]; + if (!PyUnicode_Check(args[1])) { + _PyArg_BadArgument("_fix_co_filename", 2, "str", args[1]); + goto exit; + } + if (PyUnicode_READY(args[1]) == -1) { + goto exit; + } + path = args[1]; return_value = _imp__fix_co_filename_impl(module, code, path); exit: @@ -144,7 +156,7 @@ _imp_init_frozen(PyObject *module, PyObject *arg) PyObject *name; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("init_frozen", "str", arg); + _PyArg_BadArgument("init_frozen", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -176,7 +188,7 @@ _imp_get_frozen_object(PyObject *module, PyObject *arg) PyObject *name; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("get_frozen_object", "str", arg); + _PyArg_BadArgument("get_frozen_object", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -208,7 +220,7 @@ _imp_is_frozen_package(PyObject *module, PyObject *arg) PyObject *name; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("is_frozen_package", "str", arg); + _PyArg_BadArgument("is_frozen_package", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -240,7 +252,7 @@ _imp_is_builtin(PyObject *module, PyObject *arg) PyObject *name; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("is_builtin", "str", arg); + _PyArg_BadArgument("is_builtin", 0, "str", arg); goto exit; } if 
(PyUnicode_READY(arg) == -1) { @@ -272,7 +284,7 @@ _imp_is_frozen(PyObject *module, PyObject *arg) PyObject *name; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("is_frozen", "str", arg); + _PyArg_BadArgument("is_frozen", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -421,4 +433,4 @@ _imp_source_hash(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyOb #ifndef _IMP_EXEC_DYNAMIC_METHODDEF #define _IMP_EXEC_DYNAMIC_METHODDEF #endif /* !defined(_IMP_EXEC_DYNAMIC_METHODDEF) */ -/*[clinic end generated code: output=d8be58c9541122f1 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=22062cee6e8ba7f3 input=a9049054013a1b77]*/ diff --git a/Python/clinic/marshal.c.h b/Python/clinic/marshal.c.h index 516a31582cc1..ab4575340e25 100644 --- a/Python/clinic/marshal.c.h +++ b/Python/clinic/marshal.c.h @@ -34,10 +34,24 @@ marshal_dump(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *file; int version = Py_MARSHAL_VERSION; - if (!_PyArg_ParseStack(args, nargs, "OO|i:dump", - &value, &file, &version)) { + if (!_PyArg_CheckPositional("dump", nargs, 2, 3)) { goto exit; } + value = args[0]; + file = args[1]; + if (nargs < 3) { + goto skip_optional; + } + if (PyFloat_Check(args[2])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + version = _PyLong_AsInt(args[2]); + if (version == -1 && PyErr_Occurred()) { + goto exit; + } +skip_optional: return_value = marshal_dump_impl(module, value, file, version); exit: @@ -90,10 +104,23 @@ marshal_dumps(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *value; int version = Py_MARSHAL_VERSION; - if (!_PyArg_ParseStack(args, nargs, "O|i:dumps", - &value, &version)) { + if (!_PyArg_CheckPositional("dumps", nargs, 1, 2)) { + goto exit; + } + value = args[0]; + if (nargs < 2) { + goto skip_optional; + } + if (PyFloat_Check(args[1])) { + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + version = _PyLong_AsInt(args[1]); + if (version == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = marshal_dumps_impl(module, value, version); exit: @@ -125,7 +152,7 @@ marshal_loads(PyObject *module, PyObject *arg) goto exit; } if (!PyBuffer_IsContiguous(&bytes, 'C')) { - _PyArg_BadArgument("loads", "contiguous buffer", arg); + _PyArg_BadArgument("loads", 0, "contiguous buffer", arg); goto exit; } return_value = marshal_loads_impl(module, &bytes); @@ -138,4 +165,4 @@ marshal_loads(PyObject *module, PyObject *arg) return return_value; } -/*[clinic end generated code: output=8262e7e6c8cbc1ef input=a9049054013a1b77]*/ +/*[clinic end generated code: output=ae2bca1aa239e095 input=a9049054013a1b77]*/ diff --git a/Python/clinic/sysmodule.c.h b/Python/clinic/sysmodule.c.h index 2d9c6e2beee1..7370ab59ac00 100644 --- a/Python/clinic/sysmodule.c.h +++ b/Python/clinic/sysmodule.c.h @@ -175,7 +175,7 @@ sys_intern(PyObject *module, PyObject *arg) PyObject *s; if (!PyUnicode_Check(arg)) { - _PyArg_BadArgument("intern", "str", arg); + _PyArg_BadArgument("intern", 0, "str", arg); goto exit; } if (PyUnicode_READY(arg) == -1) { @@ -819,10 +819,22 @@ sys__getframe(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; int depth = 0; - if (!_PyArg_ParseStack(args, nargs, "|i:_getframe", - &depth)) { + if (!_PyArg_CheckPositional("_getframe", nargs, 0, 1)) { + goto exit; + } + if (nargs < 1) { + goto skip_optional; + } + if (PyFloat_Check(args[0])) { + 
PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + } + depth = _PyLong_AsInt(args[0]); + if (depth == -1 && PyErr_Occurred()) { goto exit; } +skip_optional: return_value = sys__getframe_impl(module, depth); exit: @@ -872,10 +884,15 @@ sys_call_tracing(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *func; PyObject *funcargs; - if (!_PyArg_ParseStack(args, nargs, "OO!:call_tracing", - &func, &PyTuple_Type, &funcargs)) { + if (!_PyArg_CheckPositional("call_tracing", nargs, 2, 2)) { + goto exit; + } + func = args[0]; + if (!PyTuple_Check(args[1])) { + _PyArg_BadArgument("call_tracing", 2, "tuple", args[1]); goto exit; } + funcargs = args[1]; return_value = sys_call_tracing_impl(module, func, funcargs); exit: @@ -1029,4 +1046,4 @@ sys_getandroidapilevel(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef SYS_GETANDROIDAPILEVEL_METHODDEF #define SYS_GETANDROIDAPILEVEL_METHODDEF #endif /* !defined(SYS_GETANDROIDAPILEVEL_METHODDEF) */ -/*[clinic end generated code: output=0e662f2e19293d57 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=6a5202e5bfe5e6bd input=a9049054013a1b77]*/ diff --git a/Python/getargs.c b/Python/getargs.c index 550d0df69506..c491169abe57 100644 --- a/Python/getargs.c +++ b/Python/getargs.c @@ -613,11 +613,21 @@ convertitem(PyObject *arg, const char **p_format, va_list *p_va, int flags, /* Format an error message generated by convertsimple(). */ void -_PyArg_BadArgument(const char *fname, const char *expected, PyObject *arg) +_PyArg_BadArgument(const char *fname, int iarg, + const char *expected, PyObject *arg) { - PyErr_Format(PyExc_TypeError, "%.200s() argument must be %.50s, not %.50s", - fname, expected, - arg == Py_None ? "None" : arg->ob_type->tp_name); + if (iarg) { + PyErr_Format(PyExc_TypeError, + "%.200s() argument %d must be %.50s, not %.50s", + fname, iarg, expected, + arg == Py_None ? "None" : arg->ob_type->tp_name); + } + else { + PyErr_Format(PyExc_TypeError, + "%.200s() argument must be %.50s, not %.50s", + fname, expected, + arg == Py_None ? 
"None" : arg->ob_type->tp_name); + } } static const char * @@ -2416,13 +2426,12 @@ skipitem(const char **p_format, va_list *p_va, int flags) } -static int -unpack_stack(PyObject *const *args, Py_ssize_t nargs, const char *name, - Py_ssize_t min, Py_ssize_t max, va_list vargs) -{ - Py_ssize_t i; - PyObject **o; +#undef _PyArg_CheckPositional +int +_PyArg_CheckPositional(const char *name, Py_ssize_t nargs, + Py_ssize_t min, Py_ssize_t max) +{ assert(min >= 0); assert(min <= max); @@ -2460,6 +2469,20 @@ unpack_stack(PyObject *const *args, Py_ssize_t nargs, const char *name, return 0; } + return 1; +} + +static int +unpack_stack(PyObject *const *args, Py_ssize_t nargs, const char *name, + Py_ssize_t min, Py_ssize_t max, va_list vargs) +{ + Py_ssize_t i; + PyObject **o; + + if (!_PyArg_CheckPositional(name, nargs, min, max)) { + return 0; + } + for (i = 0; i < nargs; i++) { o = va_arg(vargs, PyObject **); *o = args[i]; diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py index 3627725eecec..4087d3fec933 100755 --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -807,8 +807,13 @@ def insert_keywords(s): {c_basename}({self_type}{self_name}, PyObject *%s) """ % argname) - parsearg = converters[0].parse_arg(argname) - assert parsearg is not None + parsearg = converters[0].parse_arg(argname, 0) + if parsearg is None: + parsearg = """ + if (!PyArg_Parse(%s, "{format_units}:{name}", {parse_arguments})) {{ + goto exit; + }} + """ % argname parser_definition = parser_body(parser_prototype, normalize_snippet(parsearg, indent=4)) @@ -857,26 +862,58 @@ def insert_keywords(s): flags = "METH_FASTCALL" parser_prototype = parser_prototype_fastcall - - parser_definition = parser_body(parser_prototype, normalize_snippet(""" - if (!_PyArg_ParseStack(args, nargs, "{format_units}:{name}", - {parse_arguments})) {{ - goto exit; - }} - """, indent=4)) + nargs = 'nargs' + argname_fmt = 'args[%d]' else: # positional-only, but no option groups # we only need one call to PyArg_ParseTuple flags = "METH_VARARGS" parser_prototype = parser_prototype_varargs + nargs = 'PyTuple_GET_SIZE(args)' + argname_fmt = 'PyTuple_GET_ITEM(args, %d)' + + parser_code = [] + has_optional = False + for i, converter in enumerate(converters): + parsearg = converter.parse_arg(argname_fmt % i, i + 1) + if parsearg is None: + #print('Cannot convert %s %r for %s' % (converter.__class__.__name__, converter.format_unit, converter.name), file=sys.stderr) + parser_code = None + break + if has_optional or converter.default is not unspecified: + has_optional = True + parser_code.append(normalize_snippet(""" + if (%s < %d) {{ + goto skip_optional; + }} + """, indent=4) % (nargs, i + 1)) + parser_code.append(normalize_snippet(parsearg, indent=4)) - parser_definition = parser_body(parser_prototype, normalize_snippet(""" - if (!PyArg_ParseTuple(args, "{format_units}:{name}", - {parse_arguments})) {{ + if parser_code is not None: + parser_code.insert(0, normalize_snippet(""" + if (!_PyArg_CheckPositional("{name}", %s, {unpack_min}, {unpack_max})) {{ goto exit; }} - """, indent=4)) + """ % nargs, indent=4)) + if has_optional: + parser_code.append("skip_optional:") + else: + if not new_or_init: + parser_code = [normalize_snippet(""" + if (!_PyArg_ParseStack(args, nargs, "{format_units}:{name}", + {parse_arguments})) {{ + goto exit; + }} + """, indent=4)] + else: + parser_code = [normalize_snippet(""" + if (!PyArg_ParseTuple(args, "{format_units}:{name}", + {parse_arguments})) {{ + goto exit; + }} + """, indent=4)] + parser_definition = 
parser_body(parser_prototype, *parser_code) elif not new_or_init: flags = "METH_FASTCALL|METH_KEYWORDS" @@ -2536,7 +2573,7 @@ def pre_render(self): """ pass - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'O&': return """ if (!{converter}({argname}, &{paramname})) {{{{ @@ -2550,25 +2587,27 @@ def parse_arg(self, argname): typecheck, typename = type_checks[self.subclass_of] return """ if (!{typecheck}({argname})) {{{{ - _PyArg_BadArgument("{{name}}", "{typename}", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "{typename}", {argname}); goto exit; }}}} {paramname} = {cast}{argname}; """.format(argname=argname, paramname=self.name, + argnum=argnum, typecheck=typecheck, typename=typename, cast=cast) return """ if (!PyObject_TypeCheck({argname}, {subclass_of})) {{{{ - _PyArg_BadArgument("{{name}}", ({subclass_of})->tp_name, {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, ({subclass_of})->tp_name, {argname}); goto exit; }}}} {paramname} = {cast}{argname}; - """.format(argname=argname, paramname=self.name, + """.format(argname=argname, paramname=self.name, argnum=argnum, subclass_of=self.subclass_of, cast=cast) - return """ - if (!PyArg_Parse(%s, "{format_units}:{name}", {parse_arguments})) {{ - goto exit; - }} - """ % argname + if self.format_unit == 'O': + cast = '(%s)' % self.type if self.type != 'PyObject *' else '' + return """ + {paramname} = {cast}{argname}; + """.format(argname=argname, paramname=self.name, cast=cast) + return None type_checks = { '&PyLong_Type': ('PyLong_Check', 'int'), @@ -2598,7 +2637,7 @@ def converter_init(self, *, accept={object}): self.default = bool(self.default) self.c_default = str(int(self.default)) - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'i': return """ if (PyFloat_Check({argname})) {{{{ @@ -2618,7 +2657,7 @@ def parse_arg(self, argname): goto exit; }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class char_converter(CConverter): type = 'char' @@ -2635,7 +2674,7 @@ def converter_init(self): if self.c_default == '"\'"': self.c_default = r"'\''" - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'c': return """ if (PyBytes_Check({argname}) && PyBytes_GET_SIZE({argname}) == 1) {{{{ @@ -2645,11 +2684,11 @@ def parse_arg(self, argname): {paramname} = PyByteArray_AS_STRING({argname})[0]; }}}} else {{{{ - _PyArg_BadArgument("{{name}}", "a byte string of length 1", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "a byte string of length 1", {argname}); goto exit; }}}} - """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + """.format(argname=argname, paramname=self.name, argnum=argnum) + return super().parse_arg(argname, argnum) @add_legacy_c_converter('B', bitwise=True) @@ -2663,7 +2702,7 @@ def converter_init(self, *, bitwise=False): if bitwise: self.format_unit = 'B' - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'b': return """ if (PyFloat_Check({argname})) {{{{ @@ -2708,7 +2747,7 @@ def parse_arg(self, argname): }}}} }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class byte_converter(unsigned_char_converter): pass @@ -2718,7 +2757,7 @@ class short_converter(CConverter): format_unit = 'h' c_ignored_default = "0" - def parse_arg(self, argname): + def 
parse_arg(self, argname, argnum): if self.format_unit == 'h': return """ if (PyFloat_Check({argname})) {{{{ @@ -2746,7 +2785,7 @@ def parse_arg(self, argname): }}}} }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class unsigned_short_converter(CConverter): type = 'unsigned short' @@ -2759,6 +2798,21 @@ def converter_init(self, *, bitwise=False): else: self.converter = '_PyLong_UnsignedShort_Converter' + def parse_arg(self, argname, argnum): + if self.format_unit == 'H': + return """ + if (PyFloat_Check({argname})) {{{{ + PyErr_SetString(PyExc_TypeError, + "integer argument expected, got float" ); + goto exit; + }}}} + {paramname} = (unsigned short)PyLong_AsUnsignedLongMask({argname}); + if ({paramname} == (unsigned short)-1 && PyErr_Occurred()) {{{{ + goto exit; + }}}} + """.format(argname=argname, paramname=self.name) + return super().parse_arg(argname, argnum) + @add_legacy_c_converter('C', accept={str}) class int_converter(CConverter): type = 'int' @@ -2774,7 +2828,7 @@ def converter_init(self, *, accept={int}, type=None): if type != None: self.type = type - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'i': return """ if (PyFloat_Check({argname})) {{{{ @@ -2790,19 +2844,19 @@ def parse_arg(self, argname): elif self.format_unit == 'C': return """ if (!PyUnicode_Check({argname})) {{{{ - _PyArg_BadArgument("{{name}}", "a unicode character", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "a unicode character", {argname}); goto exit; }}}} if (PyUnicode_READY({argname})) {{{{ goto exit; }}}} if (PyUnicode_GET_LENGTH({argname}) != 1) {{{{ - _PyArg_BadArgument("{{name}}", "a unicode character", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "a unicode character", {argname}); goto exit; }}}} {paramname} = PyUnicode_READ_CHAR({argname}, 0); - """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + """.format(argname=argname, paramname=self.name, argnum=argnum) + return super().parse_arg(argname, argnum) class unsigned_int_converter(CConverter): type = 'unsigned int' @@ -2815,7 +2869,7 @@ def converter_init(self, *, bitwise=False): else: self.converter = '_PyLong_UnsignedInt_Converter' - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'I': return """ if (PyFloat_Check({argname})) {{{{ @@ -2828,7 +2882,7 @@ def parse_arg(self, argname): goto exit; }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class long_converter(CConverter): type = 'long' @@ -2836,7 +2890,7 @@ class long_converter(CConverter): format_unit = 'l' c_ignored_default = "0" - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'l': return """ if (PyFloat_Check({argname})) {{{{ @@ -2849,7 +2903,7 @@ def parse_arg(self, argname): goto exit; }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class unsigned_long_converter(CConverter): type = 'unsigned long' @@ -2862,16 +2916,16 @@ def converter_init(self, *, bitwise=False): else: self.converter = '_PyLong_UnsignedLong_Converter' - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'k': return """ if (!PyLong_Check({argname})) {{{{ - _PyArg_BadArgument("{{name}}", "int", {argname}); + _PyArg_BadArgument("{{name}}", 
{argnum}, "int", {argname}); goto exit; }}}} {paramname} = PyLong_AsUnsignedLongMask({argname}); - """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + """.format(argname=argname, paramname=self.name, argnum=argnum) + return super().parse_arg(argname, argnum) class long_long_converter(CConverter): type = 'long long' @@ -2879,7 +2933,7 @@ class long_long_converter(CConverter): format_unit = 'L' c_ignored_default = "0" - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'L': return """ if (PyFloat_Check({argname})) {{{{ @@ -2892,7 +2946,7 @@ def parse_arg(self, argname): goto exit; }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class unsigned_long_long_converter(CConverter): type = 'unsigned long long' @@ -2905,16 +2959,16 @@ def converter_init(self, *, bitwise=False): else: self.converter = '_PyLong_UnsignedLongLong_Converter' - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'K': return """ if (!PyLong_Check({argname})) {{{{ - _PyArg_BadArgument("{{name}}", "int", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "int", {argname}); goto exit; }}}} {paramname} = PyLong_AsUnsignedLongLongMask({argname}); - """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + """.format(argname=argname, paramname=self.name, argnum=argnum) + return super().parse_arg(argname, argnum) class Py_ssize_t_converter(CConverter): type = 'Py_ssize_t' @@ -2929,7 +2983,7 @@ def converter_init(self, *, accept={int}): else: fail("Py_ssize_t_converter: illegal 'accept' argument " + repr(accept)) - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'n': return """ if (PyFloat_Check({argname})) {{{{ @@ -2950,7 +3004,7 @@ def parse_arg(self, argname): {paramname} = ival; }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class slice_index_converter(CConverter): @@ -2969,7 +3023,7 @@ class size_t_converter(CConverter): converter = '_PyLong_Size_t_Converter' c_ignored_default = "0" - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'n': return """ {paramname} = PyNumber_AsSsize_t({argname}, PyExc_OverflowError); @@ -2977,7 +3031,7 @@ def parse_arg(self, argname): goto exit; }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class float_converter(CConverter): @@ -2986,7 +3040,7 @@ class float_converter(CConverter): format_unit = 'f' c_ignored_default = "0.0" - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'f': return """ {paramname} = (float) PyFloat_AsDouble({argname}); @@ -2994,7 +3048,7 @@ def parse_arg(self, argname): goto exit; }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class double_converter(CConverter): type = 'double' @@ -3002,7 +3056,7 @@ class double_converter(CConverter): format_unit = 'd' c_ignored_default = "0.0" - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'd': return """ {paramname} = PyFloat_AsDouble({argname}); @@ -3010,7 +3064,7 @@ def parse_arg(self, argname): goto exit; }}}} """.format(argname=argname, paramname=self.name) - return 
super().parse_arg(argname) + return super().parse_arg(argname, argnum) class Py_complex_converter(CConverter): @@ -3019,7 +3073,7 @@ class Py_complex_converter(CConverter): format_unit = 'D' c_ignored_default = "{0.0, 0.0}" - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'D': return """ {paramname} = PyComplex_AsCComplex({argname}); @@ -3027,7 +3081,7 @@ def parse_arg(self, argname): goto exit; }}}} """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + return super().parse_arg(argname, argnum) class object_converter(CConverter): @@ -3093,11 +3147,11 @@ def cleanup(self): name = self.name return "".join(["if (", name, ") {\n PyMem_FREE(", name, ");\n}\n"]) - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 's': return """ if (!PyUnicode_Check({argname})) {{{{ - _PyArg_BadArgument("{{name}}", "str", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "str", {argname}); goto exit; }}}} Py_ssize_t {paramname}_length; @@ -3109,8 +3163,29 @@ def parse_arg(self, argname): PyErr_SetString(PyExc_ValueError, "embedded null character"); goto exit; }}}} - """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + """.format(argname=argname, paramname=self.name, argnum=argnum) + if self.format_unit == 'z': + return """ + if ({argname} == Py_None) {{{{ + {paramname} = NULL; + }}}} + else if (PyUnicode_Check({argname})) {{{{ + Py_ssize_t {paramname}_length; + {paramname} = PyUnicode_AsUTF8AndSize({argname}, &{paramname}_length); + if ({paramname} == NULL) {{{{ + goto exit; + }}}} + if (strlen({paramname}) != (size_t){paramname}_length) {{{{ + PyErr_SetString(PyExc_ValueError, "embedded null character"); + goto exit; + }}}} + }}}} + else {{{{ + _PyArg_BadArgument("{{name}}", {argnum}, "str or None", {argname}); + goto exit; + }}}} + """.format(argname=argname, paramname=self.name, argnum=argnum) + return super().parse_arg(argname, argnum) # # This is the fourth or fifth rewrite of registering all the @@ -3165,51 +3240,53 @@ class PyBytesObject_converter(CConverter): format_unit = 'S' # accept = {bytes} - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'S': return """ if (!PyBytes_Check({argname})) {{{{ - _PyArg_BadArgument("{{name}}", "bytes", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "bytes", {argname}); goto exit; }}}} {paramname} = ({type}){argname}; - """.format(argname=argname, paramname=self.name, type=self.type) - return super().parse_arg(argname) + """.format(argname=argname, paramname=self.name, argnum=argnum, + type=self.type) + return super().parse_arg(argname, argnum) class PyByteArrayObject_converter(CConverter): type = 'PyByteArrayObject *' format_unit = 'Y' # accept = {bytearray} - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'Y': return """ if (!PyByteArray_Check({argname})) {{{{ - _PyArg_BadArgument("{{name}}", "bytearray", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "bytearray", {argname}); goto exit; }}}} {paramname} = ({type}){argname}; - """.format(argname=argname, paramname=self.name, type=self.type) - return super().parse_arg(argname) + """.format(argname=argname, paramname=self.name, argnum=argnum, + type=self.type) + return super().parse_arg(argname, argnum) class unicode_converter(CConverter): type = 'PyObject *' default_type = (str, Null, NoneType) format_unit = 'U' - def parse_arg(self, argname): + def 
parse_arg(self, argname, argnum): if self.format_unit == 'U': return """ if (!PyUnicode_Check({argname})) {{{{ - _PyArg_BadArgument("{{name}}", "str", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "str", {argname}); goto exit; }}}} if (PyUnicode_READY({argname}) == -1) {{{{ goto exit; }}}} {paramname} = {argname}; - """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + """.format(argname=argname, paramname=self.name, argnum=argnum) + return super().parse_arg(argname, argnum) @add_legacy_c_converter('u#', zeroes=True) @add_legacy_c_converter('Z', accept={str, NoneType}) @@ -3258,17 +3335,17 @@ def cleanup(self): name = self.name return "".join(["if (", name, ".obj) {\n PyBuffer_Release(&", name, ");\n}\n"]) - def parse_arg(self, argname): + def parse_arg(self, argname, argnum): if self.format_unit == 'y*': return """ if (PyObject_GetBuffer({argname}, &{paramname}, PyBUF_SIMPLE) != 0) {{{{ goto exit; }}}} if (!PyBuffer_IsContiguous(&{paramname}, 'C')) {{{{ - _PyArg_BadArgument("{{name}}", "contiguous buffer", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "contiguous buffer", {argname}); goto exit; }}}} - """.format(argname=argname, paramname=self.name) + """.format(argname=argname, paramname=self.name, argnum=argnum) elif self.format_unit == 's*': return """ if (PyUnicode_Check({argname})) {{{{ @@ -3284,24 +3361,24 @@ def parse_arg(self, argname): goto exit; }}}} if (!PyBuffer_IsContiguous(&{paramname}, 'C')) {{{{ - _PyArg_BadArgument("{{name}}", "contiguous buffer", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "contiguous buffer", {argname}); goto exit; }}}} }}}} - """.format(argname=argname, paramname=self.name) + """.format(argname=argname, paramname=self.name, argnum=argnum) elif self.format_unit == 'w*': return """ if (PyObject_GetBuffer({argname}, &{paramname}, PyBUF_WRITABLE) < 0) {{{{ PyErr_Clear(); - _PyArg_BadArgument("{{name}}", "read-write bytes-like object", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "read-write bytes-like object", {argname}); goto exit; }}}} if (!PyBuffer_IsContiguous(&{paramname}, 'C')) {{{{ - _PyArg_BadArgument("{{name}}", "contiguous buffer", {argname}); + _PyArg_BadArgument("{{name}}", {argnum}, "contiguous buffer", {argname}); goto exit; }}}} - """.format(argname=argname, paramname=self.name) - return super().parse_arg(argname) + """.format(argname=argname, paramname=self.name, argnum=argnum) + return super().parse_arg(argname, argnum) def correct_name_for_self(f): From webhook-mailer at python.org Fri Jan 11 09:01:54 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 11 Jan 2019 14:01:54 -0000 Subject: [Python-checkins] bpo-32710: Fix _overlapped.Overlapped memory leaks (GH-11489) Message-ID: https://github.com/python/cpython/commit/059997d78ed1a1a5a364b1846ac972c98c704927 commit: 059997d78ed1a1a5a364b1846ac972c98c704927 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-11T06:01:50-08:00 summary: bpo-32710: Fix _overlapped.Overlapped memory leaks (GH-11489) Fix memory leaks in asyncio ProactorEventLoop on overlapped operation failures. Changes: * Implement the tp_traverse slot in the _overlapped.Overlapped type to help to break reference cycles and identify referrers in the garbage collector. * Always clear overlapped on failure: not only set type to TYPE_NOT_STARTED, but release also resources. 
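As a rough illustration of the first bullet (a hypothetical "holder" type, not the actual Overlapped code, which appears in the diff below): a tp_traverse slot simply reports every PyObject the instance keeps alive, so the cyclic garbage collector can discover reference cycles passing through the object and enumerate its referrers.

typedef struct {
    PyObject_HEAD
    PyObject *allocated_buffer;   /* internally allocated read buffer, may be NULL */
    PyObject *user_buffer_obj;    /* caller-supplied buffer object, may be NULL */
} holderobject;

static int
holder_traverse(holderobject *self, visitproc visit, void *arg)
{
    /* Py_VISIT calls the GC's visit callback for each owned reference
       (it is a no-op for NULL) and returns early on a non-zero status. */
    Py_VISIT(self->allocated_buffer);
    Py_VISIT(self->user_buffer_obj);
    return 0;
}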
(cherry picked from commit 5485085b324a45307c1ff4ec7d85b5998d7d5e0d) Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Library/2019-01-10-15-55-10.bpo-32710.KwECPu.rst M Modules/overlapped.c diff --git a/Misc/NEWS.d/next/Library/2019-01-10-15-55-10.bpo-32710.KwECPu.rst b/Misc/NEWS.d/next/Library/2019-01-10-15-55-10.bpo-32710.KwECPu.rst new file mode 100644 index 000000000000..9f7a95a0aaff --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-10-15-55-10.bpo-32710.KwECPu.rst @@ -0,0 +1,2 @@ +Fix memory leaks in asyncio ProactorEventLoop on overlapped operation +failure. diff --git a/Modules/overlapped.c b/Modules/overlapped.c index 4ef5a96b1e4d..7798856bee73 100644 --- a/Modules/overlapped.c +++ b/Modules/overlapped.c @@ -561,6 +561,28 @@ Overlapped_new(PyTypeObject *type, PyObject *args, PyObject *kwds) return (PyObject *)self; } + +/* Note (bpo-32710): OverlappedType.tp_clear is not defined to not release + buffers while overlapped are still running, to prevent a crash. */ +static int +Overlapped_clear(OverlappedObject *self) +{ + switch (self->type) { + case TYPE_READ: + case TYPE_ACCEPT: + Py_CLEAR(self->allocated_buffer); + break; + case TYPE_WRITE: + case TYPE_READINTO: + if (self->user_buffer.obj) { + PyBuffer_Release(&self->user_buffer); + } + break; + } + self->type = TYPE_NOT_STARTED; + return 0; +} + static void Overlapped_dealloc(OverlappedObject *self) { @@ -594,20 +616,11 @@ Overlapped_dealloc(OverlappedObject *self) } } - if (self->overlapped.hEvent != NULL) + if (self->overlapped.hEvent != NULL) { CloseHandle(self->overlapped.hEvent); - - switch (self->type) { - case TYPE_READ: - case TYPE_ACCEPT: - Py_CLEAR(self->allocated_buffer); - break; - case TYPE_WRITE: - case TYPE_READINTO: - if (self->user_buffer.obj) - PyBuffer_Release(&self->user_buffer); - break; } + + Overlapped_clear(self); PyObject_Del(self); SetLastError(olderr); } @@ -723,8 +736,7 @@ do_ReadFile(OverlappedObject *self, HANDLE handle, case ERROR_IO_PENDING: Py_RETURN_NONE; default: - PyBuffer_Release(&self->user_buffer); - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -827,7 +839,7 @@ do_WSARecv(OverlappedObject *self, HANDLE handle, case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -955,7 +967,7 @@ Overlapped_WriteFile(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1012,8 +1024,7 @@ Overlapped_WSASend(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - PyBuffer_Release(&self->user_buffer); - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1063,7 +1074,7 @@ Overlapped_AcceptEx(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1155,7 +1166,7 @@ Overlapped_ConnectEx(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1194,7 +1205,7 @@ Overlapped_DisconnectEx(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1249,7 +1260,7 @@ 
Overlapped_TransmitFile(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_NONE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1290,7 +1301,7 @@ Overlapped_ConnectNamedPipe(OverlappedObject *self, PyObject *args) case ERROR_IO_PENDING: Py_RETURN_FALSE; default: - self->type = TYPE_NOT_STARTED; + Overlapped_clear(self); return SetFromWindowsErr(err); } } @@ -1340,6 +1351,25 @@ Overlapped_getpending(OverlappedObject *self) self->type != TYPE_NOT_STARTED); } +static int +Overlapped_traverse(OverlappedObject *self, visitproc visit, void *arg) +{ + switch (self->type) { + case TYPE_READ: + case TYPE_ACCEPT: + Py_VISIT(self->allocated_buffer); + break; + case TYPE_WRITE: + case TYPE_READINTO: + if (self->user_buffer.obj) { + Py_VISIT(&self->user_buffer.obj); + } + break; + } + return 0; +} + + static PyMethodDef Overlapped_methods[] = { {"getresult", (PyCFunction) Overlapped_getresult, METH_VARARGS, Overlapped_getresult_doc}, @@ -1410,7 +1440,7 @@ PyTypeObject OverlappedType = { /* tp_as_buffer */ 0, /* tp_flags */ Py_TPFLAGS_DEFAULT, /* tp_doc */ "OVERLAPPED structure wrapper", - /* tp_traverse */ 0, + /* tp_traverse */ (traverseproc)Overlapped_traverse, /* tp_clear */ 0, /* tp_richcompare */ 0, /* tp_weaklistoffset */ 0, From webhook-mailer at python.org Fri Jan 11 11:01:52 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Fri, 11 Jan 2019 16:01:52 -0000 Subject: [Python-checkins] bpo-35582: Argument Clinic: Optimize the "all boring objects" case. (GH-11520) Message-ID: https://github.com/python/cpython/commit/2a39d251f07d4c620e3b9a1848e3d1eb3067be64 commit: 2a39d251f07d4c620e3b9a1848e3d1eb3067be64 branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-11T18:01:42+02:00 summary: bpo-35582: Argument Clinic: Optimize the "all boring objects" case. (GH-11520) Use _PyArg_CheckPositional() and inlined code instead of PyArg_UnpackTuple() and _PyArg_UnpackStack() if all parameters are positional and use the "object" converter. 
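As a minimal sketch of the effect on the generated code (hypothetical names boring_func/boring_func_impl; the real before/after for test_objects_converter is in the Lib/test/clinic.test hunk below), a positional-only function whose parameters are all plain "object" converters no longer funnels its arguments through the varargs helper _PyArg_UnpackStack(); the arity check and the assignments are now emitted inline:

static PyObject *boring_func_impl(PyObject *module, PyObject *a, PyObject *b);

static PyObject *
boring_func(PyObject *module, PyObject *const *args, Py_ssize_t nargs)
{
    PyObject *return_value = NULL;
    PyObject *a;
    PyObject *b = NULL;          /* optional second argument */

    if (!_PyArg_CheckPositional("boring_func", nargs, 1, 2)) {
        goto exit;
    }
    a = args[0];
    if (nargs < 2) {
        goto skip_optional;
    }
    b = args[1];
skip_optional:
    return_value = boring_func_impl(module, a, b);

exit:
    return return_value;
}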
files: M Lib/test/clinic.test M Modules/_io/clinic/bufferedio.c.h M Modules/_io/clinic/bytesio.c.h M Modules/_io/clinic/fileio.c.h M Modules/_io/clinic/textio.c.h M Modules/cjkcodecs/clinic/multibytecodec.c.h M Modules/clinic/_abc.c.h M Modules/clinic/_cursesmodule.c.h M Modules/clinic/_elementtree.c.h M Modules/clinic/_gdbmmodule.c.h M Modules/clinic/_heapqmodule.c.h M Modules/clinic/_operator.c.h M Modules/clinic/_pickle.c.h M Modules/clinic/_sre.c.h M Modules/clinic/itertoolsmodule.c.h M Modules/clinic/mathmodule.c.h M Modules/clinic/selectmodule.c.h M Objects/clinic/bytearrayobject.c.h M Objects/clinic/bytesobject.c.h M Objects/clinic/dictobject.c.h M Objects/clinic/enumobject.c.h M Objects/clinic/floatobject.c.h M Objects/clinic/listobject.c.h M Objects/clinic/tupleobject.c.h M Objects/clinic/unicodeobject.c.h M Python/clinic/bltinmodule.c.h M Python/clinic/context.c.h M Python/clinic/import.c.h M Python/clinic/sysmodule.c.h M Tools/clinic/clinic.py diff --git a/Lib/test/clinic.test b/Lib/test/clinic.test index 7ae8f9642e8a..b8f2331b4375 100644 --- a/Lib/test/clinic.test +++ b/Lib/test/clinic.test @@ -106,11 +106,15 @@ test_objects_converter(PyObject *module, PyObject *const *args, Py_ssize_t nargs PyObject *a; PyObject *b = NULL; - if (!_PyArg_UnpackStack(args, nargs, "test_objects_converter", - 1, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("test_objects_converter", nargs, 1, 2)) { goto exit; } + a = args[0]; + if (nargs < 2) { + goto skip_optional; + } + b = args[1]; +skip_optional: return_value = test_objects_converter_impl(module, a, b); exit: @@ -119,7 +123,7 @@ exit: static PyObject * test_objects_converter_impl(PyObject *module, PyObject *a, PyObject *b) -/*[clinic end generated code: output=068c25d6ae8cd1ef input=4cbb3d9edd2a36f3]*/ +/*[clinic end generated code: output=58009c0e42b4834e input=4cbb3d9edd2a36f3]*/ /*[clinic input] test_object_converter_subclass_of diff --git a/Modules/_io/clinic/bufferedio.c.h b/Modules/_io/clinic/bufferedio.c.h index 6345b9e16e85..60a6dac742fe 100644 --- a/Modules/_io/clinic/bufferedio.c.h +++ b/Modules/_io/clinic/bufferedio.c.h @@ -389,11 +389,14 @@ _io__Buffered_truncate(buffered *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *pos = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "truncate", - 0, 1, - &pos)) { + if (!_PyArg_CheckPositional("truncate", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + pos = args[0]; +skip_optional: return_value = _io__Buffered_truncate_impl(self, pos); exit: @@ -591,4 +594,4 @@ _io_BufferedRandom___init__(PyObject *self, PyObject *args, PyObject *kwargs) exit: return return_value; } -/*[clinic end generated code: output=a85f61f495feff5c input=a9049054013a1b77]*/ +/*[clinic end generated code: output=b7f51040defff318 input=a9049054013a1b77]*/ diff --git a/Modules/_io/clinic/bytesio.c.h b/Modules/_io/clinic/bytesio.c.h index 558841697b3a..54c5123a008c 100644 --- a/Modules/_io/clinic/bytesio.c.h +++ b/Modules/_io/clinic/bytesio.c.h @@ -282,11 +282,14 @@ _io_BytesIO_readlines(bytesio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *arg = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "readlines", - 0, 1, - &arg)) { + if (!_PyArg_CheckPositional("readlines", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + arg = args[0]; +skip_optional: return_value = _io_BytesIO_readlines_impl(self, arg); exit: @@ -503,4 +506,4 @@ _io_BytesIO___init__(PyObject *self, PyObject *args, PyObject 
*kwargs) exit: return return_value; } -/*[clinic end generated code: output=5c68eb481fa960bf input=a9049054013a1b77]*/ +/*[clinic end generated code: output=a6b47dd7921abfcd input=a9049054013a1b77]*/ diff --git a/Modules/_io/clinic/fileio.c.h b/Modules/_io/clinic/fileio.c.h index 280549e996d8..e7d49d47f8ae 100644 --- a/Modules/_io/clinic/fileio.c.h +++ b/Modules/_io/clinic/fileio.c.h @@ -368,11 +368,14 @@ _io_FileIO_truncate(fileio *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *posobj = NULL; - if (!_PyArg_UnpackStack(args, nargs, "truncate", - 0, 1, - &posobj)) { + if (!_PyArg_CheckPositional("truncate", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + posobj = args[0]; +skip_optional: return_value = _io_FileIO_truncate_impl(self, posobj); exit: @@ -402,4 +405,4 @@ _io_FileIO_isatty(fileio *self, PyObject *Py_UNUSED(ignored)) #ifndef _IO_FILEIO_TRUNCATE_METHODDEF #define _IO_FILEIO_TRUNCATE_METHODDEF #endif /* !defined(_IO_FILEIO_TRUNCATE_METHODDEF) */ -/*[clinic end generated code: output=4cf4e5f0cd656b11 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=b6f327457938d4dd input=a9049054013a1b77]*/ diff --git a/Modules/_io/clinic/textio.c.h b/Modules/_io/clinic/textio.c.h index 0ff0324f5a4f..2a13abfd9671 100644 --- a/Modules/_io/clinic/textio.c.h +++ b/Modules/_io/clinic/textio.c.h @@ -419,11 +419,14 @@ _io_TextIOWrapper_truncate(textio *self, PyObject *const *args, Py_ssize_t nargs PyObject *return_value = NULL; PyObject *pos = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "truncate", - 0, 1, - &pos)) { + if (!_PyArg_CheckPositional("truncate", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + pos = args[0]; +skip_optional: return_value = _io_TextIOWrapper_truncate_impl(self, pos); exit: @@ -548,4 +551,4 @@ _io_TextIOWrapper_close(textio *self, PyObject *Py_UNUSED(ignored)) { return _io_TextIOWrapper_close_impl(self); } -/*[clinic end generated code: output=8bdd1035bf878d6f input=a9049054013a1b77]*/ +/*[clinic end generated code: output=c3d1b2a5d2d2d429 input=a9049054013a1b77]*/ diff --git a/Modules/cjkcodecs/clinic/multibytecodec.c.h b/Modules/cjkcodecs/clinic/multibytecodec.c.h index 871bf339afc1..c62a64179bec 100644 --- a/Modules/cjkcodecs/clinic/multibytecodec.c.h +++ b/Modules/cjkcodecs/clinic/multibytecodec.c.h @@ -296,11 +296,14 @@ _multibytecodec_MultibyteStreamReader_read(MultibyteStreamReaderObject *self, Py PyObject *return_value = NULL; PyObject *sizeobj = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "read", - 0, 1, - &sizeobj)) { + if (!_PyArg_CheckPositional("read", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + sizeobj = args[0]; +skip_optional: return_value = _multibytecodec_MultibyteStreamReader_read_impl(self, sizeobj); exit: @@ -325,11 +328,14 @@ _multibytecodec_MultibyteStreamReader_readline(MultibyteStreamReaderObject *self PyObject *return_value = NULL; PyObject *sizeobj = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "readline", - 0, 1, - &sizeobj)) { + if (!_PyArg_CheckPositional("readline", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + sizeobj = args[0]; +skip_optional: return_value = _multibytecodec_MultibyteStreamReader_readline_impl(self, sizeobj); exit: @@ -354,11 +360,14 @@ _multibytecodec_MultibyteStreamReader_readlines(MultibyteStreamReaderObject *sel PyObject *return_value = NULL; PyObject *sizehintobj = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "readlines", - 0, 1, - 
&sizehintobj)) { + if (!_PyArg_CheckPositional("readlines", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + sizehintobj = args[0]; +skip_optional: return_value = _multibytecodec_MultibyteStreamReader_readlines_impl(self, sizehintobj); exit: @@ -422,4 +431,4 @@ PyDoc_STRVAR(_multibytecodec___create_codec__doc__, #define _MULTIBYTECODEC___CREATE_CODEC_METHODDEF \ {"__create_codec", (PyCFunction)_multibytecodec___create_codec, METH_O, _multibytecodec___create_codec__doc__}, -/*[clinic end generated code: output=2ed7030b28a79029 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=bcd6311010557faf input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_abc.c.h b/Modules/clinic/_abc.c.h index 22ddb6c100fb..62c6552ba645 100644 --- a/Modules/clinic/_abc.c.h +++ b/Modules/clinic/_abc.c.h @@ -65,11 +65,11 @@ _abc__abc_register(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *self; PyObject *subclass; - if (!_PyArg_UnpackStack(args, nargs, "_abc_register", - 2, 2, - &self, &subclass)) { + if (!_PyArg_CheckPositional("_abc_register", nargs, 2, 2)) { goto exit; } + self = args[0]; + subclass = args[1]; return_value = _abc__abc_register_impl(module, self, subclass); exit: @@ -96,11 +96,11 @@ _abc__abc_instancecheck(PyObject *module, PyObject *const *args, Py_ssize_t narg PyObject *self; PyObject *instance; - if (!_PyArg_UnpackStack(args, nargs, "_abc_instancecheck", - 2, 2, - &self, &instance)) { + if (!_PyArg_CheckPositional("_abc_instancecheck", nargs, 2, 2)) { goto exit; } + self = args[0]; + instance = args[1]; return_value = _abc__abc_instancecheck_impl(module, self, instance); exit: @@ -127,11 +127,11 @@ _abc__abc_subclasscheck(PyObject *module, PyObject *const *args, Py_ssize_t narg PyObject *self; PyObject *subclass; - if (!_PyArg_UnpackStack(args, nargs, "_abc_subclasscheck", - 2, 2, - &self, &subclass)) { + if (!_PyArg_CheckPositional("_abc_subclasscheck", nargs, 2, 2)) { goto exit; } + self = args[0]; + subclass = args[1]; return_value = _abc__abc_subclasscheck_impl(module, self, subclass); exit: @@ -159,4 +159,4 @@ _abc_get_cache_token(PyObject *module, PyObject *Py_UNUSED(ignored)) { return _abc_get_cache_token_impl(module); } -/*[clinic end generated code: output=606db3cb658d9240 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=2544b4b5ae50a089 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_cursesmodule.c.h b/Modules/clinic/_cursesmodule.c.h index 0988581363e0..659fd4ec4ec8 100644 --- a/Modules/clinic/_cursesmodule.c.h +++ b/Modules/clinic/_cursesmodule.c.h @@ -468,11 +468,42 @@ _curses_window_border(PyCursesWindowObject *self, PyObject *const *args, Py_ssiz PyObject *bl = NULL; PyObject *br = NULL; - if (!_PyArg_UnpackStack(args, nargs, "border", - 0, 8, - &ls, &rs, &ts, &bs, &tl, &tr, &bl, &br)) { + if (!_PyArg_CheckPositional("border", nargs, 0, 8)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + ls = args[0]; + if (nargs < 2) { + goto skip_optional; + } + rs = args[1]; + if (nargs < 3) { + goto skip_optional; + } + ts = args[2]; + if (nargs < 4) { + goto skip_optional; + } + bs = args[3]; + if (nargs < 5) { + goto skip_optional; + } + tl = args[4]; + if (nargs < 6) { + goto skip_optional; + } + tr = args[5]; + if (nargs < 7) { + goto skip_optional; + } + bl = args[6]; + if (nargs < 8) { + goto skip_optional; + } + br = args[7]; +skip_optional: return_value = _curses_window_border_impl(self, ls, rs, ts, bs, tl, tr, bl, br); exit: @@ -4500,4 +4531,4 @@ _curses_use_default_colors(PyObject 
*module, PyObject *Py_UNUSED(ignored)) #ifndef _CURSES_USE_DEFAULT_COLORS_METHODDEF #define _CURSES_USE_DEFAULT_COLORS_METHODDEF #endif /* !defined(_CURSES_USE_DEFAULT_COLORS_METHODDEF) */ -/*[clinic end generated code: output=ceb2e32ee1370033 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=5305982cb312a911 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_elementtree.c.h b/Modules/clinic/_elementtree.c.h index 896898396403..1293acddc1ce 100644 --- a/Modules/clinic/_elementtree.c.h +++ b/Modules/clinic/_elementtree.c.h @@ -504,11 +504,11 @@ _elementtree_Element_makeelement(ElementObject *self, PyObject *const *args, Py_ PyObject *tag; PyObject *attrib; - if (!_PyArg_UnpackStack(args, nargs, "makeelement", - 2, 2, - &tag, &attrib)) { + if (!_PyArg_CheckPositional("makeelement", nargs, 2, 2)) { goto exit; } + tag = args[0]; + attrib = args[1]; return_value = _elementtree_Element_makeelement_impl(self, tag, attrib); exit: @@ -562,11 +562,11 @@ _elementtree_Element_set(ElementObject *self, PyObject *const *args, Py_ssize_t PyObject *key; PyObject *value; - if (!_PyArg_UnpackStack(args, nargs, "set", - 2, 2, - &key, &value)) { + if (!_PyArg_CheckPositional("set", nargs, 2, 2)) { goto exit; } + key = args[0]; + value = args[1]; return_value = _elementtree_Element_set_impl(self, key, value); exit: @@ -647,11 +647,15 @@ _elementtree_TreeBuilder_start(TreeBuilderObject *self, PyObject *const *args, P PyObject *tag; PyObject *attrs = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "start", - 1, 2, - &tag, &attrs)) { + if (!_PyArg_CheckPositional("start", nargs, 1, 2)) { goto exit; } + tag = args[0]; + if (nargs < 2) { + goto skip_optional; + } + attrs = args[1]; +skip_optional: return_value = _elementtree_TreeBuilder_start_impl(self, tag, attrs); exit: @@ -734,14 +738,18 @@ _elementtree_XMLParser__setevents(XMLParserObject *self, PyObject *const *args, PyObject *events_queue; PyObject *events_to_report = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "_setevents", - 1, 2, - &events_queue, &events_to_report)) { + if (!_PyArg_CheckPositional("_setevents", nargs, 1, 2)) { goto exit; } + events_queue = args[0]; + if (nargs < 2) { + goto skip_optional; + } + events_to_report = args[1]; +skip_optional: return_value = _elementtree_XMLParser__setevents_impl(self, events_queue, events_to_report); exit: return return_value; } -/*[clinic end generated code: output=6bbedd24b709dc00 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=0c15c41e03a7829f input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_gdbmmodule.c.h b/Modules/clinic/_gdbmmodule.c.h index 9475264ef9d9..15f47dc70dc6 100644 --- a/Modules/clinic/_gdbmmodule.c.h +++ b/Modules/clinic/_gdbmmodule.c.h @@ -21,11 +21,15 @@ _gdbm_gdbm_get(dbmobject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *key; PyObject *default_value = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "get", - 1, 2, - &key, &default_value)) { + if (!_PyArg_CheckPositional("get", nargs, 1, 2)) { goto exit; } + key = args[0]; + if (nargs < 2) { + goto skip_optional; + } + default_value = args[1]; +skip_optional: return_value = _gdbm_gdbm_get_impl(self, key, default_value); exit: @@ -52,11 +56,15 @@ _gdbm_gdbm_setdefault(dbmobject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *key; PyObject *default_value = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "setdefault", - 1, 2, - &key, &default_value)) { + if (!_PyArg_CheckPositional("setdefault", nargs, 1, 2)) { goto exit; } + key = args[0]; + if (nargs < 2) { + goto skip_optional; 
+ } + default_value = args[1]; +skip_optional: return_value = _gdbm_gdbm_setdefault_impl(self, key, default_value); exit: @@ -290,4 +298,4 @@ dbmopen(PyObject *module, PyObject *const *args, Py_ssize_t nargs) exit: return return_value; } -/*[clinic end generated code: output=05f06065d2dc1f9e input=a9049054013a1b77]*/ +/*[clinic end generated code: output=0a72598e5a3acd60 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_heapqmodule.c.h b/Modules/clinic/_heapqmodule.c.h index 0b5d0d7bbaba..55403706ba05 100644 --- a/Modules/clinic/_heapqmodule.c.h +++ b/Modules/clinic/_heapqmodule.c.h @@ -21,11 +21,11 @@ _heapq_heappush(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *heap; PyObject *item; - if (!_PyArg_UnpackStack(args, nargs, "heappush", - 2, 2, - &heap, &item)) { + if (!_PyArg_CheckPositional("heappush", nargs, 2, 2)) { goto exit; } + heap = args[0]; + item = args[1]; return_value = _heapq_heappush_impl(module, heap, item); exit: @@ -68,11 +68,11 @@ _heapq_heapreplace(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *heap; PyObject *item; - if (!_PyArg_UnpackStack(args, nargs, "heapreplace", - 2, 2, - &heap, &item)) { + if (!_PyArg_CheckPositional("heapreplace", nargs, 2, 2)) { goto exit; } + heap = args[0]; + item = args[1]; return_value = _heapq_heapreplace_impl(module, heap, item); exit: @@ -101,11 +101,11 @@ _heapq_heappushpop(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *heap; PyObject *item; - if (!_PyArg_UnpackStack(args, nargs, "heappushpop", - 2, 2, - &heap, &item)) { + if (!_PyArg_CheckPositional("heappushpop", nargs, 2, 2)) { goto exit; } + heap = args[0]; + item = args[1]; return_value = _heapq_heappushpop_impl(module, heap, item); exit: @@ -150,11 +150,11 @@ _heapq__heapreplace_max(PyObject *module, PyObject *const *args, Py_ssize_t narg PyObject *heap; PyObject *item; - if (!_PyArg_UnpackStack(args, nargs, "_heapreplace_max", - 2, 2, - &heap, &item)) { + if (!_PyArg_CheckPositional("_heapreplace_max", nargs, 2, 2)) { goto exit; } + heap = args[0]; + item = args[1]; return_value = _heapq__heapreplace_max_impl(module, heap, item); exit: @@ -169,4 +169,4 @@ PyDoc_STRVAR(_heapq__heapify_max__doc__, #define _HEAPQ__HEAPIFY_MAX_METHODDEF \ {"_heapify_max", (PyCFunction)_heapq__heapify_max, METH_O, _heapq__heapify_max__doc__}, -/*[clinic end generated code: output=b73e874eeb9977b6 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=37ef2a3319971c8d input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_operator.c.h b/Modules/clinic/_operator.c.h index 72c0e57ea2bb..f9e353d86b49 100644 --- a/Modules/clinic/_operator.c.h +++ b/Modules/clinic/_operator.c.h @@ -49,11 +49,11 @@ _operator_add(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "add", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("add", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_add_impl(module, a, b); exit: @@ -79,11 +79,11 @@ _operator_sub(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "sub", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("sub", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_sub_impl(module, a, b); exit: @@ -109,11 +109,11 @@ _operator_mul(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "mul", - 2, 2, - &a, &b)) { + if 
(!_PyArg_CheckPositional("mul", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_mul_impl(module, a, b); exit: @@ -139,11 +139,11 @@ _operator_matmul(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "matmul", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("matmul", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_matmul_impl(module, a, b); exit: @@ -169,11 +169,11 @@ _operator_floordiv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "floordiv", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("floordiv", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_floordiv_impl(module, a, b); exit: @@ -199,11 +199,11 @@ _operator_truediv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "truediv", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("truediv", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_truediv_impl(module, a, b); exit: @@ -229,11 +229,11 @@ _operator_mod(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "mod", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("mod", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_mod_impl(module, a, b); exit: @@ -304,11 +304,11 @@ _operator_lshift(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "lshift", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("lshift", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_lshift_impl(module, a, b); exit: @@ -334,11 +334,11 @@ _operator_rshift(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "rshift", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("rshift", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_rshift_impl(module, a, b); exit: @@ -392,11 +392,11 @@ _operator_and_(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "and_", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("and_", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_and__impl(module, a, b); exit: @@ -422,11 +422,11 @@ _operator_xor(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "xor", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("xor", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_xor_impl(module, a, b); exit: @@ -452,11 +452,11 @@ _operator_or_(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "or_", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("or_", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_or__impl(module, a, b); exit: @@ -482,11 +482,11 @@ _operator_iadd(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "iadd", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("iadd", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = 
args[1]; return_value = _operator_iadd_impl(module, a, b); exit: @@ -512,11 +512,11 @@ _operator_isub(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "isub", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("isub", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_isub_impl(module, a, b); exit: @@ -542,11 +542,11 @@ _operator_imul(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "imul", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("imul", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_imul_impl(module, a, b); exit: @@ -572,11 +572,11 @@ _operator_imatmul(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "imatmul", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("imatmul", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_imatmul_impl(module, a, b); exit: @@ -602,11 +602,11 @@ _operator_ifloordiv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "ifloordiv", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("ifloordiv", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_ifloordiv_impl(module, a, b); exit: @@ -632,11 +632,11 @@ _operator_itruediv(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "itruediv", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("itruediv", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_itruediv_impl(module, a, b); exit: @@ -662,11 +662,11 @@ _operator_imod(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "imod", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("imod", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_imod_impl(module, a, b); exit: @@ -692,11 +692,11 @@ _operator_ilshift(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "ilshift", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("ilshift", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_ilshift_impl(module, a, b); exit: @@ -722,11 +722,11 @@ _operator_irshift(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "irshift", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("irshift", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_irshift_impl(module, a, b); exit: @@ -752,11 +752,11 @@ _operator_iand(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "iand", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("iand", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_iand_impl(module, a, b); exit: @@ -782,11 +782,11 @@ _operator_ixor(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "ixor", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("ixor", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_ixor_impl(module, 
a, b); exit: @@ -812,11 +812,11 @@ _operator_ior(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "ior", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("ior", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_ior_impl(module, a, b); exit: @@ -842,11 +842,11 @@ _operator_concat(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "concat", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("concat", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_concat_impl(module, a, b); exit: @@ -872,11 +872,11 @@ _operator_iconcat(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "iconcat", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("iconcat", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_iconcat_impl(module, a, b); exit: @@ -903,11 +903,11 @@ _operator_contains(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *b; int _return_value; - if (!_PyArg_UnpackStack(args, nargs, "contains", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("contains", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; _return_value = _operator_contains_impl(module, a, b); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -938,11 +938,11 @@ _operator_indexOf(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *b; Py_ssize_t _return_value; - if (!_PyArg_UnpackStack(args, nargs, "indexOf", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("indexOf", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; _return_value = _operator_indexOf_impl(module, a, b); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -973,11 +973,11 @@ _operator_countOf(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *b; Py_ssize_t _return_value; - if (!_PyArg_UnpackStack(args, nargs, "countOf", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("countOf", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; _return_value = _operator_countOf_impl(module, a, b); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -1007,11 +1007,11 @@ _operator_getitem(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "getitem", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("getitem", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_getitem_impl(module, a, b); exit: @@ -1039,11 +1039,12 @@ _operator_setitem(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *b; PyObject *c; - if (!_PyArg_UnpackStack(args, nargs, "setitem", - 3, 3, - &a, &b, &c)) { + if (!_PyArg_CheckPositional("setitem", nargs, 3, 3)) { goto exit; } + a = args[0]; + b = args[1]; + c = args[2]; return_value = _operator_setitem_impl(module, a, b, c); exit: @@ -1069,11 +1070,11 @@ _operator_delitem(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "delitem", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("delitem", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_delitem_impl(module, a, b); exit: @@ -1099,11 +1100,11 @@ _operator_eq(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if 
(!_PyArg_UnpackStack(args, nargs, "eq", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("eq", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_eq_impl(module, a, b); exit: @@ -1129,11 +1130,11 @@ _operator_ne(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "ne", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("ne", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_ne_impl(module, a, b); exit: @@ -1159,11 +1160,11 @@ _operator_lt(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "lt", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("lt", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_lt_impl(module, a, b); exit: @@ -1189,11 +1190,11 @@ _operator_le(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "le", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("le", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_le_impl(module, a, b); exit: @@ -1219,11 +1220,11 @@ _operator_gt(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "gt", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("gt", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_gt_impl(module, a, b); exit: @@ -1249,11 +1250,11 @@ _operator_ge(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "ge", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("ge", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_ge_impl(module, a, b); exit: @@ -1279,11 +1280,11 @@ _operator_pow(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "pow", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("pow", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_pow_impl(module, a, b); exit: @@ -1309,11 +1310,11 @@ _operator_ipow(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "ipow", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("ipow", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_ipow_impl(module, a, b); exit: @@ -1348,11 +1349,11 @@ _operator_is_(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "is_", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("is_", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_is__impl(module, a, b); exit: @@ -1378,11 +1379,11 @@ _operator_is_not(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "is_not", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("is_not", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = _operator_is_not_impl(module, a, b); exit: @@ -1480,14 +1481,14 @@ _operator__compare_digest(PyObject *module, PyObject *const *args, Py_ssize_t na PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "_compare_digest", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("_compare_digest", nargs, 2, 2)) { goto 
exit; } + a = args[0]; + b = args[1]; return_value = _operator__compare_digest_impl(module, a, b); exit: return return_value; } -/*[clinic end generated code: output=b382bece80a5a254 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=e7ed71a8c475a901 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_pickle.c.h b/Modules/clinic/_pickle.c.h index 3f6be4bd8049..2759b2f21e01 100644 --- a/Modules/clinic/_pickle.c.h +++ b/Modules/clinic/_pickle.c.h @@ -213,11 +213,11 @@ _pickle_Unpickler_find_class(UnpicklerObject *self, PyObject *const *args, Py_ss PyObject *module_name; PyObject *global_name; - if (!_PyArg_UnpackStack(args, nargs, "find_class", - 2, 2, - &module_name, &global_name)) { + if (!_PyArg_CheckPositional("find_class", nargs, 2, 2)) { goto exit; } + module_name = args[0]; + global_name = args[1]; return_value = _pickle_Unpickler_find_class_impl(self, module_name, global_name); exit: @@ -562,4 +562,4 @@ _pickle_loads(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObjec exit: return return_value; } -/*[clinic end generated code: output=4b32d63ff58b64d8 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=225f06abcf27ed2b input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_sre.c.h b/Modules/clinic/_sre.c.h index e8a3665c3fb1..e5bb32fbea69 100644 --- a/Modules/clinic/_sre.c.h +++ b/Modules/clinic/_sre.c.h @@ -653,11 +653,14 @@ _sre_SRE_Match_start(MatchObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *group = NULL; Py_ssize_t _return_value; - if (!_PyArg_UnpackStack(args, nargs, "start", - 0, 1, - &group)) { + if (!_PyArg_CheckPositional("start", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + group = args[0]; +skip_optional: _return_value = _sre_SRE_Match_start_impl(self, group); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -687,11 +690,14 @@ _sre_SRE_Match_end(MatchObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *group = NULL; Py_ssize_t _return_value; - if (!_PyArg_UnpackStack(args, nargs, "end", - 0, 1, - &group)) { + if (!_PyArg_CheckPositional("end", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + group = args[0]; +skip_optional: _return_value = _sre_SRE_Match_end_impl(self, group); if ((_return_value == -1) && PyErr_Occurred()) { goto exit; @@ -720,11 +726,14 @@ _sre_SRE_Match_span(MatchObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *group = NULL; - if (!_PyArg_UnpackStack(args, nargs, "span", - 0, 1, - &group)) { + if (!_PyArg_CheckPositional("span", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + group = args[0]; +skip_optional: return_value = _sre_SRE_Match_span_impl(self, group); exit: @@ -789,4 +798,4 @@ _sre_SRE_Scanner_search(ScannerObject *self, PyObject *Py_UNUSED(ignored)) { return _sre_SRE_Scanner_search_impl(self); } -/*[clinic end generated code: output=7992634045212b26 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=8d19359d6a4a3a7e input=a9049054013a1b77]*/ diff --git a/Modules/clinic/itertoolsmodule.c.h b/Modules/clinic/itertoolsmodule.c.h index 9fc429aa3c36..bcf4c5f97987 100644 --- a/Modules/clinic/itertoolsmodule.c.h +++ b/Modules/clinic/itertoolsmodule.c.h @@ -124,11 +124,10 @@ itertools__tee(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("_tee", kwargs)) { goto exit; } - if (!PyArg_UnpackTuple(args, "_tee", - 1, 1, - &iterable)) { + if (!_PyArg_CheckPositional("_tee", PyTuple_GET_SIZE(args), 1, 1)) 
{ goto exit; } + iterable = PyTuple_GET_ITEM(args, 0); return_value = itertools__tee_impl(type, iterable); exit: @@ -204,11 +203,10 @@ itertools_cycle(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("cycle", kwargs)) { goto exit; } - if (!PyArg_UnpackTuple(args, "cycle", - 1, 1, - &iterable)) { + if (!_PyArg_CheckPositional("cycle", PyTuple_GET_SIZE(args), 1, 1)) { goto exit; } + iterable = PyTuple_GET_ITEM(args, 0); return_value = itertools_cycle_impl(type, iterable); exit: @@ -237,11 +235,11 @@ itertools_dropwhile(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("dropwhile", kwargs)) { goto exit; } - if (!PyArg_UnpackTuple(args, "dropwhile", - 2, 2, - &func, &seq)) { + if (!_PyArg_CheckPositional("dropwhile", PyTuple_GET_SIZE(args), 2, 2)) { goto exit; } + func = PyTuple_GET_ITEM(args, 0); + seq = PyTuple_GET_ITEM(args, 1); return_value = itertools_dropwhile_impl(type, func, seq); exit: @@ -268,11 +266,11 @@ itertools_takewhile(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("takewhile", kwargs)) { goto exit; } - if (!PyArg_UnpackTuple(args, "takewhile", - 2, 2, - &func, &seq)) { + if (!_PyArg_CheckPositional("takewhile", PyTuple_GET_SIZE(args), 2, 2)) { goto exit; } + func = PyTuple_GET_ITEM(args, 0); + seq = PyTuple_GET_ITEM(args, 1); return_value = itertools_takewhile_impl(type, func, seq); exit: @@ -299,11 +297,11 @@ itertools_starmap(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("starmap", kwargs)) { goto exit; } - if (!PyArg_UnpackTuple(args, "starmap", - 2, 2, - &func, &seq)) { + if (!_PyArg_CheckPositional("starmap", PyTuple_GET_SIZE(args), 2, 2)) { goto exit; } + func = PyTuple_GET_ITEM(args, 0); + seq = PyTuple_GET_ITEM(args, 1); return_value = itertools_starmap_impl(type, func, seq); exit: @@ -496,11 +494,11 @@ itertools_filterfalse(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("filterfalse", kwargs)) { goto exit; } - if (!PyArg_UnpackTuple(args, "filterfalse", - 2, 2, - &func, &seq)) { + if (!_PyArg_CheckPositional("filterfalse", PyTuple_GET_SIZE(args), 2, 2)) { goto exit; } + func = PyTuple_GET_ITEM(args, 0); + seq = PyTuple_GET_ITEM(args, 1); return_value = itertools_filterfalse_impl(type, func, seq); exit: @@ -542,4 +540,4 @@ itertools_count(PyTypeObject *type, PyObject *args, PyObject *kwargs) exit: return return_value; } -/*[clinic end generated code: output=f289354f54e04c13 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=3d0ca69707b60715 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/mathmodule.c.h b/Modules/clinic/mathmodule.c.h index 116578968759..1c8fc2fc2c3f 100644 --- a/Modules/clinic/mathmodule.c.h +++ b/Modules/clinic/mathmodule.c.h @@ -21,11 +21,11 @@ math_gcd(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *a; PyObject *b; - if (!_PyArg_UnpackStack(args, nargs, "gcd", - 2, 2, - &a, &b)) { + if (!_PyArg_CheckPositional("gcd", nargs, 2, 2)) { goto exit; } + a = args[0]; + b = args[1]; return_value = math_gcd_impl(module, a, b); exit: @@ -307,11 +307,11 @@ math_dist(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *p; PyObject *q; - if (!_PyArg_UnpackStack(args, nargs, "dist", - 2, 2, - &p, &q)) { + if (!_PyArg_CheckPositional("dist", nargs, 2, 2)) { goto exit; } + p = args[0]; + q = args[1]; return_value = math_dist_impl(module, p, q); exit: @@ -548,4 +548,4 @@ math_isclose(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject exit: return return_value; } 
-/*[clinic end generated code: output=2fe4fecd85585313 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=f3264ab0ef57ba0a input=a9049054013a1b77]*/ diff --git a/Modules/clinic/selectmodule.c.h b/Modules/clinic/selectmodule.c.h index 655e24c3533f..c2ef26f240a7 100644 --- a/Modules/clinic/selectmodule.c.h +++ b/Modules/clinic/selectmodule.c.h @@ -45,11 +45,17 @@ select_select(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *xlist; PyObject *timeout_obj = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "select", - 3, 4, - &rlist, &wlist, &xlist, &timeout_obj)) { + if (!_PyArg_CheckPositional("select", nargs, 3, 4)) { goto exit; } + rlist = args[0]; + wlist = args[1]; + xlist = args[2]; + if (nargs < 4) { + goto skip_optional; + } + timeout_obj = args[3]; +skip_optional: return_value = select_select_impl(module, rlist, wlist, xlist, timeout_obj); exit: @@ -201,11 +207,14 @@ select_poll_poll(pollObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *timeout_obj = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "poll", - 0, 1, - &timeout_obj)) { + if (!_PyArg_CheckPositional("poll", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + timeout_obj = args[0]; +skip_optional: return_value = select_poll_poll_impl(self, timeout_obj); exit: @@ -366,11 +375,14 @@ select_devpoll_poll(devpollObject *self, PyObject *const *args, Py_ssize_t nargs PyObject *return_value = NULL; PyObject *timeout_obj = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "poll", - 0, 1, - &timeout_obj)) { + if (!_PyArg_CheckPositional("poll", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + timeout_obj = args[0]; +skip_optional: return_value = select_devpoll_poll_impl(self, timeout_obj); exit: @@ -808,11 +820,22 @@ select_epoll___exit__(pyEpoll_Object *self, PyObject *const *args, Py_ssize_t na PyObject *exc_value = Py_None; PyObject *exc_tb = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "__exit__", - 0, 3, - &exc_type, &exc_value, &exc_tb)) { + if (!_PyArg_CheckPositional("__exit__", nargs, 0, 3)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + exc_type = args[0]; + if (nargs < 2) { + goto skip_optional; + } + exc_value = args[1]; + if (nargs < 3) { + goto skip_optional; + } + exc_tb = args[2]; +skip_optional: return_value = select_epoll___exit___impl(self, exc_type, exc_value, exc_tb); exit: @@ -1105,4 +1128,4 @@ select_kqueue_control(kqueue_queue_Object *self, PyObject *const *args, Py_ssize #ifndef SELECT_KQUEUE_CONTROL_METHODDEF #define SELECT_KQUEUE_CONTROL_METHODDEF #endif /* !defined(SELECT_KQUEUE_CONTROL_METHODDEF) */ -/*[clinic end generated code: output=20da8f9c050e1b65 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=3e40b33a3294d03d input=a9049054013a1b77]*/ diff --git a/Objects/clinic/bytearrayobject.c.h b/Objects/clinic/bytearrayobject.c.h index 2d7c74200775..a4669f5076f3 100644 --- a/Objects/clinic/bytearrayobject.c.h +++ b/Objects/clinic/bytearrayobject.c.h @@ -545,11 +545,14 @@ bytearray_strip(PyByteArrayObject *self, PyObject *const *args, Py_ssize_t nargs PyObject *return_value = NULL; PyObject *bytes = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "strip", - 0, 1, - &bytes)) { + if (!_PyArg_CheckPositional("strip", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + bytes = args[0]; +skip_optional: return_value = bytearray_strip_impl(self, bytes); exit: @@ -576,11 +579,14 @@ bytearray_lstrip(PyByteArrayObject *self, PyObject 
*const *args, Py_ssize_t narg PyObject *return_value = NULL; PyObject *bytes = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "lstrip", - 0, 1, - &bytes)) { + if (!_PyArg_CheckPositional("lstrip", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + bytes = args[0]; +skip_optional: return_value = bytearray_lstrip_impl(self, bytes); exit: @@ -607,11 +613,14 @@ bytearray_rstrip(PyByteArrayObject *self, PyObject *const *args, Py_ssize_t narg PyObject *return_value = NULL; PyObject *bytes = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "rstrip", - 0, 1, - &bytes)) { + if (!_PyArg_CheckPositional("rstrip", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + bytes = args[0]; +skip_optional: return_value = bytearray_rstrip_impl(self, bytes); exit: @@ -815,4 +824,4 @@ bytearray_sizeof(PyByteArrayObject *self, PyObject *Py_UNUSED(ignored)) { return bytearray_sizeof_impl(self); } -/*[clinic end generated code: output=010e281b823d7df1 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=f4353c27dcb4a13d input=a9049054013a1b77]*/ diff --git a/Objects/clinic/bytesobject.c.h b/Objects/clinic/bytesobject.c.h index 4e754d91af39..d75bbf1e5432 100644 --- a/Objects/clinic/bytesobject.c.h +++ b/Objects/clinic/bytesobject.c.h @@ -203,11 +203,14 @@ bytes_strip(PyBytesObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *bytes = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "strip", - 0, 1, - &bytes)) { + if (!_PyArg_CheckPositional("strip", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + bytes = args[0]; +skip_optional: return_value = bytes_strip_impl(self, bytes); exit: @@ -234,11 +237,14 @@ bytes_lstrip(PyBytesObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *bytes = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "lstrip", - 0, 1, - &bytes)) { + if (!_PyArg_CheckPositional("lstrip", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + bytes = args[0]; +skip_optional: return_value = bytes_lstrip_impl(self, bytes); exit: @@ -265,11 +271,14 @@ bytes_rstrip(PyBytesObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *bytes = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "rstrip", - 0, 1, - &bytes)) { + if (!_PyArg_CheckPositional("rstrip", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + bytes = args[0]; +skip_optional: return_value = bytes_rstrip_impl(self, bytes); exit: @@ -559,4 +568,4 @@ bytes_fromhex(PyTypeObject *type, PyObject *arg) exit: return return_value; } -/*[clinic end generated code: output=810c8dfc72520ca4 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=c6621bda84e63e51 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/dictobject.c.h b/Objects/clinic/dictobject.c.h index 5db3a426f459..713781ce8806 100644 --- a/Objects/clinic/dictobject.c.h +++ b/Objects/clinic/dictobject.c.h @@ -21,11 +21,15 @@ dict_fromkeys(PyTypeObject *type, PyObject *const *args, Py_ssize_t nargs) PyObject *iterable; PyObject *value = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "fromkeys", - 1, 2, - &iterable, &value)) { + if (!_PyArg_CheckPositional("fromkeys", nargs, 1, 2)) { goto exit; } + iterable = args[0]; + if (nargs < 2) { + goto skip_optional; + } + value = args[1]; +skip_optional: return_value = dict_fromkeys_impl(type, iterable, value); exit: @@ -60,11 +64,15 @@ dict_get(PyDictObject *self, PyObject *const *args, Py_ssize_t 
nargs) PyObject *key; PyObject *default_value = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "get", - 1, 2, - &key, &default_value)) { + if (!_PyArg_CheckPositional("get", nargs, 1, 2)) { goto exit; } + key = args[0]; + if (nargs < 2) { + goto skip_optional; + } + default_value = args[1]; +skip_optional: return_value = dict_get_impl(self, key, default_value); exit: @@ -93,11 +101,15 @@ dict_setdefault(PyDictObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *key; PyObject *default_value = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "setdefault", - 1, 2, - &key, &default_value)) { + if (!_PyArg_CheckPositional("setdefault", nargs, 1, 2)) { goto exit; } + key = args[0]; + if (nargs < 2) { + goto skip_optional; + } + default_value = args[1]; +skip_optional: return_value = dict_setdefault_impl(self, key, default_value); exit: @@ -121,4 +133,4 @@ dict___reversed__(PyDictObject *self, PyObject *Py_UNUSED(ignored)) { return dict___reversed___impl(self); } -/*[clinic end generated code: output=193e08cb8099fe22 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=12c21ce3552d9617 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/enumobject.c.h b/Objects/clinic/enumobject.c.h index 0f05cf84cb2f..27da2942195e 100644 --- a/Objects/clinic/enumobject.c.h +++ b/Objects/clinic/enumobject.c.h @@ -58,14 +58,13 @@ reversed_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("reversed", kwargs)) { goto exit; } - if (!PyArg_UnpackTuple(args, "reversed", - 1, 1, - &seq)) { + if (!_PyArg_CheckPositional("reversed", PyTuple_GET_SIZE(args), 1, 1)) { goto exit; } + seq = PyTuple_GET_ITEM(args, 0); return_value = reversed_new_impl(type, seq); exit: return return_value; } -/*[clinic end generated code: output=9008c36999c57218 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=831cec3db0e987c9 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/floatobject.c.h b/Objects/clinic/floatobject.c.h index 741ca3bac5b5..4251d63c247b 100644 --- a/Objects/clinic/floatobject.c.h +++ b/Objects/clinic/floatobject.c.h @@ -58,11 +58,14 @@ float___round__(PyObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *o_ndigits = NULL; - if (!_PyArg_UnpackStack(args, nargs, "__round__", - 0, 1, - &o_ndigits)) { + if (!_PyArg_CheckPositional("__round__", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + o_ndigits = args[0]; +skip_optional: return_value = float___round___impl(self, o_ndigits); exit: @@ -173,11 +176,14 @@ float_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("float", kwargs)) { goto exit; } - if (!PyArg_UnpackTuple(args, "float", - 0, 1, - &x)) { + if (!_PyArg_CheckPositional("float", PyTuple_GET_SIZE(args), 0, 1)) { goto exit; } + if (PyTuple_GET_SIZE(args) < 1) { + goto skip_optional; + } + x = PyTuple_GET_ITEM(args, 0); +skip_optional: return_value = float_new_impl(type, x); exit: @@ -345,4 +351,4 @@ float___format__(PyObject *self, PyObject *arg) exit: return return_value; } -/*[clinic end generated code: output=2631a60701a8f7d4 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=c183029d87dd41fa input=a9049054013a1b77]*/ diff --git a/Objects/clinic/listobject.c.h b/Objects/clinic/listobject.c.h index 36174586141b..447cdefbe10f 100644 --- a/Objects/clinic/listobject.c.h +++ b/Objects/clinic/listobject.c.h @@ -289,11 +289,14 @@ list___init__(PyObject *self, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("list", kwargs)) { goto exit; } - 
if (!PyArg_UnpackTuple(args, "list", - 0, 1, - &iterable)) { + if (!_PyArg_CheckPositional("list", PyTuple_GET_SIZE(args), 0, 1)) { goto exit; } + if (PyTuple_GET_SIZE(args) < 1) { + goto skip_optional; + } + iterable = PyTuple_GET_ITEM(args, 0); +skip_optional: return_value = list___init___impl((PyListObject *)self, iterable); exit: @@ -335,4 +338,4 @@ list___reversed__(PyListObject *self, PyObject *Py_UNUSED(ignored)) { return list___reversed___impl(self); } -/*[clinic end generated code: output=1f641f5aef3f886f input=a9049054013a1b77]*/ +/*[clinic end generated code: output=4a835f9880a72273 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/tupleobject.c.h b/Objects/clinic/tupleobject.c.h index 0096f0cd0f69..fe2fae42eeaf 100644 --- a/Objects/clinic/tupleobject.c.h +++ b/Objects/clinic/tupleobject.c.h @@ -81,11 +81,14 @@ tuple_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) !_PyArg_NoKeywords("tuple", kwargs)) { goto exit; } - if (!PyArg_UnpackTuple(args, "tuple", - 0, 1, - &iterable)) { + if (!_PyArg_CheckPositional("tuple", PyTuple_GET_SIZE(args), 0, 1)) { goto exit; } + if (PyTuple_GET_SIZE(args) < 1) { + goto skip_optional; + } + iterable = PyTuple_GET_ITEM(args, 0); +skip_optional: return_value = tuple_new_impl(type, iterable); exit: @@ -108,4 +111,4 @@ tuple___getnewargs__(PyTupleObject *self, PyObject *Py_UNUSED(ignored)) { return tuple___getnewargs___impl(self); } -/*[clinic end generated code: output=5312868473a41cfe input=a9049054013a1b77]*/ +/*[clinic end generated code: output=56fab9b7368aba49 input=a9049054013a1b77]*/ diff --git a/Objects/clinic/unicodeobject.c.h b/Objects/clinic/unicodeobject.c.h index 21e54f1011cf..3e40a62ab0e4 100644 --- a/Objects/clinic/unicodeobject.c.h +++ b/Objects/clinic/unicodeobject.c.h @@ -546,11 +546,14 @@ unicode_strip(PyObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *chars = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "strip", - 0, 1, - &chars)) { + if (!_PyArg_CheckPositional("strip", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + chars = args[0]; +skip_optional: return_value = unicode_strip_impl(self, chars); exit: @@ -577,11 +580,14 @@ unicode_lstrip(PyObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *chars = NULL; - if (!_PyArg_UnpackStack(args, nargs, "lstrip", - 0, 1, - &chars)) { + if (!_PyArg_CheckPositional("lstrip", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + chars = args[0]; +skip_optional: return_value = unicode_lstrip_impl(self, chars); exit: @@ -608,11 +614,14 @@ unicode_rstrip(PyObject *self, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *chars = NULL; - if (!_PyArg_UnpackStack(args, nargs, "rstrip", - 0, 1, - &chars)) { + if (!_PyArg_CheckPositional("rstrip", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + chars = args[0]; +skip_optional: return_value = unicode_rstrip_impl(self, chars); exit: @@ -1098,4 +1107,4 @@ unicode_sizeof(PyObject *self, PyObject *Py_UNUSED(ignored)) { return unicode_sizeof_impl(self); } -/*[clinic end generated code: output=73ad9670e00a2490 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=087ff163a10505ae input=a9049054013a1b77]*/ diff --git a/Python/clinic/bltinmodule.c.h b/Python/clinic/bltinmodule.c.h index 68d8dccea651..1b82f773edd8 100644 --- a/Python/clinic/bltinmodule.c.h +++ b/Python/clinic/bltinmodule.c.h @@ -217,11 +217,11 @@ builtin_divmod(PyObject 
*module, PyObject *const *args, Py_ssize_t nargs) PyObject *x; PyObject *y; - if (!_PyArg_UnpackStack(args, nargs, "divmod", - 2, 2, - &x, &y)) { + if (!_PyArg_CheckPositional("divmod", nargs, 2, 2)) { goto exit; } + x = args[0]; + y = args[1]; return_value = builtin_divmod_impl(module, x, y); exit: @@ -255,11 +255,19 @@ builtin_eval(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *globals = Py_None; PyObject *locals = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "eval", - 1, 3, - &source, &globals, &locals)) { + if (!_PyArg_CheckPositional("eval", nargs, 1, 3)) { goto exit; } + source = args[0]; + if (nargs < 2) { + goto skip_optional; + } + globals = args[1]; + if (nargs < 3) { + goto skip_optional; + } + locals = args[2]; +skip_optional: return_value = builtin_eval_impl(module, source, globals, locals); exit: @@ -293,11 +301,19 @@ builtin_exec(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *globals = Py_None; PyObject *locals = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "exec", - 1, 3, - &source, &globals, &locals)) { + if (!_PyArg_CheckPositional("exec", nargs, 1, 3)) { goto exit; } + source = args[0]; + if (nargs < 2) { + goto skip_optional; + } + globals = args[1]; + if (nargs < 3) { + goto skip_optional; + } + locals = args[2]; +skip_optional: return_value = builtin_exec_impl(module, source, globals, locals); exit: @@ -346,11 +362,11 @@ builtin_hasattr(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *obj; PyObject *name; - if (!_PyArg_UnpackStack(args, nargs, "hasattr", - 2, 2, - &obj, &name)) { + if (!_PyArg_CheckPositional("hasattr", nargs, 2, 2)) { goto exit; } + obj = args[0]; + name = args[1]; return_value = builtin_hasattr_impl(module, obj, name); exit: @@ -392,11 +408,12 @@ builtin_setattr(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *name; PyObject *value; - if (!_PyArg_UnpackStack(args, nargs, "setattr", - 3, 3, - &obj, &name, &value)) { + if (!_PyArg_CheckPositional("setattr", nargs, 3, 3)) { goto exit; } + obj = args[0]; + name = args[1]; + value = args[2]; return_value = builtin_setattr_impl(module, obj, name, value); exit: @@ -424,11 +441,11 @@ builtin_delattr(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *obj; PyObject *name; - if (!_PyArg_UnpackStack(args, nargs, "delattr", - 2, 2, - &obj, &name)) { + if (!_PyArg_CheckPositional("delattr", nargs, 2, 2)) { goto exit; } + obj = args[0]; + name = args[1]; return_value = builtin_delattr_impl(module, obj, name); exit: @@ -534,11 +551,16 @@ builtin_pow(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *y; PyObject *z = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "pow", - 2, 3, - &x, &y, &z)) { + if (!_PyArg_CheckPositional("pow", nargs, 2, 3)) { goto exit; } + x = args[0]; + y = args[1]; + if (nargs < 3) { + goto skip_optional; + } + z = args[2]; +skip_optional: return_value = builtin_pow_impl(module, x, y, z); exit: @@ -569,11 +591,14 @@ builtin_input(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *prompt = NULL; - if (!_PyArg_UnpackStack(args, nargs, "input", - 0, 1, - &prompt)) { + if (!_PyArg_CheckPositional("input", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + prompt = args[0]; +skip_optional: return_value = builtin_input_impl(module, prompt); exit: @@ -684,11 +709,11 @@ builtin_isinstance(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *obj; PyObject *class_or_tuple; - if 
(!_PyArg_UnpackStack(args, nargs, "isinstance", - 2, 2, - &obj, &class_or_tuple)) { + if (!_PyArg_CheckPositional("isinstance", nargs, 2, 2)) { goto exit; } + obj = args[0]; + class_or_tuple = args[1]; return_value = builtin_isinstance_impl(module, obj, class_or_tuple); exit: @@ -719,14 +744,14 @@ builtin_issubclass(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *cls; PyObject *class_or_tuple; - if (!_PyArg_UnpackStack(args, nargs, "issubclass", - 2, 2, - &cls, &class_or_tuple)) { + if (!_PyArg_CheckPositional("issubclass", nargs, 2, 2)) { goto exit; } + cls = args[0]; + class_or_tuple = args[1]; return_value = builtin_issubclass_impl(module, cls, class_or_tuple); exit: return return_value; } -/*[clinic end generated code: output=11b5cd918bd7eb18 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=54e5e33dcc2659e0 input=a9049054013a1b77]*/ diff --git a/Python/clinic/context.c.h b/Python/clinic/context.c.h index 32b1883ebe68..bbe19db1bd93 100644 --- a/Python/clinic/context.c.h +++ b/Python/clinic/context.c.h @@ -25,11 +25,15 @@ _contextvars_Context_get(PyContext *self, PyObject *const *args, Py_ssize_t narg PyObject *key; PyObject *default_value = Py_None; - if (!_PyArg_UnpackStack(args, nargs, "get", - 1, 2, - &key, &default_value)) { + if (!_PyArg_CheckPositional("get", nargs, 1, 2)) { goto exit; } + key = args[0]; + if (nargs < 2) { + goto skip_optional; + } + default_value = args[1]; +skip_optional: return_value = _contextvars_Context_get_impl(self, key, default_value); exit: @@ -134,11 +138,14 @@ _contextvars_ContextVar_get(PyContextVar *self, PyObject *const *args, Py_ssize_ PyObject *return_value = NULL; PyObject *default_value = NULL; - if (!_PyArg_UnpackStack(args, nargs, "get", - 0, 1, - &default_value)) { + if (!_PyArg_CheckPositional("get", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + default_value = args[0]; +skip_optional: return_value = _contextvars_ContextVar_get_impl(self, default_value); exit: @@ -170,4 +177,4 @@ PyDoc_STRVAR(_contextvars_ContextVar_reset__doc__, #define _CONTEXTVARS_CONTEXTVAR_RESET_METHODDEF \ {"reset", (PyCFunction)_contextvars_ContextVar_reset, METH_O, _contextvars_ContextVar_reset__doc__}, -/*[clinic end generated code: output=9c93e22bcadbaa2b input=a9049054013a1b77]*/ +/*[clinic end generated code: output=67c3a8f76b6cf4e7 input=a9049054013a1b77]*/ diff --git a/Python/clinic/import.c.h b/Python/clinic/import.c.h index 783ed4ebb96a..9ee20fbe5089 100644 --- a/Python/clinic/import.c.h +++ b/Python/clinic/import.c.h @@ -318,11 +318,15 @@ _imp_create_dynamic(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *spec; PyObject *file = NULL; - if (!_PyArg_UnpackStack(args, nargs, "create_dynamic", - 1, 2, - &spec, &file)) { + if (!_PyArg_CheckPositional("create_dynamic", nargs, 1, 2)) { goto exit; } + spec = args[0]; + if (nargs < 2) { + goto skip_optional; + } + file = args[1]; +skip_optional: return_value = _imp_create_dynamic_impl(module, spec, file); exit: @@ -433,4 +437,4 @@ _imp_source_hash(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyOb #ifndef _IMP_EXEC_DYNAMIC_METHODDEF #define _IMP_EXEC_DYNAMIC_METHODDEF #endif /* !defined(_IMP_EXEC_DYNAMIC_METHODDEF) */ -/*[clinic end generated code: output=22062cee6e8ba7f3 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=2409b8feeafe7c4b input=a9049054013a1b77]*/ diff --git a/Python/clinic/sysmodule.c.h b/Python/clinic/sysmodule.c.h index 7370ab59ac00..fc9794bf255a 100644 --- a/Python/clinic/sysmodule.c.h 
+++ b/Python/clinic/sysmodule.c.h @@ -32,11 +32,12 @@ sys_excepthook(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *value; PyObject *traceback; - if (!_PyArg_UnpackStack(args, nargs, "excepthook", - 3, 3, - &exctype, &value, &traceback)) { + if (!_PyArg_CheckPositional("excepthook", nargs, 3, 3)) { goto exit; } + exctype = args[0]; + value = args[1]; + traceback = args[2]; return_value = sys_excepthook_impl(module, exctype, value, traceback); exit: @@ -87,11 +88,14 @@ sys_exit(PyObject *module, PyObject *const *args, Py_ssize_t nargs) PyObject *return_value = NULL; PyObject *status = NULL; - if (!_PyArg_UnpackStack(args, nargs, "exit", - 0, 1, - &status)) { + if (!_PyArg_CheckPositional("exit", nargs, 0, 1)) { goto exit; } + if (nargs < 1) { + goto skip_optional; + } + status = args[0]; +skip_optional: return_value = sys_exit_impl(module, status); exit: @@ -1046,4 +1050,4 @@ sys_getandroidapilevel(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef SYS_GETANDROIDAPILEVEL_METHODDEF #define SYS_GETANDROIDAPILEVEL_METHODDEF #endif /* !defined(SYS_GETANDROIDAPILEVEL_METHODDEF) */ -/*[clinic end generated code: output=6a5202e5bfe5e6bd input=a9049054013a1b77]*/ +/*[clinic end generated code: output=109787af3401cd27 input=a9049054013a1b77]*/ diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py index 4087d3fec933..7f435f154692 100755 --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -643,18 +643,6 @@ def output_templates(self, f): f.return_converter.type == 'PyObject *') positional = parameters and parameters[-1].is_positional_only() - all_boring_objects = False # yes, this will be false if there are 0 parameters, it's fine - first_optional = len(parameters) - for i, p in enumerate(parameters): - c = p.converter - if type(c) != object_converter: - break - if c.format_unit != 'O': - break - if p.default is not unspecified: - first_optional = min(first_optional, i) - else: - all_boring_objects = True new_or_init = f.kind in (METHOD_NEW, METHOD_INIT) @@ -827,34 +815,6 @@ def insert_keywords(s): parser_definition = parser_body(parser_prototype, ' {option_group_parsing}') - elif positional and all_boring_objects: - # positional-only, but no option groups, - # and nothing but normal objects: - # PyArg_UnpackTuple! 
- - if not new_or_init: - flags = "METH_FASTCALL" - parser_prototype = parser_prototype_fastcall - - parser_definition = parser_body(parser_prototype, normalize_snippet(""" - if (!_PyArg_UnpackStack(args, nargs, "{name}", - {unpack_min}, {unpack_max}, - {parse_arguments})) {{ - goto exit; - }} - """, indent=4)) - else: - flags = "METH_VARARGS" - parser_prototype = parser_prototype_varargs - - parser_definition = parser_body(parser_prototype, normalize_snippet(""" - if (!PyArg_UnpackTuple(args, "{name}", - {unpack_min}, {unpack_max}, - {parse_arguments})) {{ - goto exit; - }} - """, indent=4)) - elif positional: if not new_or_init: # positional-only, but no option groups From webhook-mailer at python.org Fri Jan 11 13:17:14 2019 From: webhook-mailer at python.org (Eric Snow) Date: Fri, 11 Jan 2019 18:17:14 -0000 Subject: [Python-checkins] bpo-34569: Fix subinterpreter 32-bit ABI, pystate.c/_new_long_object() (gh-9127) Message-ID: https://github.com/python/cpython/commit/a909460a09cca79bd051c45b02e650862a57dbd9 commit: a909460a09cca79bd051c45b02e650862a57dbd9 branch: master author: Michael Felt committer: Eric Snow date: 2019-01-11T11:17:03-07:00 summary: bpo-34569: Fix subinterpreter 32-bit ABI, pystate.c/_new_long_object() (gh-9127) This fixes ShareableTypeTests.test_int() in Lib/test/test__xxsubinterpreters.py. files: A Misc/NEWS.d/next/Tests/2018-09-09-14-36-59.bpo-34569.okj1Xh.rst M Python/pystate.c diff --git a/Misc/NEWS.d/next/Tests/2018-09-09-14-36-59.bpo-34569.okj1Xh.rst b/Misc/NEWS.d/next/Tests/2018-09-09-14-36-59.bpo-34569.okj1Xh.rst new file mode 100644 index 000000000000..bd433adfc357 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2018-09-09-14-36-59.bpo-34569.okj1Xh.rst @@ -0,0 +1,2 @@ +The experimental PEP 554 data channels now correctly pass negative PyLong +objects between subinterpreters on 32-bit systems. Patch by Michael Felt. diff --git a/Python/pystate.c b/Python/pystate.c index 98882eb7589c..4dc3b81e4cdb 100644 --- a/Python/pystate.c +++ b/Python/pystate.c @@ -1467,7 +1467,7 @@ _str_shared(PyObject *obj, _PyCrossInterpreterData *data) static PyObject * _new_long_object(_PyCrossInterpreterData *data) { - return PyLong_FromLongLong((int64_t)(data->data)); + return PyLong_FromLongLong((intptr_t)(data->data)); } static int From webhook-mailer at python.org Fri Jan 11 16:27:08 2019 From: webhook-mailer at python.org (Eric Snow) Date: Fri, 11 Jan 2019 21:27:08 -0000 Subject: [Python-checkins] bpo-35423: Stop using the "pending calls" machinery for signals. (gh-10972) Message-ID: https://github.com/python/cpython/commit/fdf282d609fd172d52b59a6f1f062eb701494528 commit: fdf282d609fd172d52b59a6f1f062eb701494528 branch: master author: Eric Snow committer: GitHub date: 2019-01-11T14:26:55-07:00 summary: bpo-35423: Stop using the "pending calls" machinery for signals. (gh-10972) This change separates the signal handling trigger in the eval loop from the "pending calls" machinery. There is no semantic change and the difference in performance is insignificant. The change makes both components less confusing. It also eliminates the risk of changes to the pending calls affecting signal handling. This is particularly relevant for some upcoming pending calls changes I have in the works. 
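
For context, the user-visible contract is unchanged by this refactoring: a signal that arrives while pure-Python code is running is still noticed via the eval breaker, and the Python-level handler still runs between two bytecodes. A rough, illustrative sketch of that invariant (not part of the patch, and Unix-only since it relies on SIGALRM):

    import signal

    def handler(signum, frame):
        # Runs between two bytecodes once the eval loop notices the signal.
        raise KeyboardInterrupt("handler ran while the loop was spinning")

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(1)              # ask the OS to deliver SIGALRM in ~1 second
    try:
        while True:              # no I/O and no blocking C call in sight
            pass
    except KeyboardInterrupt as exc:
        print(exc)
    finally:
        signal.alarm(0)          # cancel any alarm still pending
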
files: A Misc/NEWS.d/next/Core and Builtins/2018-12-05-16-24-05.bpo-35423.UIie_O.rst M Include/internal/pycore_ceval.h M Modules/signalmodule.c M Python/ceval.c diff --git a/Include/internal/pycore_ceval.h b/Include/internal/pycore_ceval.h index c8c63b1c7fda..b9f2d7d17585 100644 --- a/Include/internal/pycore_ceval.h +++ b/Include/internal/pycore_ceval.h @@ -45,6 +45,8 @@ struct _ceval_runtime_state { /* Request for dropping the GIL */ _Py_atomic_int gil_drop_request; struct _pending_calls pending; + /* Request for checking signals. */ + _Py_atomic_int signals_pending; struct _gil_runtime_state gil; }; diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-12-05-16-24-05.bpo-35423.UIie_O.rst b/Misc/NEWS.d/next/Core and Builtins/2018-12-05-16-24-05.bpo-35423.UIie_O.rst new file mode 100644 index 000000000000..271ec48b3475 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-12-05-16-24-05.bpo-35423.UIie_O.rst @@ -0,0 +1,3 @@ +Separate the signal handling trigger in the eval loop from the "pending +calls" machinery. There is no semantic change and the difference in +performance is insignificant. diff --git a/Modules/signalmodule.c b/Modules/signalmodule.c index 4f8f71a0a1df..9d49cbd14400 100644 --- a/Modules/signalmodule.c +++ b/Modules/signalmodule.c @@ -202,12 +202,15 @@ It raises KeyboardInterrupt."); static int report_wakeup_write_error(void *data) { + PyObject *exc, *val, *tb; int save_errno = errno; errno = (int) (intptr_t) data; + PyErr_Fetch(&exc, &val, &tb); PyErr_SetFromErrno(PyExc_OSError); PySys_WriteStderr("Exception ignored when trying to write to the " "signal wakeup fd:\n"); PyErr_WriteUnraisable(NULL); + PyErr_Restore(exc, val, tb); errno = save_errno; return 0; } @@ -216,6 +219,8 @@ report_wakeup_write_error(void *data) static int report_wakeup_send_error(void* data) { + PyObject *exc, *val, *tb; + PyErr_Fetch(&exc, &val, &tb); /* PyErr_SetExcFromWindowsErr() invokes FormatMessage() which recognizes the error codes used by both GetLastError() and WSAGetLastError */ @@ -223,6 +228,7 @@ report_wakeup_send_error(void* data) PySys_WriteStderr("Exception ignored when trying to send to the " "signal wakeup fd:\n"); PyErr_WriteUnraisable(NULL); + PyErr_Restore(exc, val, tb); return 0; } #endif /* MS_WINDOWS */ diff --git a/Python/ceval.c b/Python/ceval.c index de6ff2994550..3e82ceb95251 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -100,6 +100,7 @@ static long dxp[256]; _Py_atomic_store_relaxed( \ &_PyRuntime.ceval.eval_breaker, \ GIL_REQUEST | \ + _Py_atomic_load_relaxed(&_PyRuntime.ceval.signals_pending) | \ _Py_atomic_load_relaxed(&_PyRuntime.ceval.pending.calls_to_do) | \ _PyRuntime.ceval.pending.async_exc) @@ -128,6 +129,18 @@ static long dxp[256]; COMPUTE_EVAL_BREAKER(); \ } while (0) +#define SIGNAL_PENDING_SIGNALS() \ + do { \ + _Py_atomic_store_relaxed(&_PyRuntime.ceval.signals_pending, 1); \ + _Py_atomic_store_relaxed(&_PyRuntime.ceval.eval_breaker, 1); \ + } while (0) + +#define UNSIGNAL_PENDING_SIGNALS() \ + do { \ + _Py_atomic_store_relaxed(&_PyRuntime.ceval.signals_pending, 0); \ + COMPUTE_EVAL_BREAKER(); \ + } while (0) + #define SIGNAL_ASYNC_EXC() \ do { \ _PyRuntime.ceval.pending.async_exc = 1; \ @@ -306,7 +319,7 @@ _PyEval_SignalReceived(void) /* bpo-30703: Function called when the C signal handler of Python gets a signal. We cannot queue a callback using Py_AddPendingCall() since that function is not async-signal-safe. */ - SIGNAL_PENDING_CALLS(); + SIGNAL_PENDING_SIGNALS(); } /* This implementation is thread-safe. 
It allows @@ -356,21 +369,28 @@ Py_AddPendingCall(int (*func)(void *), void *arg) return result; } -int -Py_MakePendingCalls(void) +static int +handle_signals(void) { - static int busy = 0; - int i; - int r = 0; - - assert(PyGILState_Check()); + /* Only handle signals on main thread. */ + if (_PyRuntime.ceval.pending.main_thread && + PyThread_get_thread_ident() != _PyRuntime.ceval.pending.main_thread) + { + return 0; + } - if (!_PyRuntime.ceval.pending.lock) { - /* initial allocation of the lock */ - _PyRuntime.ceval.pending.lock = PyThread_allocate_lock(); - if (_PyRuntime.ceval.pending.lock == NULL) - return -1; + UNSIGNAL_PENDING_SIGNALS(); + if (PyErr_CheckSignals() < 0) { + SIGNAL_PENDING_SIGNALS(); /* We're not done yet */ + return -1; } + return 0; +} + +static int +make_pending_calls(void) +{ + static int busy = 0; /* only service pending calls on main thread */ if (_PyRuntime.ceval.pending.main_thread && @@ -378,22 +398,28 @@ Py_MakePendingCalls(void) { return 0; } + /* don't perform recursive pending calls */ - if (busy) + if (busy) { return 0; + } busy = 1; /* unsignal before starting to call callbacks, so that any callback added in-between re-signals */ UNSIGNAL_PENDING_CALLS(); + int res = 0; - /* Python signal handler doesn't really queue a callback: it only signals - that a signal was received, see _PyEval_SignalReceived(). */ - if (PyErr_CheckSignals() < 0) { - goto error; + if (!_PyRuntime.ceval.pending.lock) { + /* initial allocation of the lock */ + _PyRuntime.ceval.pending.lock = PyThread_allocate_lock(); + if (_PyRuntime.ceval.pending.lock == NULL) { + res = -1; + goto error; + } } /* perform a bounded number of calls, in case of recursion */ - for (i=0; i https://github.com/python/cpython/commit/cb08a71c5c534f33d9486677534dafb087c30e8c commit: cb08a71c5c534f33d9486677534dafb087c30e8c branch: master author: Ammar Askar committer: Serhiy Storchaka date: 2019-01-12T08:23:41+02:00 summary: bpo-34838: Use subclass_of for math.dist. (GH-9659) Argument clinic now generates fast inline code for positional parsing, so the manually implemented type check in math.dist can be removed. 
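
Illustrative only (the real coverage is the new test in Lib/test/test_math.py below): at this point math.dist() takes two tuples, and the clinic-generated check rejects anything else with a TypeError, just as the hand-written check did (the message now comes from _PyArg_BadArgument):

    import math

    print(math.dist((1.0, 2.0), (4.0, 6.0)))   # 5.0

    try:
        math.dist("abc", "xyz")                # not tuples
    except TypeError as exc:
        print("rejected:", exc)
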
files: M Lib/test/test_math.py M Modules/clinic/mathmodule.c.h M Modules/mathmodule.c diff --git a/Lib/test/test_math.py b/Lib/test/test_math.py index 9b2f55e1f410..b476a39e0aeb 100644 --- a/Lib/test/test_math.py +++ b/Lib/test/test_math.py @@ -867,6 +867,8 @@ class T(tuple): dist((1, 2, 3, 4), (5, 6, 7)) with self.assertRaises(ValueError): # Check dimension agree dist((1, 2, 3), (4, 5, 6, 7)) + with self.assertRaises(TypeError): # Rejects invalid types + dist("abc", "xyz") # Verify that the one dimensional case is equivalent to abs() for i in range(20): diff --git a/Modules/clinic/mathmodule.c.h b/Modules/clinic/mathmodule.c.h index 1c8fc2fc2c3f..82a4c4a0e7a4 100644 --- a/Modules/clinic/mathmodule.c.h +++ b/Modules/clinic/mathmodule.c.h @@ -310,7 +310,15 @@ math_dist(PyObject *module, PyObject *const *args, Py_ssize_t nargs) if (!_PyArg_CheckPositional("dist", nargs, 2, 2)) { goto exit; } + if (!PyTuple_Check(args[0])) { + _PyArg_BadArgument("dist", 1, "tuple", args[0]); + goto exit; + } p = args[0]; + if (!PyTuple_Check(args[1])) { + _PyArg_BadArgument("dist", 2, "tuple", args[1]); + goto exit; + } q = args[1]; return_value = math_dist_impl(module, p, q); @@ -548,4 +556,4 @@ math_isclose(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject exit: return return_value; } -/*[clinic end generated code: output=f3264ab0ef57ba0a input=a9049054013a1b77]*/ +/*[clinic end generated code: output=0664f30046da09fe input=a9049054013a1b77]*/ diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c index d56c91cedc26..2db2b45dd204 100644 --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -2101,8 +2101,8 @@ vector_norm(Py_ssize_t n, double *vec, double max, int found_nan) /*[clinic input] math.dist - p: object - q: object + p: object(subclass_of='&PyTuple_Type') + q: object(subclass_of='&PyTuple_Type') / Return the Euclidean distance between two points p and q. @@ -2116,7 +2116,7 @@ Roughly equivalent to: static PyObject * math_dist_impl(PyObject *module, PyObject *p, PyObject *q) -/*[clinic end generated code: output=56bd9538d06bbcfe input=8c83c07c7a524664]*/ +/*[clinic end generated code: output=56bd9538d06bbcfe input=937122eaa5f19272]*/ { PyObject *item; double max = 0.0; @@ -2126,11 +2126,6 @@ math_dist_impl(PyObject *module, PyObject *p, PyObject *q) double diffs_on_stack[NUM_STACK_ELEMS]; double *diffs = diffs_on_stack; - if (!PyTuple_Check(p) || !PyTuple_Check(q)) { - PyErr_SetString(PyExc_TypeError, "dist argument must be a tuple"); - return NULL; - } - m = PyTuple_GET_SIZE(p); n = PyTuple_GET_SIZE(q); if (m != n) { From webhook-mailer at python.org Sat Jan 12 01:25:46 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Sat, 12 Jan 2019 06:25:46 -0000 Subject: [Python-checkins] bpo-35582: Inline arguments tuple unpacking in handwritten code. (GH-11524) Message-ID: https://github.com/python/cpython/commit/793426687509be24a42663a27e568cc92dcc07f6 commit: 793426687509be24a42663a27e568cc92dcc07f6 branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-12T08:25:41+02:00 summary: bpo-35582: Inline arguments tuple unpacking in handwritten code. (GH-11524) Inline PyArg_UnpackTuple() and _PyArg_UnpackStack() in performance sensitive code in the builtins and operator modules. 
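
The calling conventions of the affected callables are unchanged; only the argument unpacking underneath them is inlined. A small, illustrative sample of the signatures involved (getattr(), next(), both forms of iter(), and operator.itemgetter()):

    from operator import itemgetter

    print(getattr(object(), "missing", "fallback"))   # optional third argument
    print(next(iter([]), "exhausted"))                # next() with a default
    print(list(iter([1, 2, 3])))                      # one-argument iter()

    it = iter(list(range(3)).pop, 0)                  # two-argument (callable, sentinel) iter()
    print(list(it))                                   # [2, 1]

    print(itemgetter(1)("abc"))                       # 'b'
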
files: M Modules/_operator.c M Python/bltinmodule.c diff --git a/Modules/_operator.c b/Modules/_operator.c index d6c6a18d81b4..d291ec1f920e 100644 --- a/Modules/_operator.c +++ b/Modules/_operator.c @@ -1011,15 +1011,12 @@ itemgetter_call(itemgetterobject *ig, PyObject *args, PyObject *kw) Py_ssize_t i, nitems=ig->nitems; assert(PyTuple_CheckExact(args)); - if (kw == NULL && PyTuple_GET_SIZE(args) == 1) { - obj = PyTuple_GET_ITEM(args, 0); - } - else { - if (!_PyArg_NoKeywords("itemgetter", kw)) - return NULL; - if (!PyArg_UnpackTuple(args, "itemgetter", 1, 1, &obj)) - return NULL; - } + if (!_PyArg_NoKeywords("itemgetter", kw)) + return NULL; + if (!_PyArg_CheckPositional("itemgetter", PyTuple_GET_SIZE(args), 1, 1)) + return NULL; + + obj = PyTuple_GET_ITEM(args, 0); if (nitems == 1) { if (ig->index >= 0 && PyTuple_CheckExact(obj) @@ -1317,8 +1314,9 @@ attrgetter_call(attrgetterobject *ag, PyObject *args, PyObject *kw) if (!_PyArg_NoKeywords("attrgetter", kw)) return NULL; - if (!PyArg_UnpackTuple(args, "attrgetter", 1, 1, &obj)) + if (!_PyArg_CheckPositional("attrgetter", PyTuple_GET_SIZE(args), 1, 1)) return NULL; + obj = PyTuple_GET_ITEM(args, 0); if (ag->nattrs == 1) /* ag->attr is always a tuple */ return dotted_getattr(obj, PyTuple_GET_ITEM(ag->attr, 0)); @@ -1561,8 +1559,9 @@ methodcaller_call(methodcallerobject *mc, PyObject *args, PyObject *kw) if (!_PyArg_NoKeywords("methodcaller", kw)) return NULL; - if (!PyArg_UnpackTuple(args, "methodcaller", 1, 1, &obj)) + if (!_PyArg_CheckPositional("methodcaller", PyTuple_GET_SIZE(args), 1, 1)) return NULL; + obj = PyTuple_GET_ITEM(args, 0); method = PyObject_GetAttr(obj, mc->name); if (method == NULL) return NULL; diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c index e19bc5604ba1..332142fc6ffc 100644 --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -1067,19 +1067,21 @@ builtin_exec_impl(PyObject *module, PyObject *source, PyObject *globals, static PyObject * builtin_getattr(PyObject *self, PyObject *const *args, Py_ssize_t nargs) { - PyObject *v, *result, *dflt = NULL; - PyObject *name; + PyObject *v, *name, *result; - if (!_PyArg_UnpackStack(args, nargs, "getattr", 2, 3, &v, &name, &dflt)) + if (!_PyArg_CheckPositional("getattr", nargs, 2, 3)) return NULL; + v = args[0]; + name = args[1]; if (!PyUnicode_Check(name)) { PyErr_SetString(PyExc_TypeError, "getattr(): attribute name must be string"); return NULL; } - if (dflt != NULL) { + if (nargs > 2) { if (_PyObject_LookupAttr(v, name, &result) == 0) { + PyObject *dflt = args[2]; Py_INCREF(dflt); return dflt; } @@ -1372,11 +1374,11 @@ static PyObject * builtin_next(PyObject *self, PyObject *const *args, Py_ssize_t nargs) { PyObject *it, *res; - PyObject *def = NULL; - if (!_PyArg_UnpackStack(args, nargs, "next", 1, 2, &it, &def)) + if (!_PyArg_CheckPositional("next", nargs, 1, 2)) return NULL; + it = args[0]; if (!PyIter_Check(it)) { PyErr_Format(PyExc_TypeError, "'%.200s' object is not an iterator", @@ -1387,7 +1389,8 @@ builtin_next(PyObject *self, PyObject *const *args, Py_ssize_t nargs) res = (*it->ob_type->tp_iternext)(it); if (res != NULL) { return res; - } else if (def != NULL) { + } else if (nargs > 1) { + PyObject *def = args[1]; if (PyErr_Occurred()) { if(!PyErr_ExceptionMatches(PyExc_StopIteration)) return NULL; @@ -1503,20 +1506,22 @@ builtin_hex(PyObject *module, PyObject *number) /* AC: cannot convert yet, as needs PEP 457 group support in inspect */ static PyObject * -builtin_iter(PyObject *self, PyObject *args) +builtin_iter(PyObject *self, PyObject *const 
*args, Py_ssize_t nargs) { - PyObject *v, *w = NULL; + PyObject *v; - if (!PyArg_UnpackTuple(args, "iter", 1, 2, &v, &w)) + if (!_PyArg_CheckPositional("iter", nargs, 1, 2)) return NULL; - if (w == NULL) + v = args[0]; + if (nargs == 1) return PyObject_GetIter(v); if (!PyCallable_Check(v)) { PyErr_SetString(PyExc_TypeError, "iter(v, w): v must be callable"); return NULL; } - return PyCallIter_New(v, w); + PyObject *sentinel = args[1]; + return PyCallIter_New(v, sentinel); } PyDoc_STRVAR(iter_doc, @@ -2718,7 +2723,7 @@ static PyMethodDef builtin_methods[] = { BUILTIN_INPUT_METHODDEF BUILTIN_ISINSTANCE_METHODDEF BUILTIN_ISSUBCLASS_METHODDEF - {"iter", builtin_iter, METH_VARARGS, iter_doc}, + {"iter", (PyCFunction)(void(*)(void))builtin_iter, METH_FASTCALL, iter_doc}, BUILTIN_LEN_METHODDEF BUILTIN_LOCALS_METHODDEF {"max", (PyCFunction)(void(*)(void))builtin_max, METH_VARARGS | METH_KEYWORDS, max_doc}, From webhook-mailer at python.org Sat Jan 12 01:26:38 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Sat, 12 Jan 2019 06:26:38 -0000 Subject: [Python-checkins] bpo-35719: Optimize multi-argument math functions. (GH-11527) Message-ID: https://github.com/python/cpython/commit/d0d3e99120b19a4b800f0f381b2807c93aeecf0e commit: d0d3e99120b19a4b800f0f381b2807c93aeecf0e branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-12T08:26:34+02:00 summary: bpo-35719: Optimize multi-argument math functions. (GH-11527) Use the fast call convention for math functions atan2(), copysign(), hypot() and remainder() and inline unpacking arguments. This sped up them by 1.3--2.5 times. files: A Misc/NEWS.d/next/Library/2019-01-11-20-21-59.bpo-35719.qyRcpE.rst M Modules/mathmodule.c diff --git a/Misc/NEWS.d/next/Library/2019-01-11-20-21-59.bpo-35719.qyRcpE.rst b/Misc/NEWS.d/next/Library/2019-01-11-20-21-59.bpo-35719.qyRcpE.rst new file mode 100644 index 000000000000..e46e14296479 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-11-20-21-59.bpo-35719.qyRcpE.rst @@ -0,0 +1,2 @@ +Sped up multi-argument :mod:`math` functions atan2(), copysign(), +remainder() and hypot() by 1.3--2.5 times. diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c index 2db2b45dd204..a190f5ccf7e3 100644 --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -997,14 +997,14 @@ math_1_to_int(PyObject *arg, double (*func) (double), int can_overflow) } static PyObject * -math_2(PyObject *args, double (*func) (double, double), const char *funcname) +math_2(PyObject *const *args, Py_ssize_t nargs, + double (*func) (double, double), const char *funcname) { - PyObject *ox, *oy; double x, y, r; - if (! 
PyArg_UnpackTuple(args, funcname, 2, 2, &ox, &oy)) + if (!_PyArg_CheckPositional(funcname, nargs, 2, 2)) return NULL; - x = PyFloat_AsDouble(ox); - y = PyFloat_AsDouble(oy); + x = PyFloat_AsDouble(args[0]); + y = PyFloat_AsDouble(args[1]); if ((x == -1.0 || y == -1.0) && PyErr_Occurred()) return NULL; errno = 0; @@ -1042,8 +1042,8 @@ math_2(PyObject *args, double (*func) (double, double), const char *funcname) PyDoc_STRVAR(math_##funcname##_doc, docstring); #define FUNC2(funcname, func, docstring) \ - static PyObject * math_##funcname(PyObject *self, PyObject *args) { \ - return math_2(args, func, #funcname); \ + static PyObject * math_##funcname(PyObject *self, PyObject *const *args, Py_ssize_t nargs) { \ + return math_2(args, nargs, func, #funcname); \ }\ PyDoc_STRVAR(math_##funcname##_doc, docstring); @@ -2181,9 +2181,9 @@ math_dist_impl(PyObject *module, PyObject *p, PyObject *q) /* AC: cannot convert yet, waiting for *args support */ static PyObject * -math_hypot(PyObject *self, PyObject *args) +math_hypot(PyObject *self, PyObject *const *args, Py_ssize_t nargs) { - Py_ssize_t i, n; + Py_ssize_t i; PyObject *item; double max = 0.0; double x, result; @@ -2191,15 +2191,14 @@ math_hypot(PyObject *self, PyObject *args) double coord_on_stack[NUM_STACK_ELEMS]; double *coordinates = coord_on_stack; - n = PyTuple_GET_SIZE(args); - if (n > NUM_STACK_ELEMS) { - coordinates = (double *) PyObject_Malloc(n * sizeof(double)); + if (nargs > NUM_STACK_ELEMS) { + coordinates = (double *) PyObject_Malloc(nargs * sizeof(double)); if (coordinates == NULL) { return PyErr_NoMemory(); } } - for (i=0 ; i https://github.com/python/cpython/commit/44cc4822bb3799858201e61294c5863f93ec12e2 commit: 44cc4822bb3799858201e61294c5863f93ec12e2 branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-12T09:22:29+02:00 summary: bpo-33817: Fix _PyBytes_Resize() for empty bytes object. (GH-11516) Add also tests for PyUnicode_FromFormat() and PyBytes_FromFormat() with empty result. files: A Misc/NEWS.d/next/C API/2019-01-11-11-16-16.bpo-33817.nJ4yIj.rst M Lib/test/test_bytes.py M Lib/test/test_unicode.py M Objects/bytesobject.c diff --git a/Lib/test/test_bytes.py b/Lib/test/test_bytes.py index cc433217ce16..f7454d9b36a8 100644 --- a/Lib/test/test_bytes.py +++ b/Lib/test/test_bytes.py @@ -1001,6 +1001,12 @@ def ptr_formatter(ptr): self.assertRaises(OverflowError, PyBytes_FromFormat, b'%c', c_int(256)) + # Issue #33817: empty strings + self.assertEqual(PyBytes_FromFormat(b''), + b'') + self.assertEqual(PyBytes_FromFormat(b'%s', b''), + b'') + def test_bytes_blocking(self): class IterationBlocked(list): __bytes__ = None diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py index fb7bb2d523fe..c277e705b9f5 100644 --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -2680,6 +2680,12 @@ def check_format(expected, format, *args): check_format('%.%s', b'%.%s', b'abc') + # Issue #33817: empty strings + check_format('', + b'') + check_format('', + b'%s', b'') + # Test PyUnicode_AsWideChar() @support.cpython_only def test_aswidechar(self): diff --git a/Misc/NEWS.d/next/C API/2019-01-11-11-16-16.bpo-33817.nJ4yIj.rst b/Misc/NEWS.d/next/C API/2019-01-11-11-16-16.bpo-33817.nJ4yIj.rst new file mode 100644 index 000000000000..ca4ccb26d361 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2019-01-11-11-16-16.bpo-33817.nJ4yIj.rst @@ -0,0 +1 @@ +Fixed :c:func:`_PyBytes_Resize` for empty bytes objects. 
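
The new tests drive the C API through ctypes; outside the test suite the same check can be sketched like this (illustrative only, and it assumes an interpreter that already contains this fix):

    from ctypes import pythonapi, py_object

    PyBytes_FromFormat = pythonapi.PyBytes_FromFormat
    PyBytes_FromFormat.restype = py_object

    print(PyBytes_FromFormat(b''))            # b''
    print(PyBytes_FromFormat(b'%s', b''))     # b''
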
diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c index adf0cff5f3b2..40ef47144e52 100644 --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -2991,9 +2991,22 @@ _PyBytes_Resize(PyObject **pv, Py_ssize_t newsize) /* return early if newsize equals to v->ob_size */ return 0; } + if (Py_SIZE(v) == 0) { + if (newsize == 0) { + return 0; + } + *pv = _PyBytes_FromSize(newsize, 0); + Py_DECREF(v); + return (*pv == NULL) ? -1 : 0; + } if (Py_REFCNT(v) != 1) { goto error; } + if (newsize == 0) { + *pv = _PyBytes_FromSize(0, 0); + Py_DECREF(v); + return (*pv == NULL) ? -1 : 0; + } /* XXX UNREF/NEWREF interface should be more symmetrical */ _Py_DEC_REFTOTAL; _Py_ForgetReference(v); From webhook-mailer at python.org Sat Jan 12 02:22:55 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Sat, 12 Jan 2019 07:22:55 -0000 Subject: [Python-checkins] bpo-33817: Fix _PyString_Resize() and _PyUnicode_Resize() for empty strings. (GH-11515) Message-ID: https://github.com/python/cpython/commit/08a81df05004147ee174ece645679576ab867860 commit: 08a81df05004147ee174ece645679576ab867860 branch: 2.7 author: Serhiy Storchaka committer: GitHub date: 2019-01-12T09:22:52+02:00 summary: bpo-33817: Fix _PyString_Resize() and _PyUnicode_Resize() for empty strings. (GH-11515) files: A Misc/NEWS.d/next/C API/2019-01-11-11-16-36.bpo-33817.qyYxjw.rst M Lib/test/test_str.py M Lib/test/test_unicode.py M Objects/stringobject.c M Objects/unicodeobject.c diff --git a/Lib/test/test_str.py b/Lib/test/test_str.py index 8b306f4e8a4b..73ed542a5114 100644 --- a/Lib/test/test_str.py +++ b/Lib/test/test_str.py @@ -474,8 +474,106 @@ def __rmod__(self, other): self.assertEqual('lhs %% %r' % SubclassedStr('rhs'), "Success, self.__rmod__('lhs %% %r') was called") + +class CAPITest(unittest.TestCase): + + # Test PyString_FromFormat() + def test_from_format(self): + ctypes = test_support.import_module('ctypes') + _testcapi = test_support.import_module('_testcapi') + from ctypes import pythonapi, py_object + from ctypes import ( + c_int, c_uint, + c_long, c_ulong, + c_size_t, c_ssize_t, + c_char_p) + + PyString_FromFormat = pythonapi.PyString_FromFormat + PyString_FromFormat.restype = py_object + + # basic tests + self.assertEqual(PyString_FromFormat(b'format'), + b'format') + self.assertEqual(PyString_FromFormat(b'Hello %s !', b'world'), + b'Hello world !') + + # test formatters + self.assertEqual(PyString_FromFormat(b'c=%c', c_int(0)), + b'c=\0') + self.assertEqual(PyString_FromFormat(b'c=%c', c_int(ord('@'))), + b'c=@') + self.assertEqual(PyString_FromFormat(b'c=%c', c_int(255)), + b'c=\xff') + self.assertEqual(PyString_FromFormat(b'd=%d ld=%ld zd=%zd', + c_int(1), c_long(2), + c_size_t(3)), + b'd=1 ld=2 zd=3') + self.assertEqual(PyString_FromFormat(b'd=%d ld=%ld zd=%zd', + c_int(-1), c_long(-2), + c_size_t(-3)), + b'd=-1 ld=-2 zd=-3') + self.assertEqual(PyString_FromFormat(b'u=%u lu=%lu zu=%zu', + c_uint(123), c_ulong(456), + c_size_t(789)), + b'u=123 lu=456 zu=789') + self.assertEqual(PyString_FromFormat(b'i=%i', c_int(123)), + b'i=123') + self.assertEqual(PyString_FromFormat(b'i=%i', c_int(-123)), + b'i=-123') + self.assertEqual(PyString_FromFormat(b'x=%x', c_int(0xabc)), + b'x=abc') + + self.assertEqual(PyString_FromFormat(b's=%s', c_char_p(b'cstr')), + b's=cstr') + + # test minimum and maximum integer values + size_max = c_size_t(-1).value + for formatstr, ctypes_type, value, py_formatter in ( + (b'%d', c_int, _testcapi.INT_MIN, str), + (b'%d', c_int, _testcapi.INT_MAX, str), + (b'%ld', c_long, 
_testcapi.LONG_MIN, str), + (b'%ld', c_long, _testcapi.LONG_MAX, str), + (b'%lu', c_ulong, _testcapi.ULONG_MAX, str), + (b'%zd', c_ssize_t, _testcapi.PY_SSIZE_T_MIN, str), + (b'%zd', c_ssize_t, _testcapi.PY_SSIZE_T_MAX, str), + (b'%zu', c_size_t, size_max, str), + ): + self.assertEqual(PyString_FromFormat(formatstr, ctypes_type(value)), + py_formatter(value).encode('ascii')), + + # width and precision (width is currently ignored) + self.assertEqual(PyString_FromFormat(b'%5s', b'a'), + b'a') + self.assertEqual(PyString_FromFormat(b'%.3s', b'abcdef'), + b'abc') + + # '%%' formatter + self.assertEqual(PyString_FromFormat(b'%%'), + b'%') + self.assertEqual(PyString_FromFormat(b'[%%]'), + b'[%]') + self.assertEqual(PyString_FromFormat(b'%%%c', c_int(ord('_'))), + b'%_') + self.assertEqual(PyString_FromFormat(b'%%s'), + b'%s') + + # Invalid formats and partial formatting + self.assertEqual(PyString_FromFormat(b'%'), b'%') + self.assertEqual(PyString_FromFormat(b'x=%i y=%', c_int(2), c_int(3)), + b'x=2 y=%') + + self.assertEqual(PyString_FromFormat(b'%c', c_int(-1)), b'\xff') + self.assertEqual(PyString_FromFormat(b'%c', c_int(256)), b'\0') + + # Issue #33817: empty strings + self.assertEqual(PyString_FromFormat(b''), + b'') + self.assertEqual(PyString_FromFormat(b'%s', b''), + b'') + + def test_main(): - test_support.run_unittest(StrTest) + test_support.run_unittest(StrTest, CAPITest) if __name__ == "__main__": test_main() diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py index 560b84cb82c2..92476f68a53c 100644 --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -1824,6 +1824,12 @@ def check_format(expected, format, *args): check_format(u'%s', b'%.%s', b'abc') + # Issue #33817: empty strings + check_format(u'', + b'') + check_format(u'', + b'%s', b'') + @test_support.cpython_only def test_encode_decimal(self): from _testcapi import unicode_encodedecimal diff --git a/Misc/NEWS.d/next/C API/2019-01-11-11-16-36.bpo-33817.qyYxjw.rst b/Misc/NEWS.d/next/C API/2019-01-11-11-16-36.bpo-33817.qyYxjw.rst new file mode 100644 index 000000000000..c7360ec4bc51 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2019-01-11-11-16-36.bpo-33817.qyYxjw.rst @@ -0,0 +1,4 @@ +Fixed :c:func:`_PyString_Resize` and :c:func:`_PyUnicode_Resize` for empty +strings. This fixed also :c:func:`PyString_FromFormat` and +:c:func:`PyUnicode_FromFormat` when they return an empty string (e.g. +``PyString_FromFormat("%s", "")``). diff --git a/Objects/stringobject.c b/Objects/stringobject.c index b21afb4424d1..efb0d1401b96 100644 --- a/Objects/stringobject.c +++ b/Objects/stringobject.c @@ -3893,13 +3893,31 @@ _PyString_Resize(PyObject **pv, Py_ssize_t newsize) register PyObject *v; register PyStringObject *sv; v = *pv; - if (!PyString_Check(v) || Py_REFCNT(v) != 1 || newsize < 0 || - PyString_CHECK_INTERNED(v)) { + if (!PyString_Check(v) || newsize < 0) { *pv = 0; Py_DECREF(v); PyErr_BadInternalCall(); return -1; } + if (Py_SIZE(v) == 0) { + if (newsize == 0) { + return 0; + } + *pv = PyString_FromStringAndSize(NULL, newsize); + Py_DECREF(v); + return (*pv == NULL) ? -1 : 0; + } + if (Py_REFCNT(v) != 1 || PyString_CHECK_INTERNED(v)) { + *pv = 0; + Py_DECREF(v); + PyErr_BadInternalCall(); + return -1; + } + if (newsize == 0) { + *pv = PyString_FromStringAndSize(NULL, 0); + Py_DECREF(v); + return (*pv == NULL) ? 
-1 : 0; + } /* XXX UNREF/NEWREF interface should be more symmetrical */ _Py_DEC_REFTOTAL; _Py_ForgetReference(v); diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index 21d994cdd6b6..a859fa05214d 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -421,10 +421,27 @@ int _PyUnicode_Resize(PyUnicodeObject **unicode, Py_ssize_t length) return -1; } v = *unicode; - if (v == NULL || !PyUnicode_Check(v) || Py_REFCNT(v) != 1 || length < 0) { + if (v == NULL || !PyUnicode_Check(v) || length < 0) { PyErr_BadInternalCall(); return -1; } + if (v->length == 0) { + if (length == 0) { + return 0; + } + *unicode = _PyUnicode_New(length); + Py_DECREF(v); + return (*unicode == NULL) ? -1 : 0; + } + if (Py_REFCNT(v) != 1) { + PyErr_BadInternalCall(); + return -1; + } + if (length == 0) { + *unicode = _PyUnicode_New(0); + Py_DECREF(v); + return (*unicode == NULL) ? -1 : 0; + } /* Resizing unicode_empty and single character objects is not possible since these are being shared. We simply return a fresh From webhook-mailer at python.org Sat Jan 12 02:40:16 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sat, 12 Jan 2019 07:40:16 -0000 Subject: [Python-checkins] bpo-33817: Fix _PyBytes_Resize() for empty bytes object. (GH-11516) Message-ID: https://github.com/python/cpython/commit/d39c19255910b9dce08c595f511597e98b09e91f commit: d39c19255910b9dce08c595f511597e98b09e91f branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-11T23:40:09-08:00 summary: bpo-33817: Fix _PyBytes_Resize() for empty bytes object. (GH-11516) Add also tests for PyUnicode_FromFormat() and PyBytes_FromFormat() with empty result. (cherry picked from commit 44cc4822bb3799858201e61294c5863f93ec12e2) Co-authored-by: Serhiy Storchaka files: A Misc/NEWS.d/next/C API/2019-01-11-11-16-16.bpo-33817.nJ4yIj.rst M Lib/test/test_bytes.py M Lib/test/test_unicode.py M Objects/bytesobject.c diff --git a/Lib/test/test_bytes.py b/Lib/test/test_bytes.py index 145411efbb9d..274616bf998d 100644 --- a/Lib/test/test_bytes.py +++ b/Lib/test/test_bytes.py @@ -999,6 +999,12 @@ def ptr_formatter(ptr): self.assertRaises(OverflowError, PyBytes_FromFormat, b'%c', c_int(256)) + # Issue #33817: empty strings + self.assertEqual(PyBytes_FromFormat(b''), + b'') + self.assertEqual(PyBytes_FromFormat(b'%s', b''), + b'') + def test_bytes_blocking(self): class IterationBlocked(list): __bytes__ = None diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py index 3cc018c0cc2c..1aad9334074c 100644 --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -2676,6 +2676,12 @@ def check_format(expected, format, *args): check_format('%.%s', b'%.%s', b'abc') + # Issue #33817: empty strings + check_format('', + b'') + check_format('', + b'%s', b'') + # Test PyUnicode_AsWideChar() @support.cpython_only def test_aswidechar(self): diff --git a/Misc/NEWS.d/next/C API/2019-01-11-11-16-16.bpo-33817.nJ4yIj.rst b/Misc/NEWS.d/next/C API/2019-01-11-11-16-16.bpo-33817.nJ4yIj.rst new file mode 100644 index 000000000000..ca4ccb26d361 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2019-01-11-11-16-16.bpo-33817.nJ4yIj.rst @@ -0,0 +1 @@ +Fixed :c:func:`_PyBytes_Resize` for empty bytes objects. 
diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c index 711faba64548..5f9e1eccf2e4 100644 --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -2990,9 +2990,22 @@ _PyBytes_Resize(PyObject **pv, Py_ssize_t newsize) /* return early if newsize equals to v->ob_size */ return 0; } + if (Py_SIZE(v) == 0) { + if (newsize == 0) { + return 0; + } + *pv = _PyBytes_FromSize(newsize, 0); + Py_DECREF(v); + return (*pv == NULL) ? -1 : 0; + } if (Py_REFCNT(v) != 1) { goto error; } + if (newsize == 0) { + *pv = _PyBytes_FromSize(0, 0); + Py_DECREF(v); + return (*pv == NULL) ? -1 : 0; + } /* XXX UNREF/NEWREF interface should be more symmetrical */ _Py_DEC_REFTOTAL; _Py_ForgetReference(v); From webhook-mailer at python.org Sat Jan 12 02:46:53 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Sat, 12 Jan 2019 07:46:53 -0000 Subject: [Python-checkins] bpo-35494: Improve syntax error messages for unbalanced parentheses in f-string. (GH-11161) Message-ID: https://github.com/python/cpython/commit/58159ef856846d0235e0779aeb6013d70499570d commit: 58159ef856846d0235e0779aeb6013d70499570d branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-12T09:46:50+02:00 summary: bpo-35494: Improve syntax error messages for unbalanced parentheses in f-string. (GH-11161) files: A Misc/NEWS.d/next/Core and Builtins/2018-12-14-18-02-34.bpo-35494.IWOPtb.rst M Lib/test/test_fstring.py M Python/ast.c diff --git a/Lib/test/test_fstring.py b/Lib/test/test_fstring.py index fe3804b2215d..9e45770f80b8 100644 --- a/Lib/test/test_fstring.py +++ b/Lib/test/test_fstring.py @@ -368,9 +368,27 @@ def test_unterminated_string(self): ]) def test_mismatched_parens(self): - self.assertAllRaise(SyntaxError, 'f-string: mismatched', + self.assertAllRaise(SyntaxError, r"f-string: closing parenthesis '\}' " + r"does not match opening parenthesis '\('", ["f'{((}'", ]) + self.assertAllRaise(SyntaxError, r"f-string: closing parenthesis '\)' " + r"does not match opening parenthesis '\['", + ["f'{a[4)}'", + ]) + self.assertAllRaise(SyntaxError, r"f-string: closing parenthesis '\]' " + r"does not match opening parenthesis '\('", + ["f'{a(4]}'", + ]) + self.assertAllRaise(SyntaxError, r"f-string: closing parenthesis '\}' " + r"does not match opening parenthesis '\['", + ["f'{a[4}'", + ]) + self.assertAllRaise(SyntaxError, r"f-string: closing parenthesis '\}' " + r"does not match opening parenthesis '\('", + ["f'{a(4}'", + ]) + self.assertRaises(SyntaxError, eval, "f'{" + "("*500 + "}'") def test_double_braces(self): self.assertEqual(f'{{', '{') @@ -448,7 +466,9 @@ def test_comments(self): ["f'{1#}'", # error because the expression becomes "(1#)" "f'{3(#)}'", "f'{#}'", - "f'{)#}'", # When wrapped in parens, this becomes + ]) + self.assertAllRaise(SyntaxError, r"f-string: unmatched '\)'", + ["f'{)#}'", # When wrapped in parens, this becomes # '()#)'. Make sure that doesn't compile. 
]) @@ -577,7 +597,7 @@ def test_parens_in_expressions(self): "f'{,}'", # this is (,), which is an error ]) - self.assertAllRaise(SyntaxError, "f-string: expecting '}'", + self.assertAllRaise(SyntaxError, r"f-string: unmatched '\)'", ["f'{3)+(4}'", ]) @@ -1003,16 +1023,6 @@ def test_str_format_differences(self): self.assertEqual('{d[a]}'.format(d=d), 'string') self.assertEqual('{d[0]}'.format(d=d), 'integer') - def test_invalid_expressions(self): - self.assertAllRaise(SyntaxError, - r"closing parenthesis '\)' does not match " - r"opening parenthesis '\[' \(, line 1\)", - [r"f'{a[4)}'"]) - self.assertAllRaise(SyntaxError, - r"closing parenthesis '\]' does not match " - r"opening parenthesis '\(' \(, line 1\)", - [r"f'{a(4]}'"]) - def test_errors(self): # see issue 26287 self.assertAllRaise(TypeError, 'unsupported', diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-12-14-18-02-34.bpo-35494.IWOPtb.rst b/Misc/NEWS.d/next/Core and Builtins/2018-12-14-18-02-34.bpo-35494.IWOPtb.rst new file mode 100644 index 000000000000..0813b35ec87d --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-12-14-18-02-34.bpo-35494.IWOPtb.rst @@ -0,0 +1 @@ +Improved syntax error messages for unbalanced parentheses in f-string. diff --git a/Python/ast.c b/Python/ast.c index 8a305a80ffac..69dfe3c3c435 100644 --- a/Python/ast.c +++ b/Python/ast.c @@ -13,6 +13,8 @@ #include #include +#define MAXLEVEL 200 /* Max parentheses level */ + static int validate_stmts(asdl_seq *); static int validate_exprs(asdl_seq *, expr_context_ty, int); static int validate_nonempty_seq(asdl_seq *, const char *, const char *); @@ -4479,6 +4481,7 @@ fstring_find_expr(const char **str, const char *end, int raw, int recurse_lvl, /* Keep track of nesting level for braces/parens/brackets in expressions. */ Py_ssize_t nested_depth = 0; + char parenstack[MAXLEVEL]; /* Can only nest one level deep. */ if (recurse_lvl >= 2) { @@ -4553,10 +4556,12 @@ fstring_find_expr(const char **str, const char *end, int raw, int recurse_lvl, /* Start looking for the end of the string. */ quote_char = ch; } else if (ch == '[' || ch == '{' || ch == '(') { + if (nested_depth >= MAXLEVEL) { + ast_error(c, n, "f-string: too many nested parenthesis"); + return -1; + } + parenstack[nested_depth] = ch; nested_depth++; - } else if (nested_depth != 0 && - (ch == ']' || ch == '}' || ch == ')')) { - nested_depth--; } else if (ch == '#') { /* Error: can't include a comment character, inside parens or not. */ @@ -4573,6 +4578,23 @@ fstring_find_expr(const char **str, const char *end, int raw, int recurse_lvl, } /* Normal way out of this loop. */ break; + } else if (ch == ']' || ch == '}' || ch == ')') { + if (!nested_depth) { + ast_error(c, n, "f-string: unmatched '%c'", ch); + return -1; + } + nested_depth--; + int opening = parenstack[nested_depth]; + if (!((opening == '(' && ch == ')') || + (opening == '[' && ch == ']') || + (opening == '{' && ch == '}'))) + { + ast_error(c, n, + "f-string: closing parenthesis '%c' " + "does not match opening parenthesis '%c'", + ch, opening); + return -1; + } } else { /* Just consume this char and loop around. 
*/ } @@ -4587,7 +4609,8 @@ fstring_find_expr(const char **str, const char *end, int raw, int recurse_lvl, return -1; } if (nested_depth) { - ast_error(c, n, "f-string: mismatched '(', '{', or '['"); + int opening = parenstack[nested_depth - 1]; + ast_error(c, n, "f-string: unmatched '%c'", opening); return -1; } From webhook-mailer at python.org Sat Jan 12 03:12:27 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Sat, 12 Jan 2019 08:12:27 -0000 Subject: [Python-checkins] bpo-35634: Raise an error when first passed kwargs contains duplicated keys. (GH-11438) Message-ID: https://github.com/python/cpython/commit/f1ec3cefad4639797c37eaa8c074830188fa0a44 commit: f1ec3cefad4639797c37eaa8c074830188fa0a44 branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-12T10:12:24+02:00 summary: bpo-35634: Raise an error when first passed kwargs contains duplicated keys. (GH-11438) files: A Misc/NEWS.d/next/Core and Builtins/2019-01-05-18-39-49.bpo-35634.nVP_gs.rst M Lib/test/test_extcall.py M Python/ceval.c diff --git a/Lib/test/test_extcall.py b/Lib/test/test_extcall.py index 2c1848337b2d..a3ff4413eee6 100644 --- a/Lib/test/test_extcall.py +++ b/Lib/test/test_extcall.py @@ -316,6 +316,52 @@ ... TypeError: dir() got multiple values for keyword argument 'b' +Test a kwargs mapping with duplicated keys. + + >>> from collections.abc import Mapping + >>> class MultiDict(Mapping): + ... def __init__(self, items): + ... self._items = items + ... + ... def __iter__(self): + ... return (k for k, v in self._items) + ... + ... def __getitem__(self, key): + ... for k, v in self._items: + ... if k == key: + ... return v + ... raise KeyError(key) + ... + ... def __len__(self): + ... return len(self._items) + ... + ... def keys(self): + ... return [k for k, v in self._items] + ... + ... def values(self): + ... return [v for k, v in self._items] + ... + ... def items(self): + ... return [(k, v) for k, v in self._items] + ... + >>> g(**MultiDict([('x', 1), ('y', 2)])) + 1 () {'y': 2} + + >>> g(**MultiDict([('x', 1), ('x', 2)])) + Traceback (most recent call last): + ... + TypeError: g() got multiple values for keyword argument 'x' + + >>> g(a=3, **MultiDict([('x', 1), ('x', 2)])) + Traceback (most recent call last): + ... + TypeError: g() got multiple values for keyword argument 'x' + + >>> g(**MultiDict([('a', 3)]), **MultiDict([('x', 1), ('x', 2)])) + Traceback (most recent call last): + ... + TypeError: g() got multiple values for keyword argument 'x' + Another helper function >>> def f2(*a, **b): diff --git a/Misc/NEWS.d/next/Core and Builtins/2019-01-05-18-39-49.bpo-35634.nVP_gs.rst b/Misc/NEWS.d/next/Core and Builtins/2019-01-05-18-39-49.bpo-35634.nVP_gs.rst new file mode 100644 index 000000000000..24e578960654 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2019-01-05-18-39-49.bpo-35634.nVP_gs.rst @@ -0,0 +1,3 @@ +``func(**kwargs)`` will now raise an error when ``kwargs`` is a mapping +containing multiple entries with the same key. An error was already raised +when other keyword arguments are passed before ``**kwargs`` since Python 3.6. 
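
A standalone version of what the new doctest above exercises; the Dupes helper here is only an illustrative stand-in for the MultiDict used in Lib/test/test_extcall.py:

    from collections.abc import Mapping

    class Dupes(Mapping):
        """A mapping that reports the same key twice."""
        def __init__(self, pairs):
            self.pairs = pairs
        def __iter__(self):
            return (k for k, _ in self.pairs)
        def __getitem__(self, key):
            for k, v in self.pairs:
                if k == key:
                    return v
            raise KeyError(key)
        def __len__(self):
            return len(self.pairs)

    def g(x, *args, **kwargs):
        print(x, args, kwargs)

    g(**Dupes([('x', 1), ('y', 2)]))          # distinct keys: fine
    try:
        g(**Dupes([('x', 1), ('x', 2)]))      # duplicated 'x': now a TypeError
    except TypeError as exc:
        print(exc)
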
diff --git a/Python/ceval.c b/Python/ceval.c index 3e82ceb95251..3db7c7c92a0e 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -70,7 +70,7 @@ static PyObject * unicode_concatenate(PyObject *, PyObject *, PyFrameObject *, const _Py_CODEUNIT *); static PyObject * special_lookup(PyObject *, _Py_Identifier *); static int check_args_iterable(PyObject *func, PyObject *vararg); -static void format_kwargs_mapping_error(PyObject *func, PyObject *kwargs); +static void format_kwargs_error(PyObject *func, PyObject *kwargs); static void format_awaitable_error(PyTypeObject *, int); #define NAME_ERROR_MSG \ @@ -2660,37 +2660,8 @@ _PyEval_EvalFrameDefault(PyFrameObject *f, int throwflag) for (i = oparg; i > 0; i--) { PyObject *arg = PEEK(i); if (_PyDict_MergeEx(sum, arg, 2) < 0) { - PyObject *func = PEEK(2 + oparg); - if (PyErr_ExceptionMatches(PyExc_AttributeError)) { - format_kwargs_mapping_error(func, arg); - } - else if (PyErr_ExceptionMatches(PyExc_KeyError)) { - PyObject *exc, *val, *tb; - PyErr_Fetch(&exc, &val, &tb); - if (val && PyTuple_Check(val) && PyTuple_GET_SIZE(val) == 1) { - PyObject *key = PyTuple_GET_ITEM(val, 0); - if (!PyUnicode_Check(key)) { - PyErr_Format(PyExc_TypeError, - "%.200s%.200s keywords must be strings", - PyEval_GetFuncName(func), - PyEval_GetFuncDesc(func)); - } else { - PyErr_Format(PyExc_TypeError, - "%.200s%.200s got multiple " - "values for keyword argument '%U'", - PyEval_GetFuncName(func), - PyEval_GetFuncDesc(func), - key); - } - Py_XDECREF(exc); - Py_XDECREF(val); - Py_XDECREF(tb); - } - else { - PyErr_Restore(exc, val, tb); - } - } Py_DECREF(sum); + format_kwargs_error(PEEK(2 + oparg), arg); goto error; } } @@ -3286,17 +3257,9 @@ _PyEval_EvalFrameDefault(PyFrameObject *f, int throwflag) PyObject *d = PyDict_New(); if (d == NULL) goto error; - if (PyDict_Update(d, kwargs) != 0) { + if (_PyDict_MergeEx(d, kwargs, 2) < 0) { Py_DECREF(d); - /* PyDict_Update raises attribute - * error (percolated from an attempt - * to get 'keys' attribute) instead of - * a type error if its second argument - * is not a mapping. - */ - if (PyErr_ExceptionMatches(PyExc_AttributeError)) { - format_kwargs_mapping_error(SECOND(), kwargs); - } + format_kwargs_error(SECOND(), kwargs); Py_DECREF(kwargs); goto error; } @@ -5063,14 +5026,48 @@ check_args_iterable(PyObject *func, PyObject *args) } static void -format_kwargs_mapping_error(PyObject *func, PyObject *kwargs) +format_kwargs_error(PyObject *func, PyObject *kwargs) { - PyErr_Format(PyExc_TypeError, - "%.200s%.200s argument after ** " - "must be a mapping, not %.200s", - PyEval_GetFuncName(func), - PyEval_GetFuncDesc(func), - kwargs->ob_type->tp_name); + /* _PyDict_MergeEx raises attribute + * error (percolated from an attempt + * to get 'keys' attribute) instead of + * a type error if its second argument + * is not a mapping. 
+ */ + if (PyErr_ExceptionMatches(PyExc_AttributeError)) { + PyErr_Format(PyExc_TypeError, + "%.200s%.200s argument after ** " + "must be a mapping, not %.200s", + PyEval_GetFuncName(func), + PyEval_GetFuncDesc(func), + kwargs->ob_type->tp_name); + } + else if (PyErr_ExceptionMatches(PyExc_KeyError)) { + PyObject *exc, *val, *tb; + PyErr_Fetch(&exc, &val, &tb); + if (val && PyTuple_Check(val) && PyTuple_GET_SIZE(val) == 1) { + PyObject *key = PyTuple_GET_ITEM(val, 0); + if (!PyUnicode_Check(key)) { + PyErr_Format(PyExc_TypeError, + "%.200s%.200s keywords must be strings", + PyEval_GetFuncName(func), + PyEval_GetFuncDesc(func)); + } else { + PyErr_Format(PyExc_TypeError, + "%.200s%.200s got multiple " + "values for keyword argument '%U'", + PyEval_GetFuncName(func), + PyEval_GetFuncDesc(func), + key); + } + Py_XDECREF(exc); + Py_XDECREF(val); + Py_XDECREF(tb); + } + else { + PyErr_Restore(exc, val, tb); + } + } } static void From webhook-mailer at python.org Sat Jan 12 03:30:38 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Sat, 12 Jan 2019 08:30:38 -0000 Subject: [Python-checkins] bpo-35552: Fix reading past the end in PyUnicode_FromFormat() and PyBytes_FromFormat(). (GH-11276) Message-ID: https://github.com/python/cpython/commit/d586ccb04f79863c819b212ec5b9d873964078e4 commit: d586ccb04f79863c819b212ec5b9d873964078e4 branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-12T10:30:35+02:00 summary: bpo-35552: Fix reading past the end in PyUnicode_FromFormat() and PyBytes_FromFormat(). (GH-11276) Format characters "%s" and "%V" in PyUnicode_FromFormat() and "%s" in PyBytes_FromFormat() no longer read memory past the limit if precision is specified. files: A Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst M Objects/bytesobject.c M Objects/unicodeobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst b/Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst new file mode 100644 index 000000000000..dbc00bcd75e9 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst @@ -0,0 +1,3 @@ +Format characters ``%s`` and ``%V`` in :c:func:`PyUnicode_FromFormat` and +``%s`` in :c:func:`PyBytes_FromFormat` no longer read memory past the +limit if *precision* is specified. 
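
An illustrative sketch of the behaviour being pinned down, driven through ctypes the way Lib/test/test_bytes.py does: with an explicit precision only that many bytes of the argument are consulted, which is exactly the bound the fixed length scan now respects instead of running strlen() over the whole argument.

    from ctypes import pythonapi, py_object

    PyBytes_FromFormat = pythonapi.PyBytes_FromFormat
    PyBytes_FromFormat.restype = py_object

    print(PyBytes_FromFormat(b'%.3s', b'abcdef'))   # b'abc'
    print(PyBytes_FromFormat(b'%s', b'abcdef'))     # b'abcdef'
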
diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c index 40ef47144e52..b299d4871702 100644 --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -312,9 +312,15 @@ PyBytes_FromFormatV(const char *format, va_list vargs) Py_ssize_t i; p = va_arg(vargs, const char*); - i = strlen(p); - if (prec > 0 && i > prec) - i = prec; + if (prec <= 0) { + i = strlen(p); + } + else { + i = 0; + while (i < prec && p[i]) { + i++; + } + } s = _PyBytesWriter_WriteBytes(&writer, s, p, i); if (s == NULL) goto error; diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index 304ea7471f49..f1d23b66fa14 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -2578,9 +2578,15 @@ unicode_fromformat_write_cstr(_PyUnicodeWriter *writer, const char *str, PyObject *unicode; int res; - length = strlen(str); - if (precision != -1) - length = Py_MIN(length, precision); + if (precision == -1) { + length = strlen(str); + } + else { + length = 0; + while (length < precision && str[length]) { + length++; + } + } unicode = PyUnicode_DecodeUTF8Stateful(str, length, "replace", NULL); if (unicode == NULL) return -1; From webhook-mailer at python.org Sat Jan 12 03:52:59 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sat, 12 Jan 2019 08:52:59 -0000 Subject: [Python-checkins] bpo-35552: Fix reading past the end in PyUnicode_FromFormat() and PyBytes_FromFormat(). (GH-11276) Message-ID: https://github.com/python/cpython/commit/cbc7c2c791185ad44b4b3ede72309df5f252f4cb commit: cbc7c2c791185ad44b4b3ede72309df5f252f4cb branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-12T00:52:55-08:00 summary: bpo-35552: Fix reading past the end in PyUnicode_FromFormat() and PyBytes_FromFormat(). (GH-11276) Format characters "%s" and "%V" in PyUnicode_FromFormat() and "%s" in PyBytes_FromFormat() no longer read memory past the limit if precision is specified. (cherry picked from commit d586ccb04f79863c819b212ec5b9d873964078e4) Co-authored-by: Serhiy Storchaka files: A Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst M Objects/bytesobject.c M Objects/unicodeobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst b/Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst new file mode 100644 index 000000000000..dbc00bcd75e9 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst @@ -0,0 +1,3 @@ +Format characters ``%s`` and ``%V`` in :c:func:`PyUnicode_FromFormat` and +``%s`` in :c:func:`PyBytes_FromFormat` no longer read memory past the +limit if *precision* is specified. 
diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c index 5f9e1eccf2e4..172c7f38b9e2 100644 --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -311,9 +311,15 @@ PyBytes_FromFormatV(const char *format, va_list vargs) Py_ssize_t i; p = va_arg(vargs, const char*); - i = strlen(p); - if (prec > 0 && i > prec) - i = prec; + if (prec <= 0) { + i = strlen(p); + } + else { + i = 0; + while (i < prec && p[i]) { + i++; + } + } s = _PyBytesWriter_WriteBytes(&writer, s, p, i); if (s == NULL) goto error; diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index 35c8a24b7c0c..b67ffac4e9fb 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -2579,9 +2579,15 @@ unicode_fromformat_write_cstr(_PyUnicodeWriter *writer, const char *str, PyObject *unicode; int res; - length = strlen(str); - if (precision != -1) - length = Py_MIN(length, precision); + if (precision == -1) { + length = strlen(str); + } + else { + length = 0; + while (length < precision && str[length]) { + length++; + } + } unicode = PyUnicode_DecodeUTF8Stateful(str, length, "replace", NULL); if (unicode == NULL) return -1; From solipsis at pitrou.net Sat Jan 12 04:08:59 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 12 Jan 2019 09:08:59 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=4 Message-ID: <20190112090859.1.AEAFA89C49049404@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_functools leaked [0, 3, 1] memory blocks, sum=4 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogGxm_pC', '--timeout', '7200'] From webhook-mailer at python.org Sat Jan 12 04:21:01 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Sat, 12 Jan 2019 09:21:01 -0000 Subject: [Python-checkins] [2.7] bpo-35552: Fix reading past the end in PyString_FromFormat(). (GH-11276) (GH-11534) Message-ID: https://github.com/python/cpython/commit/555755ecff2669f4e020147d7d3a0aec71abb679 commit: 555755ecff2669f4e020147d7d3a0aec71abb679 branch: 2.7 author: Serhiy Storchaka committer: GitHub date: 2019-01-12T11:20:50+02:00 summary: [2.7] bpo-35552: Fix reading past the end in PyString_FromFormat(). (GH-11276) (GH-11534) Format character "%s" in PyString_FromFormat() no longer read memory past the limit if precision is specified. (cherry picked from commit d586ccb04f79863c819b212ec5b9d873964078e4) files: A Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst M Objects/stringobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst b/Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst new file mode 100644 index 000000000000..47ff76ac2624 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-12-21-13-29-30.bpo-35552.1DzQQc.rst @@ -0,0 +1,2 @@ +Format character ``%s`` in :c:func:`PyString_FromFormat` no longer read +memory past the limit if *precision* is specified. 
diff --git a/Objects/stringobject.c b/Objects/stringobject.c index efb0d1401b96..c47d32f4060f 100644 --- a/Objects/stringobject.c +++ b/Objects/stringobject.c @@ -360,9 +360,15 @@ PyString_FromFormatV(const char *format, va_list vargs) break; case 's': p = va_arg(vargs, char*); - i = strlen(p); - if (n > 0 && i > n) - i = n; + if (n <= 0) { + i = strlen(p); + } + else { + i = 0; + while (i < n && p[i]) { + i++; + } + } Py_MEMCPY(s, p, i); s += i; break; From webhook-mailer at python.org Sat Jan 12 12:21:58 2019 From: webhook-mailer at python.org (Tal Einat) Date: Sat, 12 Jan 2019 17:21:58 -0000 Subject: [Python-checkins] bpo-34512: Document platform-specific strftime() behavior for non-ASCII format strings (GH-8948) Message-ID: https://github.com/python/cpython/commit/1cffd0eed313011c0c2bb071c8affeb4a7ed05c7 commit: 1cffd0eed313011c0c2bb071c8affeb4a7ed05c7 branch: master author: Alexey Izbyshev committer: Tal Einat date: 2019-01-12T19:21:54+02:00 summary: bpo-34512: Document platform-specific strftime() behavior for non-ASCII format strings (GH-8948) files: M Doc/library/datetime.rst diff --git a/Doc/library/datetime.rst b/Doc/library/datetime.rst index db3a6522c24f..121f73bbe852 100644 --- a/Doc/library/datetime.rst +++ b/Doc/library/datetime.rst @@ -2034,6 +2034,12 @@ calls the platform C library's :func:`strftime` function, and platform variations are common. To see the full set of format codes supported on your platform, consult the :manpage:`strftime(3)` documentation. +For the same reason, handling of format strings containing Unicode code points +that can't be represented in the charset of the current locale is also +platform-dependent. On some platforms such code points are preserved intact in +the output, while on others ``strftime`` may raise :exc:`UnicodeError` or return +an empty string instead. + The following is a list of all the format codes that the C standard (1989 version) requires, and these work on all platforms with a standard C implementation. Note that the 1999 version of the C standard added additional From webhook-mailer at python.org Sat Jan 12 12:27:33 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sat, 12 Jan 2019 17:27:33 -0000 Subject: [Python-checkins] bpo-34512: Document platform-specific strftime() behavior for non-ASCII format strings (GH-8948) Message-ID: https://github.com/python/cpython/commit/678c5c07521caca809b1356d954975e6234c49ae commit: 678c5c07521caca809b1356d954975e6234c49ae branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-12T09:27:30-08:00 summary: bpo-34512: Document platform-specific strftime() behavior for non-ASCII format strings (GH-8948) (cherry picked from commit 1cffd0eed313011c0c2bb071c8affeb4a7ed05c7) Co-authored-by: Alexey Izbyshev files: M Doc/library/datetime.rst diff --git a/Doc/library/datetime.rst b/Doc/library/datetime.rst index db3a6522c24f..121f73bbe852 100644 --- a/Doc/library/datetime.rst +++ b/Doc/library/datetime.rst @@ -2034,6 +2034,12 @@ calls the platform C library's :func:`strftime` function, and platform variations are common. To see the full set of format codes supported on your platform, consult the :manpage:`strftime(3)` documentation. +For the same reason, handling of format strings containing Unicode code points +that can't be represented in the charset of the current locale is also +platform-dependent. 
On some platforms such code points are preserved intact in +the output, while on others ``strftime`` may raise :exc:`UnicodeError` or return +an empty string instead. + The following is a list of all the format codes that the C standard (1989 version) requires, and these work on all platforms with a standard C implementation. Note that the 1999 version of the C standard added additional From webhook-mailer at python.org Sat Jan 12 12:28:10 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sat, 12 Jan 2019 17:28:10 -0000 Subject: [Python-checkins] bpo-34512: Document platform-specific strftime() behavior for non-ASCII format strings (GH-8948) Message-ID: https://github.com/python/cpython/commit/77b80c956f39df34722bd8646cf5b83d149832c4 commit: 77b80c956f39df34722bd8646cf5b83d149832c4 branch: 2.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-12T09:28:06-08:00 summary: bpo-34512: Document platform-specific strftime() behavior for non-ASCII format strings (GH-8948) (cherry picked from commit 1cffd0eed313011c0c2bb071c8affeb4a7ed05c7) Co-authored-by: Alexey Izbyshev files: M Doc/library/datetime.rst diff --git a/Doc/library/datetime.rst b/Doc/library/datetime.rst index 1e575d5d572e..2d164b201a46 100644 --- a/Doc/library/datetime.rst +++ b/Doc/library/datetime.rst @@ -1611,6 +1611,12 @@ calls the platform C library's :func:`strftime` function, and platform variations are common. To see the full set of format codes supported on your platform, consult the :manpage:`strftime(3)` documentation. +For the same reason, handling of format strings containing Unicode code points +that can't be represented in the charset of the current locale is also +platform-dependent. On some platforms such code points are preserved intact in +the output, while on others ``strftime`` may raise :exc:`UnicodeError` or return +an empty string instead. + The following is a list of all the format codes that the C standard (1989 version) requires, and these work on all platforms with a standard C implementation. 
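As an aside on the documentation text added above, the platform dependence looks roughly like this from Python; which of the three documented outcomes occurs (output preserved intact, UnicodeError raised, or an empty string returned) depends on the platform and the active locale, so none of them is guaranteed:

    from datetime import date

    fmt = "%Y\u5e74%m\u6708%d\u65e5"   # "YYYY年MM月DD日": non-ASCII code points
    try:
        out = date(2019, 1, 12).strftime(fmt)
    except UnicodeError:
        print("this platform rejects the format")          # one documented outcome
    else:
        if out == "":
            print("this platform returns an empty string") # another documented outcome
        else:
            print(out)   # code points preserved intact, e.g. 2019年01月12日
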
Note that the 1999 version of the C standard added additional From webhook-mailer at python.org Sat Jan 12 23:05:23 2019 From: webhook-mailer at python.org (INADA Naoki) Date: Sun, 13 Jan 2019 04:05:23 -0000 Subject: [Python-checkins] bpo-16806: Fix `lineno` and `col_offset` for multi-line string tokens (GH-10021) Message-ID: https://github.com/python/cpython/commit/995d9b92979768125ced4da3a56f755bcdf80f6e commit: 995d9b92979768125ced4da3a56f755bcdf80f6e branch: master author: Anthony Sottile committer: INADA Naoki date: 2019-01-13T13:05:13+09:00 summary: bpo-16806: Fix `lineno` and `col_offset` for multi-line string tokens (GH-10021) files: A Misc/NEWS.d/next/Core and Builtins/2018-10-20-18-05-58.bpo-16806.zr3A9N.rst M Lib/test/test_ast.py M Lib/test/test_fstring.py M Lib/test/test_opcodes.py M Lib/test/test_string_literals.py M Misc/ACKS M Parser/parsetok.c M Parser/tokenizer.c M Parser/tokenizer.h M Python/ast.c M Python/importlib.h M Python/importlib_external.h M Python/importlib_zipimport.h diff --git a/Lib/test/test_ast.py b/Lib/test/test_ast.py index db9a6caf42f1..2c8d8ab7e3fe 100644 --- a/Lib/test/test_ast.py +++ b/Lib/test/test_ast.py @@ -683,6 +683,25 @@ def test_get_docstring_none(self): node = ast.parse('async def foo():\n x = "not docstring"') self.assertIsNone(ast.get_docstring(node.body[0])) + def test_multi_line_docstring_col_offset_and_lineno_issue16806(self): + node = ast.parse( + '"""line one\nline two"""\n\n' + 'def foo():\n """line one\n line two"""\n\n' + ' def bar():\n """line one\n line two"""\n' + ' """line one\n line two"""\n' + '"""line one\nline two"""\n\n' + ) + self.assertEqual(node.body[0].col_offset, 0) + self.assertEqual(node.body[0].lineno, 1) + self.assertEqual(node.body[1].body[0].col_offset, 2) + self.assertEqual(node.body[1].body[0].lineno, 5) + self.assertEqual(node.body[1].body[1].body[0].col_offset, 4) + self.assertEqual(node.body[1].body[1].body[0].lineno, 9) + self.assertEqual(node.body[1].body[2].col_offset, 2) + self.assertEqual(node.body[1].body[2].lineno, 11) + self.assertEqual(node.body[2].col_offset, 0) + self.assertEqual(node.body[2].lineno, 13) + def test_literal_eval(self): self.assertEqual(ast.literal_eval('[1, 2, 3]'), [1, 2, 3]) self.assertEqual(ast.literal_eval('{"foo": 42}'), {"foo": 42}) diff --git a/Lib/test/test_fstring.py b/Lib/test/test_fstring.py index 9e45770f80b8..9d60be3a29a1 100644 --- a/Lib/test/test_fstring.py +++ b/Lib/test/test_fstring.py @@ -270,10 +270,7 @@ def test_ast_line_numbers_duplicate_expression(self): self.assertEqual(binop.right.col_offset, 7) # FIXME: this is wrong def test_ast_line_numbers_multiline_fstring(self): - # FIXME: This test demonstrates invalid behavior due to JoinedStr's - # immediate child nodes containing the wrong lineno. The enclosed - # expressions have valid line information and column offsets. - # See bpo-16806 and bpo-30465 for details. + # See bpo-30465 for details. expr = """ a = 10 f''' @@ -298,19 +295,16 @@ def test_ast_line_numbers_multiline_fstring(self): self.assertEqual(type(t.body[1].value.values[1]), ast.FormattedValue) self.assertEqual(type(t.body[1].value.values[2]), ast.Constant) self.assertEqual(type(t.body[1].value.values[2].value), str) - # NOTE: the following invalid behavior is described in bpo-16806. 
- # - line number should be the *first* line (3), not the *last* (8) - # - column offset should not be -1 - self.assertEqual(t.body[1].lineno, 8) - self.assertEqual(t.body[1].value.lineno, 8) - self.assertEqual(t.body[1].value.values[0].lineno, 8) - self.assertEqual(t.body[1].value.values[1].lineno, 8) - self.assertEqual(t.body[1].value.values[2].lineno, 8) - self.assertEqual(t.body[1].col_offset, -1) - self.assertEqual(t.body[1].value.col_offset, -1) - self.assertEqual(t.body[1].value.values[0].col_offset, -1) - self.assertEqual(t.body[1].value.values[1].col_offset, -1) - self.assertEqual(t.body[1].value.values[2].col_offset, -1) + self.assertEqual(t.body[1].lineno, 3) + self.assertEqual(t.body[1].value.lineno, 3) + self.assertEqual(t.body[1].value.values[0].lineno, 3) + self.assertEqual(t.body[1].value.values[1].lineno, 3) + self.assertEqual(t.body[1].value.values[2].lineno, 3) + self.assertEqual(t.body[1].col_offset, 0) + self.assertEqual(t.body[1].value.col_offset, 0) + self.assertEqual(t.body[1].value.values[0].col_offset, 0) + self.assertEqual(t.body[1].value.values[1].col_offset, 0) + self.assertEqual(t.body[1].value.values[2].col_offset, 0) # NOTE: the following lineno information and col_offset is correct for # expressions within FormattedValues. binop = t.body[1].value.values[1].value @@ -321,8 +315,8 @@ def test_ast_line_numbers_multiline_fstring(self): self.assertEqual(binop.lineno, 4) self.assertEqual(binop.left.lineno, 4) self.assertEqual(binop.right.lineno, 6) - self.assertEqual(binop.col_offset, 3) - self.assertEqual(binop.left.col_offset, 3) + self.assertEqual(binop.col_offset, 4) + self.assertEqual(binop.left.col_offset, 4) self.assertEqual(binop.right.col_offset, 7) def test_docstring(self): diff --git a/Lib/test/test_opcodes.py b/Lib/test/test_opcodes.py index b2a22861880f..527aca664d38 100644 --- a/Lib/test/test_opcodes.py +++ b/Lib/test/test_opcodes.py @@ -27,7 +27,7 @@ def test_setup_annotations_line(self): with open(ann_module.__file__) as f: txt = f.read() co = compile(txt, ann_module.__file__, 'exec') - self.assertEqual(co.co_firstlineno, 6) + self.assertEqual(co.co_firstlineno, 3) except OSError: pass diff --git a/Lib/test/test_string_literals.py b/Lib/test/test_string_literals.py index 55bcde4c43fb..635ba57e6c95 100644 --- a/Lib/test/test_string_literals.py +++ b/Lib/test/test_string_literals.py @@ -117,7 +117,7 @@ def test_eval_str_invalid_escape(self): eval("'''\n\\z'''") self.assertEqual(len(w), 1) self.assertEqual(w[0].filename, '') - self.assertEqual(w[0].lineno, 2) + self.assertEqual(w[0].lineno, 1) with warnings.catch_warnings(record=True) as w: warnings.simplefilter('error', category=SyntaxWarning) @@ -126,7 +126,7 @@ def test_eval_str_invalid_escape(self): exc = cm.exception self.assertEqual(w, []) self.assertEqual(exc.filename, '') - self.assertEqual(exc.lineno, 2) + self.assertEqual(exc.lineno, 1) def test_eval_str_raw(self): self.assertEqual(eval(""" r'x' """), 'x') @@ -166,7 +166,7 @@ def test_eval_bytes_invalid_escape(self): eval("b'''\n\\z'''") self.assertEqual(len(w), 1) self.assertEqual(w[0].filename, '') - self.assertEqual(w[0].lineno, 2) + self.assertEqual(w[0].lineno, 1) with warnings.catch_warnings(record=True) as w: warnings.simplefilter('error', category=SyntaxWarning) @@ -175,7 +175,7 @@ def test_eval_bytes_invalid_escape(self): exc = cm.exception self.assertEqual(w, []) self.assertEqual(exc.filename, '') - self.assertEqual(exc.lineno, 2) + self.assertEqual(exc.lineno, 1) def test_eval_bytes_raw(self): self.assertEqual(eval(""" br'x' 
"""), b'x') diff --git a/Misc/ACKS b/Misc/ACKS index 49b28153d3f7..81b51f751991 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1845,3 +1845,4 @@ Gennadiy Zlobin Doug Zongker Peter ?strand Zheao Li +Carsten Klein diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-10-20-18-05-58.bpo-16806.zr3A9N.rst b/Misc/NEWS.d/next/Core and Builtins/2018-10-20-18-05-58.bpo-16806.zr3A9N.rst new file mode 100644 index 000000000000..1cdddeb5c83e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-10-20-18-05-58.bpo-16806.zr3A9N.rst @@ -0,0 +1 @@ +Fix ``lineno`` and ``col_offset`` for multi-line string tokens. diff --git a/Parser/parsetok.c b/Parser/parsetok.c index fc878d89d563..d37e28a0a36c 100644 --- a/Parser/parsetok.c +++ b/Parser/parsetok.c @@ -205,6 +205,8 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, size_t len; char *str; col_offset = -1; + int lineno; + const char *line_start; type = PyTokenizer_Get(tok, &a, &b); if (type == ERRORTOKEN) { @@ -253,8 +255,15 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, } } #endif - if (a != NULL && a >= tok->line_start) { - col_offset = Py_SAFE_DOWNCAST(a - tok->line_start, + + /* Nodes of type STRING, especially multi line strings + must be handled differently in order to get both + the starting line number and the column offset right. + (cf. issue 16806) */ + lineno = type == STRING ? tok->first_lineno : tok->lineno; + line_start = type == STRING ? tok->multi_line_start : tok->line_start; + if (a != NULL && a >= line_start) { + col_offset = Py_SAFE_DOWNCAST(a - line_start, intptr_t, int); } else { @@ -263,7 +272,7 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, if ((err_ret->error = PyParser_AddToken(ps, (int)type, str, - tok->lineno, col_offset, + lineno, col_offset, &(err_ret->expected))) != E_OK) { if (err_ret->error != E_DONE) { PyObject_FREE(str); diff --git a/Parser/tokenizer.c b/Parser/tokenizer.c index 0e6c1a85e035..3e3cf2cd7f58 100644 --- a/Parser/tokenizer.c +++ b/Parser/tokenizer.c @@ -1519,6 +1519,13 @@ tok_get(struct tok_state *tok, char **p_start, char **p_end) int quote_size = 1; /* 1 or 3 */ int end_quote_size = 0; + /* Nodes of type STRING, especially multi line strings + must be handled differently in order to get both + the starting line number and the column offset right. + (cf. issue 16806) */ + tok->first_lineno = tok->lineno; + tok->multi_line_start = tok->line_start; + /* Find the quote size and start of string */ c = tok_nextc(tok); if (c == quote) { diff --git a/Parser/tokenizer.h b/Parser/tokenizer.h index cd18d25dc192..096ce687ec54 100644 --- a/Parser/tokenizer.h +++ b/Parser/tokenizer.h @@ -38,6 +38,8 @@ struct tok_state { int pendin; /* Pending indents (if > 0) or dedents (if < 0) */ const char *prompt, *nextprompt; /* For interactive prompting */ int lineno; /* Current line number */ + int first_lineno; /* First line of a single line or multi line string + expression (cf. issue 16806) */ int level; /* () [] {} Parentheses nesting level */ /* Used to allow free continuations inside them */ #ifndef PGEN @@ -58,6 +60,9 @@ struct tok_state { char *encoding; /* Source encoding. */ int cont_line; /* whether we are in a continuation line. */ const char* line_start; /* pointer to start of current line */ + const char* multi_line_start; /* pointer to start of first line of + a single line or multi line string + expression (cf. 
issue 16806) */ #ifndef PGEN PyObject *decoding_readline; /* open(...).readline */ PyObject *decoding_buffer; diff --git a/Python/ast.c b/Python/ast.c index 69dfe3c3c435..d71f44a6dbe2 100644 --- a/Python/ast.c +++ b/Python/ast.c @@ -4284,9 +4284,13 @@ fstring_fix_node_location(const node *parent, node *n, char *expr_str) start--; } cols += (int)(substr - start); - /* Fix lineno in mulitline strings. */ - while ((substr = strchr(substr + 1, '\n'))) - lines--; + /* adjust the start based on the number of newlines encountered + before the f-string expression */ + for (char* p = parent->n_str; p < substr; p++) { + if (*p == '\n') { + lines++; + } + } } } fstring_shift_node_locations(n, lines, cols); diff --git a/Python/importlib.h b/Python/importlib.h index 5c38196c7c85..dd78c0b9d2b9 100644 --- a/Python/importlib.h +++ b/Python/importlib.h @@ -204,7 +204,7 @@ const unsigned char _Py_M__importlib_bootstrap[] = { 0,0,114,38,0,0,0,114,39,0,0,0,114,48,0,0, 0,114,10,0,0,0,114,10,0,0,0,114,10,0,0,0, 114,11,0,0,0,114,20,0,0,0,52,0,0,0,115,12, - 0,0,0,8,4,4,2,8,8,8,12,8,25,8,13,114, + 0,0,0,8,1,4,5,8,8,8,12,8,25,8,13,114, 20,0,0,0,99,0,0,0,0,0,0,0,0,0,0,0, 0,2,0,0,0,64,0,0,0,115,48,0,0,0,101,0, 90,1,100,0,90,2,100,1,90,3,100,2,100,3,132,0, @@ -255,8 +255,8 @@ const unsigned char _Py_M__importlib_bootstrap[] = { 0,0,0,114,3,0,0,0,114,31,0,0,0,114,38,0, 0,0,114,39,0,0,0,114,48,0,0,0,114,10,0,0, 0,114,10,0,0,0,114,10,0,0,0,114,11,0,0,0, - 114,49,0,0,0,120,0,0,0,115,10,0,0,0,8,2, - 4,2,8,4,8,4,8,5,114,49,0,0,0,99,0,0, + 114,49,0,0,0,120,0,0,0,115,10,0,0,0,8,1, + 4,3,8,4,8,4,8,5,114,49,0,0,0,99,0,0, 0,0,0,0,0,0,0,0,0,0,2,0,0,0,64,0, 0,0,115,36,0,0,0,101,0,90,1,100,0,90,2,100, 1,100,2,132,0,90,3,100,3,100,4,132,0,90,4,100, @@ -730,7 +730,7 @@ const unsigned char _Py_M__importlib_bootstrap[] = { 114,123,0,0,0,218,6,115,101,116,116,101,114,114,130,0, 0,0,114,124,0,0,0,114,10,0,0,0,114,10,0,0, 0,114,10,0,0,0,114,11,0,0,0,114,112,0,0,0, - 49,1,0,0,115,32,0,0,0,8,35,4,2,4,1,2, + 49,1,0,0,115,32,0,0,0,8,1,4,36,4,1,2, 255,12,12,8,10,8,12,2,1,10,8,4,1,10,3,2, 1,10,7,2,1,10,3,4,1,114,112,0,0,0,169,2, 114,113,0,0,0,114,115,0,0,0,99,2,0,0,0,2, @@ -1147,7 +1147,7 @@ const unsigned char _Py_M__importlib_bootstrap[] = { 0,0,0,114,169,0,0,0,114,115,0,0,0,114,97,0, 0,0,114,153,0,0,0,114,10,0,0,0,114,10,0,0, 0,114,10,0,0,0,114,11,0,0,0,114,158,0,0,0, - 195,2,0,0,115,42,0,0,0,8,7,4,2,2,1,10, + 195,2,0,0,115,42,0,0,0,8,2,4,7,2,1,10, 8,2,1,12,8,2,1,12,11,2,1,10,7,2,1,10, 4,2,1,2,1,12,4,2,1,2,1,12,4,2,1,2, 1,12,4,114,158,0,0,0,99,0,0,0,0,0,0,0, @@ -1280,7 +1280,7 @@ const unsigned char _Py_M__importlib_bootstrap[] = { 0,114,90,0,0,0,114,168,0,0,0,114,169,0,0,0, 114,115,0,0,0,114,10,0,0,0,114,10,0,0,0,114, 10,0,0,0,114,11,0,0,0,114,172,0,0,0,12,3, - 0,0,115,44,0,0,0,8,7,4,2,2,1,10,8,2, + 0,0,115,44,0,0,0,8,2,4,7,2,1,10,8,2, 1,12,6,2,1,12,8,2,1,10,3,2,1,10,8,2, 1,10,8,2,1,2,1,12,4,2,1,2,1,12,4,2, 1,2,1,114,172,0,0,0,99,0,0,0,0,0,0,0, @@ -1757,8 +1757,8 @@ const unsigned char _Py_M__importlib_bootstrap[] = { 0,114,215,0,0,0,114,218,0,0,0,114,219,0,0,0, 114,223,0,0,0,114,224,0,0,0,114,226,0,0,0,114, 10,0,0,0,114,10,0,0,0,114,10,0,0,0,114,11, - 0,0,0,218,8,60,109,111,100,117,108,101,62,8,0,0, - 0,115,94,0,0,0,4,17,4,2,8,8,8,8,4,2, + 0,0,0,218,8,60,109,111,100,117,108,101,62,1,0,0, + 0,115,94,0,0,0,4,24,4,2,8,8,8,8,4,2, 4,3,16,4,14,68,14,21,14,16,8,37,8,17,8,11, 14,8,8,11,8,12,8,16,8,36,14,101,16,26,10,45, 14,72,8,17,8,17,8,30,8,37,8,42,8,15,14,73, diff --git a/Python/importlib_external.h b/Python/importlib_external.h index 
791bfc413221..ca83eb575c20 100644 --- a/Python/importlib_external.h +++ b/Python/importlib_external.h @@ -1137,8 +1137,8 @@ const unsigned char _Py_M__importlib_bootstrap_external[] = { 0,0,114,127,0,0,0,114,182,0,0,0,114,212,0,0, 0,114,217,0,0,0,114,220,0,0,0,114,3,0,0,0, 114,3,0,0,0,114,3,0,0,0,114,6,0,0,0,114, - 208,0,0,0,243,2,0,0,115,10,0,0,0,8,3,4, - 2,8,8,8,3,8,8,114,208,0,0,0,99,0,0,0, + 208,0,0,0,243,2,0,0,115,10,0,0,0,8,2,4, + 3,8,8,8,3,8,8,114,208,0,0,0,99,0,0,0, 0,0,0,0,0,0,0,0,0,3,0,0,0,64,0,0, 0,115,74,0,0,0,101,0,90,1,100,0,90,2,100,1, 100,2,132,0,90,3,100,3,100,4,132,0,90,4,100,5, @@ -1236,7 +1236,7 @@ const unsigned char _Py_M__importlib_bootstrap_external[] = { 32,32,32,78,114,3,0,0,0,41,3,114,119,0,0,0, 114,44,0,0,0,114,26,0,0,0,114,3,0,0,0,114, 3,0,0,0,114,6,0,0,0,114,225,0,0,0,50,3, - 0,0,115,2,0,0,0,0,4,122,21,83,111,117,114,99, + 0,0,115,2,0,0,0,0,1,122,21,83,111,117,114,99, 101,76,111,97,100,101,114,46,115,101,116,95,100,97,116,97, 99,2,0,0,0,0,0,0,0,5,0,0,0,10,0,0, 0,67,0,0,0,115,82,0,0,0,124,0,160,0,124,1, @@ -1520,7 +1520,7 @@ const unsigned char _Py_M__importlib_bootstrap_external[] = { 1,0,0,90,13,95,95,99,108,97,115,115,99,101,108,108, 95,95,114,3,0,0,0,114,3,0,0,0,114,249,0,0, 0,114,6,0,0,0,114,239,0,0,0,160,3,0,0,115, - 30,0,0,0,8,3,4,2,8,6,8,4,8,3,2,1, + 30,0,0,0,8,2,4,3,8,6,8,4,8,3,2,1, 14,11,2,1,10,4,8,7,2,1,10,5,8,4,8,6, 8,6,114,239,0,0,0,99,0,0,0,0,0,0,0,0, 0,0,0,0,3,0,0,0,64,0,0,0,115,46,0,0, @@ -1768,7 +1768,7 @@ const unsigned char _Py_M__importlib_bootstrap_external[] = { 114,213,0,0,0,114,229,0,0,0,114,136,0,0,0,114, 179,0,0,0,114,3,0,0,0,114,3,0,0,0,114,3, 0,0,0,114,6,0,0,0,114,15,1,0,0,46,4,0, - 0,115,22,0,0,0,8,6,4,2,8,4,8,4,8,3, + 0,115,22,0,0,0,8,2,4,6,8,4,8,4,8,3, 8,8,8,6,8,6,8,4,8,4,2,1,114,15,1,0, 0,99,0,0,0,0,0,0,0,0,0,0,0,0,2,0, 0,0,64,0,0,0,115,96,0,0,0,101,0,90,1,100, @@ -1913,7 +1913,7 @@ const unsigned char _Py_M__importlib_bootstrap_external[] = { 114,36,1,0,0,114,37,1,0,0,114,40,1,0,0,114, 186,0,0,0,114,3,0,0,0,114,3,0,0,0,114,3, 0,0,0,114,6,0,0,0,114,22,1,0,0,99,4,0, - 0,115,22,0,0,0,8,5,4,2,8,6,8,10,8,4, + 0,115,22,0,0,0,8,1,4,6,8,6,8,10,8,4, 8,13,8,3,8,3,8,3,8,3,8,3,114,22,1,0, 0,99,0,0,0,0,0,0,0,0,0,0,0,0,3,0, 0,0,64,0,0,0,115,80,0,0,0,101,0,90,1,100, @@ -2462,7 +2462,7 @@ const unsigned char _Py_M__importlib_bootstrap_external[] = { 0,114,65,1,0,0,114,207,0,0,0,114,72,1,0,0, 114,37,1,0,0,114,3,0,0,0,114,3,0,0,0,114, 3,0,0,0,114,6,0,0,0,114,57,1,0,0,79,5, - 0,0,115,22,0,0,0,8,7,4,2,8,14,8,4,4, + 0,0,115,22,0,0,0,8,2,4,7,8,14,8,4,4, 2,8,12,8,5,10,48,8,31,2,1,10,17,114,57,1, 0,0,99,4,0,0,0,0,0,0,0,6,0,0,0,8, 0,0,0,67,0,0,0,115,146,0,0,0,124,0,160,0, @@ -2641,8 +2641,8 @@ const unsigned char _Py_M__importlib_bootstrap_external[] = { 0,114,57,1,0,0,114,77,1,0,0,114,184,0,0,0, 114,85,1,0,0,114,87,1,0,0,114,3,0,0,0,114, 3,0,0,0,114,3,0,0,0,114,6,0,0,0,218,8, - 60,109,111,100,117,108,101,62,8,0,0,0,115,126,0,0, - 0,4,15,4,1,4,1,2,1,2,255,4,4,8,17,8, + 60,109,111,100,117,108,101,62,1,0,0,0,115,126,0,0, + 0,4,22,4,1,4,1,2,1,2,255,4,4,8,17,8, 5,8,5,8,6,8,6,8,12,8,10,8,9,8,5,8, 7,8,9,12,22,10,127,0,7,16,1,12,2,4,1,4, 2,6,2,6,2,8,2,18,71,8,40,8,19,8,12,8, diff --git a/Python/importlib_zipimport.h b/Python/importlib_zipimport.h index e00010c11fc9..299d1c5653b3 100644 --- a/Python/importlib_zipimport.h +++ b/Python/importlib_zipimport.h @@ -484,7 +484,7 @@ const unsigned char _Py_M__zipimport[] = { 64,0,0,0,114,65,0,0,0,114,78,0,0,0,114,82, 0,0,0,114,83,0,0,0,114,9,0,0,0,114,9,0, 0,0,114,9,0,0,0,114,10,0,0,0,114,4,0,0, - 
0,45,0,0,0,115,24,0,0,0,8,13,4,5,8,46, + 0,45,0,0,0,115,24,0,0,0,8,1,4,17,8,46, 10,32,10,12,8,10,8,21,8,11,8,26,8,13,8,38, 8,18,122,12,95,95,105,110,105,116,95,95,46,112,121,99, 84,114,60,0,0,0,70,41,3,122,4,46,112,121,99,84, @@ -1044,7 +1044,7 @@ const unsigned char _Py_M__zipimport[] = { 34,0,0,0,114,182,0,0,0,114,183,0,0,0,114,184, 0,0,0,114,189,0,0,0,114,9,0,0,0,114,9,0, 0,0,114,9,0,0,0,114,10,0,0,0,114,80,0,0, - 0,212,2,0,0,115,14,0,0,0,8,5,4,1,4,2, + 0,212,2,0,0,115,14,0,0,0,8,1,4,5,4,2, 8,4,8,9,8,6,8,11,114,80,0,0,0,41,45,114, 84,0,0,0,90,26,95,102,114,111,122,101,110,95,105,109, 112,111,114,116,108,105,98,95,101,120,116,101,114,110,97,108, @@ -1065,8 +1065,8 @@ const unsigned char _Py_M__zipimport[] = { 0,0,114,170,0,0,0,114,152,0,0,0,114,150,0,0, 0,114,44,0,0,0,114,80,0,0,0,114,9,0,0,0, 114,9,0,0,0,114,9,0,0,0,114,10,0,0,0,218, - 8,60,109,111,100,117,108,101,62,13,0,0,0,115,88,0, - 0,0,4,4,8,1,16,1,8,1,8,1,8,1,8,1, + 8,60,109,111,100,117,108,101,62,1,0,0,0,115,88,0, + 0,0,4,16,8,1,16,1,8,1,8,1,8,1,8,1, 8,1,8,2,8,3,6,1,14,3,16,4,4,2,8,2, 4,1,4,1,4,2,14,127,0,127,0,1,12,1,12,1, 2,1,2,252,4,9,8,4,8,9,8,31,8,126,2,254, From solipsis at pitrou.net Sun Jan 13 04:13:48 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 13 Jan 2019 09:13:48 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=-1 Message-ID: <20190113091348.1.D71BCF295FABEEA7@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [-7, 1, 0] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [1, 0, -2] memory blocks, sum=-1 test_multiprocessing_forkserver leaked [-2, 1, 1] memory blocks, sum=0 test_multiprocessing_spawn leaked [0, 2, 0] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogSRlBw0', '--timeout', '7200'] From webhook-mailer at python.org Sun Jan 13 10:02:04 2019 From: webhook-mailer at python.org (Tal Einat) Date: Sun, 13 Jan 2019 15:02:04 -0000 Subject: [Python-checkins] bpo-35196: Optimize Squeezer's write() interception (GH-10454) Message-ID: https://github.com/python/cpython/commit/39a33e99270848d34628cdbb1fdb727f9ede502a commit: 39a33e99270848d34628cdbb1fdb727f9ede502a branch: master author: Tal Einat committer: GitHub date: 2019-01-13T17:01:50+02:00 summary: bpo-35196: Optimize Squeezer's write() interception (GH-10454) The new functionality of Squeezer.reload() is also tested, along with some general re-working of the tests in test_squeezer.py. files: A Misc/NEWS.d/next/IDLE/2018-12-27-17-46-42.bpo-35196.9E-xUh.rst M Lib/idlelib/NEWS.txt M Lib/idlelib/editor.py M Lib/idlelib/idle_test/test_squeezer.py M Lib/idlelib/pyshell.py M Lib/idlelib/squeezer.py diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index 3d93e91d3147..d1748a21bce4 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -3,6 +3,8 @@ Released on 2019-10-20? ====================================== +bpo-35196: Speed up squeezer line counting. + bpo-35208: Squeezer now counts wrapped lines before newlines. bpo-35555: Gray out Code Context menu entry when it's not applicable. 
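Before wading through the editor.py, test_squeezer.py and squeezer.py hunks that follow: after this change count_lines_with_wrapping() takes only the text and a line width (the tab width is fixed at 8, as in Shell), and the updated tests pin down counts like the ones below. A quick sketch of its use, assuming an IDLE source tree that already contains this commit:

    from idlelib.squeezer import count_lines_with_wrapping

    # 200 chars wrap onto 3 lines at width 80 (80 + 80 + 40).
    assert count_lines_with_wrapping('a' * 200, 80) == 3
    # Tabs expand to the next multiple of 8 before wrapping is counted.
    assert count_lines_with_wrapping('aa\t' * 20, 80) == 2
    # A trailing newline does not add an extra (empty) line.
    assert count_lines_with_wrapping('1\n2\n3\n', 80) == 3
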
diff --git a/Lib/idlelib/editor.py b/Lib/idlelib/editor.py index f4437668a3ed..d13ac3786da4 100644 --- a/Lib/idlelib/editor.py +++ b/Lib/idlelib/editor.py @@ -317,9 +317,6 @@ def __init__(self, flist=None, filename=None, key=None, root=None): text.bind("<>", self.ZoomHeight(self).zoom_height_event) text.bind("<>", self.CodeContext(self).toggle_code_context_event) - squeezer = self.Squeezer(self) - text.bind("<>", - squeezer.squeeze_current_text_event) def _filename_to_unicode(self, filename): """Return filename as BMP unicode so diplayable in Tk.""" diff --git a/Lib/idlelib/idle_test/test_squeezer.py b/Lib/idlelib/idle_test/test_squeezer.py index da2c2dd50c65..7c28a107a90f 100644 --- a/Lib/idlelib/idle_test/test_squeezer.py +++ b/Lib/idlelib/idle_test/test_squeezer.py @@ -1,3 +1,5 @@ +"Test squeezer, coverage 95%" + from collections import namedtuple from textwrap import dedent from tkinter import Text, Tk @@ -33,10 +35,10 @@ def cleanup_root(): class CountLinesTest(unittest.TestCase): """Tests for the count_lines_with_wrapping function.""" - def check(self, expected, text, linewidth, tabwidth): + def check(self, expected, text, linewidth): return self.assertEqual( expected, - count_lines_with_wrapping(text, linewidth, tabwidth), + count_lines_with_wrapping(text, linewidth), ) def test_count_empty(self): @@ -55,37 +57,14 @@ def test_count_several_lines(self): """Test with several lines of text.""" self.assertEqual(count_lines_with_wrapping("1\n2\n3\n"), 3) - def test_tab_width(self): - """Test with various tab widths and line widths.""" - self.check(expected=1, text='\t' * 1, linewidth=8, tabwidth=4) - self.check(expected=1, text='\t' * 2, linewidth=8, tabwidth=4) - self.check(expected=2, text='\t' * 3, linewidth=8, tabwidth=4) - self.check(expected=2, text='\t' * 4, linewidth=8, tabwidth=4) - self.check(expected=3, text='\t' * 5, linewidth=8, tabwidth=4) - - # test longer lines and various tab widths - self.check(expected=4, text='\t' * 10, linewidth=12, tabwidth=4) - self.check(expected=10, text='\t' * 10, linewidth=12, tabwidth=8) - self.check(expected=2, text='\t' * 4, linewidth=10, tabwidth=3) - - # test tabwidth=1 - self.check(expected=2, text='\t' * 9, linewidth=5, tabwidth=1) - self.check(expected=2, text='\t' * 10, linewidth=5, tabwidth=1) - self.check(expected=3, text='\t' * 11, linewidth=5, tabwidth=1) - - # test for off-by-one errors - self.check(expected=2, text='\t' * 6, linewidth=12, tabwidth=4) - self.check(expected=3, text='\t' * 6, linewidth=11, tabwidth=4) - self.check(expected=2, text='\t' * 6, linewidth=13, tabwidth=4) - def test_empty_lines(self): - self.check(expected=1, text='\n', linewidth=80, tabwidth=8) - self.check(expected=2, text='\n\n', linewidth=80, tabwidth=8) - self.check(expected=10, text='\n' * 10, linewidth=80, tabwidth=8) + self.check(expected=1, text='\n', linewidth=80) + self.check(expected=2, text='\n\n', linewidth=80) + self.check(expected=10, text='\n' * 10, linewidth=80) def test_long_line(self): - self.check(expected=3, text='a' * 200, linewidth=80, tabwidth=8) - self.check(expected=3, text='a' * 200 + '\n', linewidth=80, tabwidth=8) + self.check(expected=3, text='a' * 200, linewidth=80) + self.check(expected=3, text='a' * 200 + '\n', linewidth=80) def test_several_lines_different_lengths(self): text = dedent("""\ @@ -94,82 +73,78 @@ def test_several_lines_different_lengths(self): 7 chars 13 characters""") - self.check(expected=5, text=text, linewidth=80, tabwidth=8) - self.check(expected=5, text=text + '\n', linewidth=80, tabwidth=8) - 
self.check(expected=6, text=text, linewidth=40, tabwidth=8) - self.check(expected=7, text=text, linewidth=20, tabwidth=8) - self.check(expected=11, text=text, linewidth=10, tabwidth=8) + self.check(expected=5, text=text, linewidth=80) + self.check(expected=5, text=text + '\n', linewidth=80) + self.check(expected=6, text=text, linewidth=40) + self.check(expected=7, text=text, linewidth=20) + self.check(expected=11, text=text, linewidth=10) class SqueezerTest(unittest.TestCase): """Tests for the Squeezer class.""" - def make_mock_editor_window(self): + def tearDown(self): + # Clean up the Squeezer class's reference to its instance, + # to avoid side-effects from one test case upon another. + if Squeezer._instance_weakref is not None: + Squeezer._instance_weakref = None + + def make_mock_editor_window(self, with_text_widget=False): """Create a mock EditorWindow instance.""" editwin = NonCallableMagicMock() # isinstance(editwin, PyShell) must be true for Squeezer to enable - # auto-squeezing; in practice this will always be true + # auto-squeezing; in practice this will always be true. editwin.__class__ = PyShell + + if with_text_widget: + editwin.root = get_test_tk_root(self) + text_widget = self.make_text_widget(root=editwin.root) + editwin.text = editwin.per.bottom = text_widget + return editwin def make_squeezer_instance(self, editor_window=None): """Create an actual Squeezer instance with a mock EditorWindow.""" if editor_window is None: editor_window = self.make_mock_editor_window() - return Squeezer(editor_window) + squeezer = Squeezer(editor_window) + squeezer.get_line_width = Mock(return_value=80) + return squeezer + + def make_text_widget(self, root=None): + if root is None: + root = get_test_tk_root(self) + text_widget = Text(root) + text_widget["font"] = ('Courier', 10) + text_widget.mark_set("iomark", "1.0") + return text_widget + + def set_idleconf_option_with_cleanup(self, configType, section, option, value): + prev_val = idleConf.GetOption(configType, section, option) + idleConf.SetOption(configType, section, option, value) + self.addCleanup(idleConf.SetOption, + configType, section, option, prev_val) def test_count_lines(self): - """Test Squeezer.count_lines() with various inputs. - - This checks that Squeezer.count_lines() calls the - count_lines_with_wrapping() function with the appropriate parameters. - """ - for tabwidth, linewidth in [(4, 80), (1, 79), (8, 80), (3, 120)]: - self._test_count_lines_helper(linewidth=linewidth, - tabwidth=tabwidth) - - def _prepare_mock_editwin_for_count_lines(self, editwin, - linewidth, tabwidth): - """Prepare a mock EditorWindow object for Squeezer.count_lines.""" - CHAR_WIDTH = 10 - BORDER_WIDTH = 2 - PADDING_WIDTH = 1 - - # Prepare all the required functionality on the mock EditorWindow object - # so that the calculations in Squeezer.count_lines() can run. 
- editwin.get_tk_tabwidth.return_value = tabwidth - editwin.text.winfo_width.return_value = \ - linewidth * CHAR_WIDTH + 2 * (BORDER_WIDTH + PADDING_WIDTH) - text_opts = { - 'border': BORDER_WIDTH, - 'padx': PADDING_WIDTH, - 'font': None, - } - editwin.text.cget = lambda opt: text_opts[opt] - - # monkey-path tkinter.font.Font with a mock object, so that - # Font.measure('0') returns CHAR_WIDTH - mock_font = Mock() - def measure(char): - if char == '0': - return CHAR_WIDTH - raise ValueError("measure should only be called on '0'!") - mock_font.return_value.measure = measure - patcher = patch('idlelib.squeezer.Font', mock_font) - patcher.start() - self.addCleanup(patcher.stop) - - def _test_count_lines_helper(self, linewidth, tabwidth): - """Helper for test_count_lines.""" + """Test Squeezer.count_lines() with various inputs.""" editwin = self.make_mock_editor_window() - self._prepare_mock_editwin_for_count_lines(editwin, linewidth, tabwidth) squeezer = self.make_squeezer_instance(editwin) - mock_count_lines = Mock(return_value=SENTINEL_VALUE) - text = 'TEXT' - with patch('idlelib.squeezer.count_lines_with_wrapping', - mock_count_lines): - self.assertIs(squeezer.count_lines(text), SENTINEL_VALUE) - mock_count_lines.assert_called_with(text, linewidth, tabwidth) + for text_code, line_width, expected in [ + (r"'\n'", 80, 1), + (r"'\n' * 3", 80, 3), + (r"'a' * 40 + '\n'", 80, 1), + (r"'a' * 80 + '\n'", 80, 1), + (r"'a' * 200 + '\n'", 80, 3), + (r"'aa\t' * 20", 80, 2), + (r"'aa\t' * 21", 80, 3), + (r"'aa\t' * 20", 40, 4), + ]: + with self.subTest(text_code=text_code, + line_width=line_width, + expected=expected): + text = eval(text_code) + squeezer.get_line_width.return_value = line_width + self.assertEqual(squeezer.count_lines(text), expected) def test_init(self): """Test the creation of Squeezer instances.""" @@ -207,8 +182,6 @@ def test_write_not_stdout(self): def test_write_stdout(self): """Test Squeezer's overriding of the EditorWindow's write() method.""" editwin = self.make_mock_editor_window() - self._prepare_mock_editwin_for_count_lines(editwin, - linewidth=80, tabwidth=8) for text in ['', 'TEXT']: editwin.write = orig_write = Mock(return_value=SENTINEL_VALUE) @@ -232,12 +205,8 @@ def test_write_stdout(self): def test_auto_squeeze(self): """Test that the auto-squeezing creates an ExpandingButton properly.""" - root = get_test_tk_root(self) - text_widget = Text(root) - text_widget.mark_set("iomark", "1.0") - - editwin = self.make_mock_editor_window() - editwin.text = text_widget + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) squeezer.auto_squeeze_min_lines = 5 squeezer.count_lines = Mock(return_value=6) @@ -248,58 +217,48 @@ def test_auto_squeeze(self): def test_squeeze_current_text_event(self): """Test the squeeze_current_text event.""" - root = get_test_tk_root(self) - - # squeezing text should work for both stdout and stderr + # Squeezing text should work for both stdout and stderr. for tag_name in ["stdout", "stderr"]: - text_widget = Text(root) - text_widget.mark_set("iomark", "1.0") - - editwin = self.make_mock_editor_window() - editwin.text = editwin.per.bottom = text_widget + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) squeezer.count_lines = Mock(return_value=6) - # prepare some text in the Text widget + # Prepare some text in the Text widget. 
text_widget.insert("1.0", "SOME\nTEXT\n", tag_name) text_widget.mark_set("insert", "1.0") self.assertEqual(text_widget.get('1.0', 'end'), 'SOME\nTEXT\n\n') self.assertEqual(len(squeezer.expandingbuttons), 0) - # test squeezing the current text + # Test squeezing the current text. retval = squeezer.squeeze_current_text_event(event=Mock()) self.assertEqual(retval, "break") self.assertEqual(text_widget.get('1.0', 'end'), '\n\n') self.assertEqual(len(squeezer.expandingbuttons), 1) self.assertEqual(squeezer.expandingbuttons[0].s, 'SOME\nTEXT') - # test that expanding the squeezed text works and afterwards the - # Text widget contains the original text + # Test that expanding the squeezed text works and afterwards + # the Text widget contains the original text. squeezer.expandingbuttons[0].expand(event=Mock()) self.assertEqual(text_widget.get('1.0', 'end'), 'SOME\nTEXT\n\n') self.assertEqual(len(squeezer.expandingbuttons), 0) def test_squeeze_current_text_event_no_allowed_tags(self): """Test that the event doesn't squeeze text without a relevant tag.""" - root = get_test_tk_root(self) - - text_widget = Text(root) - text_widget.mark_set("iomark", "1.0") - - editwin = self.make_mock_editor_window() - editwin.text = editwin.per.bottom = text_widget + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) squeezer.count_lines = Mock(return_value=6) - # prepare some text in the Text widget + # Prepare some text in the Text widget. text_widget.insert("1.0", "SOME\nTEXT\n", "TAG") text_widget.mark_set("insert", "1.0") self.assertEqual(text_widget.get('1.0', 'end'), 'SOME\nTEXT\n\n') self.assertEqual(len(squeezer.expandingbuttons), 0) - # test squeezing the current text + # Test squeezing the current text. retval = squeezer.squeeze_current_text_event(event=Mock()) self.assertEqual(retval, "break") self.assertEqual(text_widget.get('1.0', 'end'), 'SOME\nTEXT\n\n') @@ -307,23 +266,18 @@ def test_squeeze_current_text_event_no_allowed_tags(self): def test_squeeze_text_before_existing_squeezed_text(self): """Test squeezing text before existing squeezed text.""" - root = get_test_tk_root(self) - - text_widget = Text(root) - text_widget.mark_set("iomark", "1.0") - - editwin = self.make_mock_editor_window() - editwin.text = editwin.per.bottom = text_widget + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) squeezer.count_lines = Mock(return_value=6) - # prepare some text in the Text widget and squeeze it + # Prepare some text in the Text widget and squeeze it. text_widget.insert("1.0", "SOME\nTEXT\n", "stdout") text_widget.mark_set("insert", "1.0") squeezer.squeeze_current_text_event(event=Mock()) self.assertEqual(len(squeezer.expandingbuttons), 1) - # test squeezing the current text + # Test squeezing the current text. 
text_widget.insert("1.0", "MORE\nSTUFF\n", "stdout") text_widget.mark_set("insert", "1.0") retval = squeezer.squeeze_current_text_event(event=Mock()) @@ -336,27 +290,30 @@ def test_squeeze_text_before_existing_squeezed_text(self): squeezer.expandingbuttons[1], )) - GetOptionSignature = namedtuple('GetOptionSignature', - 'configType section option default type warn_on_default raw') - @classmethod - def _make_sig(cls, configType, section, option, default=sentinel.NOT_GIVEN, - type=sentinel.NOT_GIVEN, - warn_on_default=sentinel.NOT_GIVEN, - raw=sentinel.NOT_GIVEN): - return cls.GetOptionSignature(configType, section, option, default, - type, warn_on_default, raw) - - @classmethod - def get_GetOption_signature(cls, mock_call_obj): - args, kwargs = mock_call_obj[-2:] - return cls._make_sig(*args, **kwargs) - def test_reload(self): """Test the reload() class-method.""" - self.assertIsInstance(Squeezer.auto_squeeze_min_lines, int) - idleConf.SetOption('main', 'PyShell', 'auto-squeeze-min-lines', '42') + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text + squeezer = self.make_squeezer_instance(editwin) + + orig_zero_char_width = squeezer.zero_char_width + orig_auto_squeeze_min_lines = squeezer.auto_squeeze_min_lines + + # Increase both font size and auto-squeeze-min-lines. + text_widget["font"] = ('Courier', 20) + new_auto_squeeze_min_lines = orig_auto_squeeze_min_lines + 10 + self.set_idleconf_option_with_cleanup( + 'main', 'PyShell', 'auto-squeeze-min-lines', + str(new_auto_squeeze_min_lines)) + + Squeezer.reload() + self.assertGreater(squeezer.zero_char_width, orig_zero_char_width) + self.assertEqual(squeezer.auto_squeeze_min_lines, + new_auto_squeeze_min_lines) + + def test_reload_no_squeezer_instances(self): + """Test that Squeezer.reload() runs without any instances existing.""" Squeezer.reload() - self.assertEqual(Squeezer.auto_squeeze_min_lines, 42) class ExpandingButtonTest(unittest.TestCase): @@ -369,7 +326,7 @@ def make_mock_squeezer(self): squeezer = Mock() squeezer.editwin.text = Text(root) - # Set default values for the configuration settings + # Set default values for the configuration settings. squeezer.auto_squeeze_min_lines = 50 return squeezer @@ -382,23 +339,23 @@ def test_init(self, MockHovertip): expandingbutton = ExpandingButton('TEXT', 'TAGS', 50, squeezer) self.assertEqual(expandingbutton.s, 'TEXT') - # check that the underlying tkinter.Button is properly configured + # Check that the underlying tkinter.Button is properly configured. self.assertEqual(expandingbutton.master, text_widget) self.assertTrue('50 lines' in expandingbutton.cget('text')) - # check that the text widget still contains no text + # Check that the text widget still contains no text. self.assertEqual(text_widget.get('1.0', 'end'), '\n') - # check that the mouse events are bound + # Check that the mouse events are bound. self.assertIn('', expandingbutton.bind()) right_button_code = '' % ('2' if macosx.isAquaTk() else '3') self.assertIn(right_button_code, expandingbutton.bind()) - # check that ToolTip was called once, with appropriate values + # Check that ToolTip was called once, with appropriate values. self.assertEqual(MockHovertip.call_count, 1) MockHovertip.assert_called_with(expandingbutton, ANY, hover_delay=ANY) - # check that 'right-click' appears in the tooltip text + # Check that 'right-click' appears in the tooltip text. 
tooltip_text = MockHovertip.call_args[0][1] self.assertIn('right-click', tooltip_text.lower()) @@ -407,29 +364,30 @@ def test_expand(self): squeezer = self.make_mock_squeezer() expandingbutton = ExpandingButton('TEXT', 'TAGS', 50, squeezer) - # insert the button into the text widget - # (this is normally done by the Squeezer class) + # Insert the button into the text widget + # (this is normally done by the Squeezer class). text_widget = expandingbutton.text text_widget.window_create("1.0", window=expandingbutton) - # set base_text to the text widget, so that changes are actually made - # to it (by ExpandingButton) and we can inspect these changes afterwards + # Set base_text to the text widget, so that changes are actually + # made to it (by ExpandingButton) and we can inspect these + # changes afterwards. expandingbutton.base_text = expandingbutton.text # trigger the expand event retval = expandingbutton.expand(event=Mock()) self.assertEqual(retval, None) - # check that the text was inserted into the text widget + # Check that the text was inserted into the text widget. self.assertEqual(text_widget.get('1.0', 'end'), 'TEXT\n') - # check that the 'TAGS' tag was set on the inserted text + # Check that the 'TAGS' tag was set on the inserted text. text_end_index = text_widget.index('end-1c') self.assertEqual(text_widget.get('1.0', text_end_index), 'TEXT') self.assertEqual(text_widget.tag_nextrange('TAGS', '1.0'), ('1.0', text_end_index)) - # check that the button removed itself from squeezer.expandingbuttons + # Check that the button removed itself from squeezer.expandingbuttons. self.assertEqual(squeezer.expandingbuttons.remove.call_count, 1) squeezer.expandingbuttons.remove.assert_called_with(expandingbutton) @@ -441,55 +399,54 @@ def test_expand_dangerous_oupput(self): expandingbutton.set_is_dangerous() self.assertTrue(expandingbutton.is_dangerous) - # insert the button into the text widget - # (this is normally done by the Squeezer class) + # Insert the button into the text widget + # (this is normally done by the Squeezer class). text_widget = expandingbutton.text text_widget.window_create("1.0", window=expandingbutton) - # set base_text to the text widget, so that changes are actually made - # to it (by ExpandingButton) and we can inspect these changes afterwards + # Set base_text to the text widget, so that changes are actually + # made to it (by ExpandingButton) and we can inspect these + # changes afterwards. expandingbutton.base_text = expandingbutton.text - # patch the message box module to always return False + # Patch the message box module to always return False. with patch('idlelib.squeezer.tkMessageBox') as mock_msgbox: mock_msgbox.askokcancel.return_value = False mock_msgbox.askyesno.return_value = False - - # trigger the expand event + # Trigger the expand event. retval = expandingbutton.expand(event=Mock()) - # check that the event chain was broken and no text was inserted + # Check that the event chain was broken and no text was inserted. self.assertEqual(retval, 'break') self.assertEqual(expandingbutton.text.get('1.0', 'end-1c'), '') - # patch the message box module to always return True + # Patch the message box module to always return True. with patch('idlelib.squeezer.tkMessageBox') as mock_msgbox: mock_msgbox.askokcancel.return_value = True mock_msgbox.askyesno.return_value = True - - # trigger the expand event + # Trigger the expand event. 
retval = expandingbutton.expand(event=Mock()) - # check that the event chain wasn't broken and the text was inserted + # Check that the event chain wasn't broken and the text was inserted. self.assertEqual(retval, None) self.assertEqual(expandingbutton.text.get('1.0', 'end-1c'), text) def test_copy(self): """Test the copy event.""" - # testing with the actual clipboard proved problematic, so this test - # replaces the clipboard manipulation functions with mocks and checks - # that they are called appropriately + # Testing with the actual clipboard proved problematic, so this + # test replaces the clipboard manipulation functions with mocks + # and checks that they are called appropriately. squeezer = self.make_mock_squeezer() expandingbutton = ExpandingButton('TEXT', 'TAGS', 50, squeezer) expandingbutton.clipboard_clear = Mock() expandingbutton.clipboard_append = Mock() - # trigger the copy event + # Trigger the copy event. retval = expandingbutton.copy(event=Mock()) self.assertEqual(retval, None) - # check that the expanding button called clipboard_clear() and - # clipboard_append('TEXT') once each + # Vheck that the expanding button called clipboard_clear() and + # clipboard_append('TEXT') once each. self.assertEqual(expandingbutton.clipboard_clear.call_count, 1) self.assertEqual(expandingbutton.clipboard_append.call_count, 1) expandingbutton.clipboard_append.assert_called_with('TEXT') @@ -502,13 +459,13 @@ def test_view(self): with patch('idlelib.squeezer.view_text', autospec=view_text)\ as mock_view_text: - # trigger the view event + # Trigger the view event. expandingbutton.view(event=Mock()) - # check that the expanding button called view_text + # Check that the expanding button called view_text. self.assertEqual(mock_view_text.call_count, 1) - # check that the proper text was passed + # Check that the proper text was passed. self.assertEqual(mock_view_text.call_args[0][2], 'TEXT') def test_rmenu(self): diff --git a/Lib/idlelib/pyshell.py b/Lib/idlelib/pyshell.py index b6172fd6b9bb..ea49aff08b43 100755 --- a/Lib/idlelib/pyshell.py +++ b/Lib/idlelib/pyshell.py @@ -899,6 +899,9 @@ def __init__(self, flist=None): if use_subprocess: text.bind("<>", self.view_restart_mark) text.bind("<>", self.restart_shell) + squeezer = self.Squeezer(self) + text.bind("<>", + squeezer.squeeze_current_text_event) self.save_stdout = sys.stdout self.save_stderr = sys.stderr diff --git a/Lib/idlelib/squeezer.py b/Lib/idlelib/squeezer.py index 8960356799a4..869498d753a2 100644 --- a/Lib/idlelib/squeezer.py +++ b/Lib/idlelib/squeezer.py @@ -15,6 +15,7 @@ messages and their tracebacks. """ import re +import weakref import tkinter as tk from tkinter.font import Font @@ -26,7 +27,7 @@ from idlelib import macosx -def count_lines_with_wrapping(s, linewidth=80, tabwidth=8): +def count_lines_with_wrapping(s, linewidth=80): """Count the number of lines in a given string. Lines are counted as if the string was wrapped so that lines are never over @@ -34,25 +35,27 @@ def count_lines_with_wrapping(s, linewidth=80, tabwidth=8): Tabs are considered tabwidth characters long. """ + tabwidth = 8 # Currently always true in Shell. pos = 0 linecount = 1 current_column = 0 for m in re.finditer(r"[\t\n]", s): - # process the normal chars up to tab or newline + # Process the normal chars up to tab or newline. numchars = m.start() - pos pos += numchars current_column += numchars - # deal with tab or newline + # Deal with tab or newline. 
if s[pos] == '\n': - # Avoid the `current_column == 0` edge-case, and while we're at it, - # don't bother adding 0. + # Avoid the `current_column == 0` edge-case, and while we're + # at it, don't bother adding 0. if current_column > linewidth: - # If the current column was exactly linewidth, divmod would give - # (1,0), even though a new line hadn't yet been started. The same - # is true if length is any exact multiple of linewidth. Therefore, - # subtract 1 before dividing a non-empty line. + # If the current column was exactly linewidth, divmod + # would give (1,0), even though a new line hadn't yet + # been started. The same is true if length is any exact + # multiple of linewidth. Therefore, subtract 1 before + # dividing a non-empty line. linecount += (current_column - 1) // linewidth linecount += 1 current_column = 0 @@ -60,21 +63,21 @@ def count_lines_with_wrapping(s, linewidth=80, tabwidth=8): assert s[pos] == '\t' current_column += tabwidth - (current_column % tabwidth) - # if a tab passes the end of the line, consider the entire tab as - # being on the next line + # If a tab passes the end of the line, consider the entire + # tab as being on the next line. if current_column > linewidth: linecount += 1 current_column = tabwidth - pos += 1 # after the tab or newline + pos += 1 # After the tab or newline. - # process remaining chars (no more tabs or newlines) + # Process remaining chars (no more tabs or newlines). current_column += len(s) - pos - # avoid divmod(-1, linewidth) + # Avoid divmod(-1, linewidth). if current_column > 0: linecount += (current_column - 1) // linewidth else: - # the text ended with a newline; don't count an extra line after it + # Text ended with newline; don't count an extra line after it. linecount -= 1 return linecount @@ -98,9 +101,7 @@ def __init__(self, s, tags, numoflines, squeezer): self.squeezer = squeezer self.editwin = editwin = squeezer.editwin self.text = text = editwin.text - - # the base Text widget of the PyShell object, used to change text - # before the iomark + # The base Text widget is needed to change text before iomark. self.base_text = editwin.per.bottom line_plurality = "lines" if numoflines != 1 else "line" @@ -119,7 +120,7 @@ def __init__(self, s, tags, numoflines, squeezer): self.bind("", self.context_menu_event) else: self.bind("", self.context_menu_event) - self.selection_handle( + self.selection_handle( # X windows only. lambda offset, length: s[int(offset):int(offset) + int(length)]) self.is_dangerous = None @@ -182,7 +183,7 @@ def view(self, event=None): modal=False, wrap='none') rmenu_specs = ( - # item structure: (label, method_name) + # Item structure: (label, method_name). ('copy', 'copy'), ('view', 'view'), ) @@ -202,6 +203,8 @@ class Squeezer: This avoids IDLE's shell slowing down considerably, and even becoming completely unresponsive, when very long outputs are written. """ + _instance_weakref = None + @classmethod def reload(cls): """Load class variables from config.""" @@ -210,6 +213,14 @@ def reload(cls): type="int", default=50, ) + # Loading the font info requires a Tk root. IDLE doesn't rely + # on Tkinter's "default root", so the instance will reload + # font info using its editor windows's Tk root. + if cls._instance_weakref is not None: + instance = cls._instance_weakref() + if instance is not None: + instance.load_font() + def __init__(self, editwin): """Initialize settings for Squeezer. 
@@ -223,44 +234,58 @@ def __init__(self, editwin): self.editwin = editwin self.text = text = editwin.text - # Get the base Text widget of the PyShell object, used to change text - # before the iomark. PyShell deliberately disables changing text before - # the iomark via its 'text' attribute, which is actually a wrapper for - # the actual Text widget. Squeezer, however, needs to make such changes. + # Get the base Text widget of the PyShell object, used to change + # text before the iomark. PyShell deliberately disables changing + # text before the iomark via its 'text' attribute, which is + # actually a wrapper for the actual Text widget. Squeezer, + # however, needs to make such changes. self.base_text = editwin.per.bottom + Squeezer._instance_weakref = weakref.ref(self) + self.load_font() + + # Twice the text widget's border width and internal padding; + # pre-calculated here for the get_line_width() method. + self.window_width_delta = 2 * ( + int(text.cget('border')) + + int(text.cget('padx')) + ) + self.expandingbuttons = [] - from idlelib.pyshell import PyShell # done here to avoid import cycle - if isinstance(editwin, PyShell): - # If we get a PyShell instance, replace its write method with a - # wrapper, which inserts an ExpandingButton instead of a long text. - def mywrite(s, tags=(), write=editwin.write): - # only auto-squeeze text which has just the "stdout" tag - if tags != "stdout": - return write(s, tags) - - # only auto-squeeze text with at least the minimum - # configured number of lines - numoflines = self.count_lines(s) - if numoflines < self.auto_squeeze_min_lines: - return write(s, tags) - - # create an ExpandingButton instance - expandingbutton = ExpandingButton(s, tags, numoflines, - self) - - # insert the ExpandingButton into the Text widget - text.mark_gravity("iomark", tk.RIGHT) - text.window_create("iomark", window=expandingbutton, - padx=3, pady=5) - text.see("iomark") - text.update() - text.mark_gravity("iomark", tk.LEFT) - - # add the ExpandingButton to the Squeezer's list - self.expandingbuttons.append(expandingbutton) - - editwin.write = mywrite + + # Replace the PyShell instance's write method with a wrapper, + # which inserts an ExpandingButton instead of a long text. + def mywrite(s, tags=(), write=editwin.write): + # Only auto-squeeze text which has just the "stdout" tag. + if tags != "stdout": + return write(s, tags) + + # Only auto-squeeze text with at least the minimum + # configured number of lines. + auto_squeeze_min_lines = self.auto_squeeze_min_lines + # First, a very quick check to skip very short texts. + if len(s) < auto_squeeze_min_lines: + return write(s, tags) + # Now the full line-count check. + numoflines = self.count_lines(s) + if numoflines < auto_squeeze_min_lines: + return write(s, tags) + + # Create an ExpandingButton instance. + expandingbutton = ExpandingButton(s, tags, numoflines, self) + + # Insert the ExpandingButton into the Text widget. + text.mark_gravity("iomark", tk.RIGHT) + text.window_create("iomark", window=expandingbutton, + padx=3, pady=5) + text.see("iomark") + text.update() + text.mark_gravity("iomark", tk.LEFT) + + # Add the ExpandingButton to the Squeezer's list. + self.expandingbuttons.append(expandingbutton) + + editwin.write = mywrite def count_lines(self, s): """Count the number of lines in a given text. @@ -273,25 +298,24 @@ def count_lines(self, s): Tabs are considered tabwidth characters long. 
""" - # Tab width is configurable - tabwidth = self.editwin.get_tk_tabwidth() - - # Get the Text widget's size - linewidth = self.editwin.text.winfo_width() - # Deduct the border and padding - linewidth -= 2*sum([int(self.editwin.text.cget(opt)) - for opt in ('border', 'padx')]) - - # Get the Text widget's font - font = Font(self.editwin.text, name=self.editwin.text.cget('font')) - # Divide the size of the Text widget by the font's width. - # According to Tk8.5 docs, the Text widget's width is set - # according to the width of its font's '0' (zero) character, - # so we will use this as an approximation. - # see: http://www.tcl.tk/man/tcl8.5/TkCmd/text.htm#M-width - linewidth //= font.measure('0') - - return count_lines_with_wrapping(s, linewidth, tabwidth) + linewidth = self.get_line_width() + return count_lines_with_wrapping(s, linewidth) + + def get_line_width(self): + # The maximum line length in pixels: The width of the text + # widget, minus twice the border width and internal padding. + linewidth_pixels = \ + self.base_text.winfo_width() - self.window_width_delta + + # Divide the width of the Text widget by the font width, + # which is taken to be the width of '0' (zero). + # http://www.tcl.tk/man/tcl8.6/TkCmd/text.htm#M21 + return linewidth_pixels // self.zero_char_width + + def load_font(self): + text = self.base_text + self.zero_char_width = \ + Font(text, font=text.cget('font')).measure('0') def squeeze_current_text_event(self, event): """squeeze-current-text event handler @@ -301,29 +325,29 @@ def squeeze_current_text_event(self, event): If the insert cursor is not in a squeezable block of text, give the user a small warning and do nothing. """ - # set tag_name to the first valid tag found on the "insert" cursor + # Set tag_name to the first valid tag found on the "insert" cursor. tag_names = self.text.tag_names(tk.INSERT) for tag_name in ("stdout", "stderr"): if tag_name in tag_names: break else: - # the insert cursor doesn't have a "stdout" or "stderr" tag + # The insert cursor doesn't have a "stdout" or "stderr" tag. self.text.bell() return "break" - # find the range to squeeze + # Find the range to squeeze. start, end = self.text.tag_prevrange(tag_name, tk.INSERT + "+1c") s = self.text.get(start, end) - # if the last char is a newline, remove it from the range + # If the last char is a newline, remove it from the range. if len(s) > 0 and s[-1] == '\n': end = self.text.index("%s-1c" % end) s = s[:-1] - # delete the text + # Delete the text. self.base_text.delete(start, end) - # prepare an ExpandingButton + # Prepare an ExpandingButton. numoflines = self.count_lines(s) expandingbutton = ExpandingButton(s, tag_name, numoflines, self) @@ -331,9 +355,9 @@ def squeeze_current_text_event(self, event): self.text.window_create(start, window=expandingbutton, padx=3, pady=5) - # insert the ExpandingButton to the list of ExpandingButtons, while - # keeping the list ordered according to the position of the buttons in - # the Text widget + # Insert the ExpandingButton to the list of ExpandingButtons, + # while keeping the list ordered according to the position of + # the buttons in the Text widget. 
i = len(self.expandingbuttons) while i > 0 and self.text.compare(self.expandingbuttons[i-1], ">", expandingbutton): diff --git a/Misc/NEWS.d/next/IDLE/2018-12-27-17-46-42.bpo-35196.9E-xUh.rst b/Misc/NEWS.d/next/IDLE/2018-12-27-17-46-42.bpo-35196.9E-xUh.rst new file mode 100644 index 000000000000..ee90d76010d9 --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2018-12-27-17-46-42.bpo-35196.9E-xUh.rst @@ -0,0 +1 @@ +Speed up squeezer line counting. From webhook-mailer at python.org Sun Jan 13 11:43:11 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sun, 13 Jan 2019 16:43:11 -0000 Subject: [Python-checkins] bpo-35196: Optimize Squeezer's write() interception (GH-10454) Message-ID: https://github.com/python/cpython/commit/47bd7770229b5238a438703ee1d52da2e983ec9e commit: 47bd7770229b5238a438703ee1d52da2e983ec9e branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-13T08:43:08-08:00 summary: bpo-35196: Optimize Squeezer's write() interception (GH-10454) The new functionality of Squeezer.reload() is also tested, along with some general re-working of the tests in test_squeezer.py. (cherry picked from commit 39a33e99270848d34628cdbb1fdb727f9ede502a) Co-authored-by: Tal Einat files: A Misc/NEWS.d/next/IDLE/2018-12-27-17-46-42.bpo-35196.9E-xUh.rst M Lib/idlelib/NEWS.txt M Lib/idlelib/editor.py M Lib/idlelib/idle_test/test_squeezer.py M Lib/idlelib/pyshell.py M Lib/idlelib/squeezer.py diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index 80308427c516..8cd4011ac857 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -3,6 +3,8 @@ Released on 2019-??-?? ====================================== +bpo-35196: Speed up squeezer line counting. + bpo-35208: Squeezer now counts wrapped lines before newlines. bpo-35555: Gray out Code Context menu entry when it's not applicable. 
diff --git a/Lib/idlelib/editor.py b/Lib/idlelib/editor.py index f4437668a3ed..d13ac3786da4 100644 --- a/Lib/idlelib/editor.py +++ b/Lib/idlelib/editor.py @@ -317,9 +317,6 @@ def __init__(self, flist=None, filename=None, key=None, root=None): text.bind("<>", self.ZoomHeight(self).zoom_height_event) text.bind("<>", self.CodeContext(self).toggle_code_context_event) - squeezer = self.Squeezer(self) - text.bind("<>", - squeezer.squeeze_current_text_event) def _filename_to_unicode(self, filename): """Return filename as BMP unicode so diplayable in Tk.""" diff --git a/Lib/idlelib/idle_test/test_squeezer.py b/Lib/idlelib/idle_test/test_squeezer.py index da2c2dd50c65..7c28a107a90f 100644 --- a/Lib/idlelib/idle_test/test_squeezer.py +++ b/Lib/idlelib/idle_test/test_squeezer.py @@ -1,3 +1,5 @@ +"Test squeezer, coverage 95%" + from collections import namedtuple from textwrap import dedent from tkinter import Text, Tk @@ -33,10 +35,10 @@ def cleanup_root(): class CountLinesTest(unittest.TestCase): """Tests for the count_lines_with_wrapping function.""" - def check(self, expected, text, linewidth, tabwidth): + def check(self, expected, text, linewidth): return self.assertEqual( expected, - count_lines_with_wrapping(text, linewidth, tabwidth), + count_lines_with_wrapping(text, linewidth), ) def test_count_empty(self): @@ -55,37 +57,14 @@ def test_count_several_lines(self): """Test with several lines of text.""" self.assertEqual(count_lines_with_wrapping("1\n2\n3\n"), 3) - def test_tab_width(self): - """Test with various tab widths and line widths.""" - self.check(expected=1, text='\t' * 1, linewidth=8, tabwidth=4) - self.check(expected=1, text='\t' * 2, linewidth=8, tabwidth=4) - self.check(expected=2, text='\t' * 3, linewidth=8, tabwidth=4) - self.check(expected=2, text='\t' * 4, linewidth=8, tabwidth=4) - self.check(expected=3, text='\t' * 5, linewidth=8, tabwidth=4) - - # test longer lines and various tab widths - self.check(expected=4, text='\t' * 10, linewidth=12, tabwidth=4) - self.check(expected=10, text='\t' * 10, linewidth=12, tabwidth=8) - self.check(expected=2, text='\t' * 4, linewidth=10, tabwidth=3) - - # test tabwidth=1 - self.check(expected=2, text='\t' * 9, linewidth=5, tabwidth=1) - self.check(expected=2, text='\t' * 10, linewidth=5, tabwidth=1) - self.check(expected=3, text='\t' * 11, linewidth=5, tabwidth=1) - - # test for off-by-one errors - self.check(expected=2, text='\t' * 6, linewidth=12, tabwidth=4) - self.check(expected=3, text='\t' * 6, linewidth=11, tabwidth=4) - self.check(expected=2, text='\t' * 6, linewidth=13, tabwidth=4) - def test_empty_lines(self): - self.check(expected=1, text='\n', linewidth=80, tabwidth=8) - self.check(expected=2, text='\n\n', linewidth=80, tabwidth=8) - self.check(expected=10, text='\n' * 10, linewidth=80, tabwidth=8) + self.check(expected=1, text='\n', linewidth=80) + self.check(expected=2, text='\n\n', linewidth=80) + self.check(expected=10, text='\n' * 10, linewidth=80) def test_long_line(self): - self.check(expected=3, text='a' * 200, linewidth=80, tabwidth=8) - self.check(expected=3, text='a' * 200 + '\n', linewidth=80, tabwidth=8) + self.check(expected=3, text='a' * 200, linewidth=80) + self.check(expected=3, text='a' * 200 + '\n', linewidth=80) def test_several_lines_different_lengths(self): text = dedent("""\ @@ -94,82 +73,78 @@ def test_several_lines_different_lengths(self): 7 chars 13 characters""") - self.check(expected=5, text=text, linewidth=80, tabwidth=8) - self.check(expected=5, text=text + '\n', linewidth=80, tabwidth=8) - 
self.check(expected=6, text=text, linewidth=40, tabwidth=8) - self.check(expected=7, text=text, linewidth=20, tabwidth=8) - self.check(expected=11, text=text, linewidth=10, tabwidth=8) + self.check(expected=5, text=text, linewidth=80) + self.check(expected=5, text=text + '\n', linewidth=80) + self.check(expected=6, text=text, linewidth=40) + self.check(expected=7, text=text, linewidth=20) + self.check(expected=11, text=text, linewidth=10) class SqueezerTest(unittest.TestCase): """Tests for the Squeezer class.""" - def make_mock_editor_window(self): + def tearDown(self): + # Clean up the Squeezer class's reference to its instance, + # to avoid side-effects from one test case upon another. + if Squeezer._instance_weakref is not None: + Squeezer._instance_weakref = None + + def make_mock_editor_window(self, with_text_widget=False): """Create a mock EditorWindow instance.""" editwin = NonCallableMagicMock() # isinstance(editwin, PyShell) must be true for Squeezer to enable - # auto-squeezing; in practice this will always be true + # auto-squeezing; in practice this will always be true. editwin.__class__ = PyShell + + if with_text_widget: + editwin.root = get_test_tk_root(self) + text_widget = self.make_text_widget(root=editwin.root) + editwin.text = editwin.per.bottom = text_widget + return editwin def make_squeezer_instance(self, editor_window=None): """Create an actual Squeezer instance with a mock EditorWindow.""" if editor_window is None: editor_window = self.make_mock_editor_window() - return Squeezer(editor_window) + squeezer = Squeezer(editor_window) + squeezer.get_line_width = Mock(return_value=80) + return squeezer + + def make_text_widget(self, root=None): + if root is None: + root = get_test_tk_root(self) + text_widget = Text(root) + text_widget["font"] = ('Courier', 10) + text_widget.mark_set("iomark", "1.0") + return text_widget + + def set_idleconf_option_with_cleanup(self, configType, section, option, value): + prev_val = idleConf.GetOption(configType, section, option) + idleConf.SetOption(configType, section, option, value) + self.addCleanup(idleConf.SetOption, + configType, section, option, prev_val) def test_count_lines(self): - """Test Squeezer.count_lines() with various inputs. - - This checks that Squeezer.count_lines() calls the - count_lines_with_wrapping() function with the appropriate parameters. - """ - for tabwidth, linewidth in [(4, 80), (1, 79), (8, 80), (3, 120)]: - self._test_count_lines_helper(linewidth=linewidth, - tabwidth=tabwidth) - - def _prepare_mock_editwin_for_count_lines(self, editwin, - linewidth, tabwidth): - """Prepare a mock EditorWindow object for Squeezer.count_lines.""" - CHAR_WIDTH = 10 - BORDER_WIDTH = 2 - PADDING_WIDTH = 1 - - # Prepare all the required functionality on the mock EditorWindow object - # so that the calculations in Squeezer.count_lines() can run. 
- editwin.get_tk_tabwidth.return_value = tabwidth - editwin.text.winfo_width.return_value = \ - linewidth * CHAR_WIDTH + 2 * (BORDER_WIDTH + PADDING_WIDTH) - text_opts = { - 'border': BORDER_WIDTH, - 'padx': PADDING_WIDTH, - 'font': None, - } - editwin.text.cget = lambda opt: text_opts[opt] - - # monkey-path tkinter.font.Font with a mock object, so that - # Font.measure('0') returns CHAR_WIDTH - mock_font = Mock() - def measure(char): - if char == '0': - return CHAR_WIDTH - raise ValueError("measure should only be called on '0'!") - mock_font.return_value.measure = measure - patcher = patch('idlelib.squeezer.Font', mock_font) - patcher.start() - self.addCleanup(patcher.stop) - - def _test_count_lines_helper(self, linewidth, tabwidth): - """Helper for test_count_lines.""" + """Test Squeezer.count_lines() with various inputs.""" editwin = self.make_mock_editor_window() - self._prepare_mock_editwin_for_count_lines(editwin, linewidth, tabwidth) squeezer = self.make_squeezer_instance(editwin) - mock_count_lines = Mock(return_value=SENTINEL_VALUE) - text = 'TEXT' - with patch('idlelib.squeezer.count_lines_with_wrapping', - mock_count_lines): - self.assertIs(squeezer.count_lines(text), SENTINEL_VALUE) - mock_count_lines.assert_called_with(text, linewidth, tabwidth) + for text_code, line_width, expected in [ + (r"'\n'", 80, 1), + (r"'\n' * 3", 80, 3), + (r"'a' * 40 + '\n'", 80, 1), + (r"'a' * 80 + '\n'", 80, 1), + (r"'a' * 200 + '\n'", 80, 3), + (r"'aa\t' * 20", 80, 2), + (r"'aa\t' * 21", 80, 3), + (r"'aa\t' * 20", 40, 4), + ]: + with self.subTest(text_code=text_code, + line_width=line_width, + expected=expected): + text = eval(text_code) + squeezer.get_line_width.return_value = line_width + self.assertEqual(squeezer.count_lines(text), expected) def test_init(self): """Test the creation of Squeezer instances.""" @@ -207,8 +182,6 @@ def test_write_not_stdout(self): def test_write_stdout(self): """Test Squeezer's overriding of the EditorWindow's write() method.""" editwin = self.make_mock_editor_window() - self._prepare_mock_editwin_for_count_lines(editwin, - linewidth=80, tabwidth=8) for text in ['', 'TEXT']: editwin.write = orig_write = Mock(return_value=SENTINEL_VALUE) @@ -232,12 +205,8 @@ def test_write_stdout(self): def test_auto_squeeze(self): """Test that the auto-squeezing creates an ExpandingButton properly.""" - root = get_test_tk_root(self) - text_widget = Text(root) - text_widget.mark_set("iomark", "1.0") - - editwin = self.make_mock_editor_window() - editwin.text = text_widget + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) squeezer.auto_squeeze_min_lines = 5 squeezer.count_lines = Mock(return_value=6) @@ -248,58 +217,48 @@ def test_auto_squeeze(self): def test_squeeze_current_text_event(self): """Test the squeeze_current_text event.""" - root = get_test_tk_root(self) - - # squeezing text should work for both stdout and stderr + # Squeezing text should work for both stdout and stderr. for tag_name in ["stdout", "stderr"]: - text_widget = Text(root) - text_widget.mark_set("iomark", "1.0") - - editwin = self.make_mock_editor_window() - editwin.text = editwin.per.bottom = text_widget + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) squeezer.count_lines = Mock(return_value=6) - # prepare some text in the Text widget + # Prepare some text in the Text widget. 
text_widget.insert("1.0", "SOME\nTEXT\n", tag_name) text_widget.mark_set("insert", "1.0") self.assertEqual(text_widget.get('1.0', 'end'), 'SOME\nTEXT\n\n') self.assertEqual(len(squeezer.expandingbuttons), 0) - # test squeezing the current text + # Test squeezing the current text. retval = squeezer.squeeze_current_text_event(event=Mock()) self.assertEqual(retval, "break") self.assertEqual(text_widget.get('1.0', 'end'), '\n\n') self.assertEqual(len(squeezer.expandingbuttons), 1) self.assertEqual(squeezer.expandingbuttons[0].s, 'SOME\nTEXT') - # test that expanding the squeezed text works and afterwards the - # Text widget contains the original text + # Test that expanding the squeezed text works and afterwards + # the Text widget contains the original text. squeezer.expandingbuttons[0].expand(event=Mock()) self.assertEqual(text_widget.get('1.0', 'end'), 'SOME\nTEXT\n\n') self.assertEqual(len(squeezer.expandingbuttons), 0) def test_squeeze_current_text_event_no_allowed_tags(self): """Test that the event doesn't squeeze text without a relevant tag.""" - root = get_test_tk_root(self) - - text_widget = Text(root) - text_widget.mark_set("iomark", "1.0") - - editwin = self.make_mock_editor_window() - editwin.text = editwin.per.bottom = text_widget + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) squeezer.count_lines = Mock(return_value=6) - # prepare some text in the Text widget + # Prepare some text in the Text widget. text_widget.insert("1.0", "SOME\nTEXT\n", "TAG") text_widget.mark_set("insert", "1.0") self.assertEqual(text_widget.get('1.0', 'end'), 'SOME\nTEXT\n\n') self.assertEqual(len(squeezer.expandingbuttons), 0) - # test squeezing the current text + # Test squeezing the current text. retval = squeezer.squeeze_current_text_event(event=Mock()) self.assertEqual(retval, "break") self.assertEqual(text_widget.get('1.0', 'end'), 'SOME\nTEXT\n\n') @@ -307,23 +266,18 @@ def test_squeeze_current_text_event_no_allowed_tags(self): def test_squeeze_text_before_existing_squeezed_text(self): """Test squeezing text before existing squeezed text.""" - root = get_test_tk_root(self) - - text_widget = Text(root) - text_widget.mark_set("iomark", "1.0") - - editwin = self.make_mock_editor_window() - editwin.text = editwin.per.bottom = text_widget + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) squeezer.count_lines = Mock(return_value=6) - # prepare some text in the Text widget and squeeze it + # Prepare some text in the Text widget and squeeze it. text_widget.insert("1.0", "SOME\nTEXT\n", "stdout") text_widget.mark_set("insert", "1.0") squeezer.squeeze_current_text_event(event=Mock()) self.assertEqual(len(squeezer.expandingbuttons), 1) - # test squeezing the current text + # Test squeezing the current text. 
text_widget.insert("1.0", "MORE\nSTUFF\n", "stdout") text_widget.mark_set("insert", "1.0") retval = squeezer.squeeze_current_text_event(event=Mock()) @@ -336,27 +290,30 @@ def test_squeeze_text_before_existing_squeezed_text(self): squeezer.expandingbuttons[1], )) - GetOptionSignature = namedtuple('GetOptionSignature', - 'configType section option default type warn_on_default raw') - @classmethod - def _make_sig(cls, configType, section, option, default=sentinel.NOT_GIVEN, - type=sentinel.NOT_GIVEN, - warn_on_default=sentinel.NOT_GIVEN, - raw=sentinel.NOT_GIVEN): - return cls.GetOptionSignature(configType, section, option, default, - type, warn_on_default, raw) - - @classmethod - def get_GetOption_signature(cls, mock_call_obj): - args, kwargs = mock_call_obj[-2:] - return cls._make_sig(*args, **kwargs) - def test_reload(self): """Test the reload() class-method.""" - self.assertIsInstance(Squeezer.auto_squeeze_min_lines, int) - idleConf.SetOption('main', 'PyShell', 'auto-squeeze-min-lines', '42') + editwin = self.make_mock_editor_window(with_text_widget=True) + text_widget = editwin.text + squeezer = self.make_squeezer_instance(editwin) + + orig_zero_char_width = squeezer.zero_char_width + orig_auto_squeeze_min_lines = squeezer.auto_squeeze_min_lines + + # Increase both font size and auto-squeeze-min-lines. + text_widget["font"] = ('Courier', 20) + new_auto_squeeze_min_lines = orig_auto_squeeze_min_lines + 10 + self.set_idleconf_option_with_cleanup( + 'main', 'PyShell', 'auto-squeeze-min-lines', + str(new_auto_squeeze_min_lines)) + + Squeezer.reload() + self.assertGreater(squeezer.zero_char_width, orig_zero_char_width) + self.assertEqual(squeezer.auto_squeeze_min_lines, + new_auto_squeeze_min_lines) + + def test_reload_no_squeezer_instances(self): + """Test that Squeezer.reload() runs without any instances existing.""" Squeezer.reload() - self.assertEqual(Squeezer.auto_squeeze_min_lines, 42) class ExpandingButtonTest(unittest.TestCase): @@ -369,7 +326,7 @@ def make_mock_squeezer(self): squeezer = Mock() squeezer.editwin.text = Text(root) - # Set default values for the configuration settings + # Set default values for the configuration settings. squeezer.auto_squeeze_min_lines = 50 return squeezer @@ -382,23 +339,23 @@ def test_init(self, MockHovertip): expandingbutton = ExpandingButton('TEXT', 'TAGS', 50, squeezer) self.assertEqual(expandingbutton.s, 'TEXT') - # check that the underlying tkinter.Button is properly configured + # Check that the underlying tkinter.Button is properly configured. self.assertEqual(expandingbutton.master, text_widget) self.assertTrue('50 lines' in expandingbutton.cget('text')) - # check that the text widget still contains no text + # Check that the text widget still contains no text. self.assertEqual(text_widget.get('1.0', 'end'), '\n') - # check that the mouse events are bound + # Check that the mouse events are bound. self.assertIn('', expandingbutton.bind()) right_button_code = '' % ('2' if macosx.isAquaTk() else '3') self.assertIn(right_button_code, expandingbutton.bind()) - # check that ToolTip was called once, with appropriate values + # Check that ToolTip was called once, with appropriate values. self.assertEqual(MockHovertip.call_count, 1) MockHovertip.assert_called_with(expandingbutton, ANY, hover_delay=ANY) - # check that 'right-click' appears in the tooltip text + # Check that 'right-click' appears in the tooltip text. 
tooltip_text = MockHovertip.call_args[0][1] self.assertIn('right-click', tooltip_text.lower()) @@ -407,29 +364,30 @@ def test_expand(self): squeezer = self.make_mock_squeezer() expandingbutton = ExpandingButton('TEXT', 'TAGS', 50, squeezer) - # insert the button into the text widget - # (this is normally done by the Squeezer class) + # Insert the button into the text widget + # (this is normally done by the Squeezer class). text_widget = expandingbutton.text text_widget.window_create("1.0", window=expandingbutton) - # set base_text to the text widget, so that changes are actually made - # to it (by ExpandingButton) and we can inspect these changes afterwards + # Set base_text to the text widget, so that changes are actually + # made to it (by ExpandingButton) and we can inspect these + # changes afterwards. expandingbutton.base_text = expandingbutton.text # trigger the expand event retval = expandingbutton.expand(event=Mock()) self.assertEqual(retval, None) - # check that the text was inserted into the text widget + # Check that the text was inserted into the text widget. self.assertEqual(text_widget.get('1.0', 'end'), 'TEXT\n') - # check that the 'TAGS' tag was set on the inserted text + # Check that the 'TAGS' tag was set on the inserted text. text_end_index = text_widget.index('end-1c') self.assertEqual(text_widget.get('1.0', text_end_index), 'TEXT') self.assertEqual(text_widget.tag_nextrange('TAGS', '1.0'), ('1.0', text_end_index)) - # check that the button removed itself from squeezer.expandingbuttons + # Check that the button removed itself from squeezer.expandingbuttons. self.assertEqual(squeezer.expandingbuttons.remove.call_count, 1) squeezer.expandingbuttons.remove.assert_called_with(expandingbutton) @@ -441,55 +399,54 @@ def test_expand_dangerous_oupput(self): expandingbutton.set_is_dangerous() self.assertTrue(expandingbutton.is_dangerous) - # insert the button into the text widget - # (this is normally done by the Squeezer class) + # Insert the button into the text widget + # (this is normally done by the Squeezer class). text_widget = expandingbutton.text text_widget.window_create("1.0", window=expandingbutton) - # set base_text to the text widget, so that changes are actually made - # to it (by ExpandingButton) and we can inspect these changes afterwards + # Set base_text to the text widget, so that changes are actually + # made to it (by ExpandingButton) and we can inspect these + # changes afterwards. expandingbutton.base_text = expandingbutton.text - # patch the message box module to always return False + # Patch the message box module to always return False. with patch('idlelib.squeezer.tkMessageBox') as mock_msgbox: mock_msgbox.askokcancel.return_value = False mock_msgbox.askyesno.return_value = False - - # trigger the expand event + # Trigger the expand event. retval = expandingbutton.expand(event=Mock()) - # check that the event chain was broken and no text was inserted + # Check that the event chain was broken and no text was inserted. self.assertEqual(retval, 'break') self.assertEqual(expandingbutton.text.get('1.0', 'end-1c'), '') - # patch the message box module to always return True + # Patch the message box module to always return True. with patch('idlelib.squeezer.tkMessageBox') as mock_msgbox: mock_msgbox.askokcancel.return_value = True mock_msgbox.askyesno.return_value = True - - # trigger the expand event + # Trigger the expand event. 
retval = expandingbutton.expand(event=Mock()) - # check that the event chain wasn't broken and the text was inserted + # Check that the event chain wasn't broken and the text was inserted. self.assertEqual(retval, None) self.assertEqual(expandingbutton.text.get('1.0', 'end-1c'), text) def test_copy(self): """Test the copy event.""" - # testing with the actual clipboard proved problematic, so this test - # replaces the clipboard manipulation functions with mocks and checks - # that they are called appropriately + # Testing with the actual clipboard proved problematic, so this + # test replaces the clipboard manipulation functions with mocks + # and checks that they are called appropriately. squeezer = self.make_mock_squeezer() expandingbutton = ExpandingButton('TEXT', 'TAGS', 50, squeezer) expandingbutton.clipboard_clear = Mock() expandingbutton.clipboard_append = Mock() - # trigger the copy event + # Trigger the copy event. retval = expandingbutton.copy(event=Mock()) self.assertEqual(retval, None) - # check that the expanding button called clipboard_clear() and - # clipboard_append('TEXT') once each + # Vheck that the expanding button called clipboard_clear() and + # clipboard_append('TEXT') once each. self.assertEqual(expandingbutton.clipboard_clear.call_count, 1) self.assertEqual(expandingbutton.clipboard_append.call_count, 1) expandingbutton.clipboard_append.assert_called_with('TEXT') @@ -502,13 +459,13 @@ def test_view(self): with patch('idlelib.squeezer.view_text', autospec=view_text)\ as mock_view_text: - # trigger the view event + # Trigger the view event. expandingbutton.view(event=Mock()) - # check that the expanding button called view_text + # Check that the expanding button called view_text. self.assertEqual(mock_view_text.call_count, 1) - # check that the proper text was passed + # Check that the proper text was passed. self.assertEqual(mock_view_text.call_args[0][2], 'TEXT') def test_rmenu(self): diff --git a/Lib/idlelib/pyshell.py b/Lib/idlelib/pyshell.py index b6172fd6b9bb..ea49aff08b43 100755 --- a/Lib/idlelib/pyshell.py +++ b/Lib/idlelib/pyshell.py @@ -899,6 +899,9 @@ def __init__(self, flist=None): if use_subprocess: text.bind("<>", self.view_restart_mark) text.bind("<>", self.restart_shell) + squeezer = self.Squeezer(self) + text.bind("<>", + squeezer.squeeze_current_text_event) self.save_stdout = sys.stdout self.save_stderr = sys.stderr diff --git a/Lib/idlelib/squeezer.py b/Lib/idlelib/squeezer.py index 8960356799a4..869498d753a2 100644 --- a/Lib/idlelib/squeezer.py +++ b/Lib/idlelib/squeezer.py @@ -15,6 +15,7 @@ messages and their tracebacks. """ import re +import weakref import tkinter as tk from tkinter.font import Font @@ -26,7 +27,7 @@ from idlelib import macosx -def count_lines_with_wrapping(s, linewidth=80, tabwidth=8): +def count_lines_with_wrapping(s, linewidth=80): """Count the number of lines in a given string. Lines are counted as if the string was wrapped so that lines are never over @@ -34,25 +35,27 @@ def count_lines_with_wrapping(s, linewidth=80, tabwidth=8): Tabs are considered tabwidth characters long. """ + tabwidth = 8 # Currently always true in Shell. pos = 0 linecount = 1 current_column = 0 for m in re.finditer(r"[\t\n]", s): - # process the normal chars up to tab or newline + # Process the normal chars up to tab or newline. numchars = m.start() - pos pos += numchars current_column += numchars - # deal with tab or newline + # Deal with tab or newline. 
if s[pos] == '\n': - # Avoid the `current_column == 0` edge-case, and while we're at it, - # don't bother adding 0. + # Avoid the `current_column == 0` edge-case, and while we're + # at it, don't bother adding 0. if current_column > linewidth: - # If the current column was exactly linewidth, divmod would give - # (1,0), even though a new line hadn't yet been started. The same - # is true if length is any exact multiple of linewidth. Therefore, - # subtract 1 before dividing a non-empty line. + # If the current column was exactly linewidth, divmod + # would give (1,0), even though a new line hadn't yet + # been started. The same is true if length is any exact + # multiple of linewidth. Therefore, subtract 1 before + # dividing a non-empty line. linecount += (current_column - 1) // linewidth linecount += 1 current_column = 0 @@ -60,21 +63,21 @@ def count_lines_with_wrapping(s, linewidth=80, tabwidth=8): assert s[pos] == '\t' current_column += tabwidth - (current_column % tabwidth) - # if a tab passes the end of the line, consider the entire tab as - # being on the next line + # If a tab passes the end of the line, consider the entire + # tab as being on the next line. if current_column > linewidth: linecount += 1 current_column = tabwidth - pos += 1 # after the tab or newline + pos += 1 # After the tab or newline. - # process remaining chars (no more tabs or newlines) + # Process remaining chars (no more tabs or newlines). current_column += len(s) - pos - # avoid divmod(-1, linewidth) + # Avoid divmod(-1, linewidth). if current_column > 0: linecount += (current_column - 1) // linewidth else: - # the text ended with a newline; don't count an extra line after it + # Text ended with newline; don't count an extra line after it. linecount -= 1 return linecount @@ -98,9 +101,7 @@ def __init__(self, s, tags, numoflines, squeezer): self.squeezer = squeezer self.editwin = editwin = squeezer.editwin self.text = text = editwin.text - - # the base Text widget of the PyShell object, used to change text - # before the iomark + # The base Text widget is needed to change text before iomark. self.base_text = editwin.per.bottom line_plurality = "lines" if numoflines != 1 else "line" @@ -119,7 +120,7 @@ def __init__(self, s, tags, numoflines, squeezer): self.bind("", self.context_menu_event) else: self.bind("", self.context_menu_event) - self.selection_handle( + self.selection_handle( # X windows only. lambda offset, length: s[int(offset):int(offset) + int(length)]) self.is_dangerous = None @@ -182,7 +183,7 @@ def view(self, event=None): modal=False, wrap='none') rmenu_specs = ( - # item structure: (label, method_name) + # Item structure: (label, method_name). ('copy', 'copy'), ('view', 'view'), ) @@ -202,6 +203,8 @@ class Squeezer: This avoids IDLE's shell slowing down considerably, and even becoming completely unresponsive, when very long outputs are written. """ + _instance_weakref = None + @classmethod def reload(cls): """Load class variables from config.""" @@ -210,6 +213,14 @@ def reload(cls): type="int", default=50, ) + # Loading the font info requires a Tk root. IDLE doesn't rely + # on Tkinter's "default root", so the instance will reload + # font info using its editor windows's Tk root. + if cls._instance_weakref is not None: + instance = cls._instance_weakref() + if instance is not None: + instance.load_font() + def __init__(self, editwin): """Initialize settings for Squeezer. 
@@ -223,44 +234,58 @@ def __init__(self, editwin): self.editwin = editwin self.text = text = editwin.text - # Get the base Text widget of the PyShell object, used to change text - # before the iomark. PyShell deliberately disables changing text before - # the iomark via its 'text' attribute, which is actually a wrapper for - # the actual Text widget. Squeezer, however, needs to make such changes. + # Get the base Text widget of the PyShell object, used to change + # text before the iomark. PyShell deliberately disables changing + # text before the iomark via its 'text' attribute, which is + # actually a wrapper for the actual Text widget. Squeezer, + # however, needs to make such changes. self.base_text = editwin.per.bottom + Squeezer._instance_weakref = weakref.ref(self) + self.load_font() + + # Twice the text widget's border width and internal padding; + # pre-calculated here for the get_line_width() method. + self.window_width_delta = 2 * ( + int(text.cget('border')) + + int(text.cget('padx')) + ) + self.expandingbuttons = [] - from idlelib.pyshell import PyShell # done here to avoid import cycle - if isinstance(editwin, PyShell): - # If we get a PyShell instance, replace its write method with a - # wrapper, which inserts an ExpandingButton instead of a long text. - def mywrite(s, tags=(), write=editwin.write): - # only auto-squeeze text which has just the "stdout" tag - if tags != "stdout": - return write(s, tags) - - # only auto-squeeze text with at least the minimum - # configured number of lines - numoflines = self.count_lines(s) - if numoflines < self.auto_squeeze_min_lines: - return write(s, tags) - - # create an ExpandingButton instance - expandingbutton = ExpandingButton(s, tags, numoflines, - self) - - # insert the ExpandingButton into the Text widget - text.mark_gravity("iomark", tk.RIGHT) - text.window_create("iomark", window=expandingbutton, - padx=3, pady=5) - text.see("iomark") - text.update() - text.mark_gravity("iomark", tk.LEFT) - - # add the ExpandingButton to the Squeezer's list - self.expandingbuttons.append(expandingbutton) - - editwin.write = mywrite + + # Replace the PyShell instance's write method with a wrapper, + # which inserts an ExpandingButton instead of a long text. + def mywrite(s, tags=(), write=editwin.write): + # Only auto-squeeze text which has just the "stdout" tag. + if tags != "stdout": + return write(s, tags) + + # Only auto-squeeze text with at least the minimum + # configured number of lines. + auto_squeeze_min_lines = self.auto_squeeze_min_lines + # First, a very quick check to skip very short texts. + if len(s) < auto_squeeze_min_lines: + return write(s, tags) + # Now the full line-count check. + numoflines = self.count_lines(s) + if numoflines < auto_squeeze_min_lines: + return write(s, tags) + + # Create an ExpandingButton instance. + expandingbutton = ExpandingButton(s, tags, numoflines, self) + + # Insert the ExpandingButton into the Text widget. + text.mark_gravity("iomark", tk.RIGHT) + text.window_create("iomark", window=expandingbutton, + padx=3, pady=5) + text.see("iomark") + text.update() + text.mark_gravity("iomark", tk.LEFT) + + # Add the ExpandingButton to the Squeezer's list. + self.expandingbuttons.append(expandingbutton) + + editwin.write = mywrite def count_lines(self, s): """Count the number of lines in a given text. @@ -273,25 +298,24 @@ def count_lines(self, s): Tabs are considered tabwidth characters long. 
""" - # Tab width is configurable - tabwidth = self.editwin.get_tk_tabwidth() - - # Get the Text widget's size - linewidth = self.editwin.text.winfo_width() - # Deduct the border and padding - linewidth -= 2*sum([int(self.editwin.text.cget(opt)) - for opt in ('border', 'padx')]) - - # Get the Text widget's font - font = Font(self.editwin.text, name=self.editwin.text.cget('font')) - # Divide the size of the Text widget by the font's width. - # According to Tk8.5 docs, the Text widget's width is set - # according to the width of its font's '0' (zero) character, - # so we will use this as an approximation. - # see: http://www.tcl.tk/man/tcl8.5/TkCmd/text.htm#M-width - linewidth //= font.measure('0') - - return count_lines_with_wrapping(s, linewidth, tabwidth) + linewidth = self.get_line_width() + return count_lines_with_wrapping(s, linewidth) + + def get_line_width(self): + # The maximum line length in pixels: The width of the text + # widget, minus twice the border width and internal padding. + linewidth_pixels = \ + self.base_text.winfo_width() - self.window_width_delta + + # Divide the width of the Text widget by the font width, + # which is taken to be the width of '0' (zero). + # http://www.tcl.tk/man/tcl8.6/TkCmd/text.htm#M21 + return linewidth_pixels // self.zero_char_width + + def load_font(self): + text = self.base_text + self.zero_char_width = \ + Font(text, font=text.cget('font')).measure('0') def squeeze_current_text_event(self, event): """squeeze-current-text event handler @@ -301,29 +325,29 @@ def squeeze_current_text_event(self, event): If the insert cursor is not in a squeezable block of text, give the user a small warning and do nothing. """ - # set tag_name to the first valid tag found on the "insert" cursor + # Set tag_name to the first valid tag found on the "insert" cursor. tag_names = self.text.tag_names(tk.INSERT) for tag_name in ("stdout", "stderr"): if tag_name in tag_names: break else: - # the insert cursor doesn't have a "stdout" or "stderr" tag + # The insert cursor doesn't have a "stdout" or "stderr" tag. self.text.bell() return "break" - # find the range to squeeze + # Find the range to squeeze. start, end = self.text.tag_prevrange(tag_name, tk.INSERT + "+1c") s = self.text.get(start, end) - # if the last char is a newline, remove it from the range + # If the last char is a newline, remove it from the range. if len(s) > 0 and s[-1] == '\n': end = self.text.index("%s-1c" % end) s = s[:-1] - # delete the text + # Delete the text. self.base_text.delete(start, end) - # prepare an ExpandingButton + # Prepare an ExpandingButton. numoflines = self.count_lines(s) expandingbutton = ExpandingButton(s, tag_name, numoflines, self) @@ -331,9 +355,9 @@ def squeeze_current_text_event(self, event): self.text.window_create(start, window=expandingbutton, padx=3, pady=5) - # insert the ExpandingButton to the list of ExpandingButtons, while - # keeping the list ordered according to the position of the buttons in - # the Text widget + # Insert the ExpandingButton to the list of ExpandingButtons, + # while keeping the list ordered according to the position of + # the buttons in the Text widget. 
i = len(self.expandingbuttons) while i > 0 and self.text.compare(self.expandingbuttons[i-1], ">", expandingbutton): diff --git a/Misc/NEWS.d/next/IDLE/2018-12-27-17-46-42.bpo-35196.9E-xUh.rst b/Misc/NEWS.d/next/IDLE/2018-12-27-17-46-42.bpo-35196.9E-xUh.rst new file mode 100644 index 000000000000..ee90d76010d9 --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2018-12-27-17-46-42.bpo-35196.9E-xUh.rst @@ -0,0 +1 @@ +Speed up squeezer line counting. From webhook-mailer at python.org Sun Jan 13 12:50:32 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Sun, 13 Jan 2019 17:50:32 -0000 Subject: [Python-checkins] bpo-35730: Disable IDLE test_reload assertion. (GH-11543) Message-ID: https://github.com/python/cpython/commit/5bb146aaea1484bcc117ab6cb38dda39ceb5df0f commit: 5bb146aaea1484bcc117ab6cb38dda39ceb5df0f branch: master author: Terry Jan Reedy committer: GitHub date: 2019-01-13T12:50:29-05:00 summary: bpo-35730: Disable IDLE test_reload assertion. (GH-11543) IDLE's test_squeezer.SqueezerTest.test_reload, added for issue 35196, failed on both Gentoo buildbots. files: M Lib/idlelib/idle_test/test_squeezer.py diff --git a/Lib/idlelib/idle_test/test_squeezer.py b/Lib/idlelib/idle_test/test_squeezer.py index 7c28a107a90f..71eccd3693f0 100644 --- a/Lib/idlelib/idle_test/test_squeezer.py +++ b/Lib/idlelib/idle_test/test_squeezer.py @@ -307,7 +307,9 @@ def test_reload(self): str(new_auto_squeeze_min_lines)) Squeezer.reload() - self.assertGreater(squeezer.zero_char_width, orig_zero_char_width) + # The following failed on Gentoo buildbots. Issue title will be + # IDLE: Fix squeezer test_reload. + #self.assertGreater(squeezer.zero_char_width, orig_zero_char_width) self.assertEqual(squeezer.auto_squeeze_min_lines, new_auto_squeeze_min_lines) From webhook-mailer at python.org Sun Jan 13 13:05:53 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sun, 13 Jan 2019 18:05:53 -0000 Subject: [Python-checkins] bpo-35730: Disable IDLE test_reload assertion. (GH-11543) Message-ID: https://github.com/python/cpython/commit/890d3fa10c68af6306cf6b989b2133978e6e7a12 commit: 890d3fa10c68af6306cf6b989b2133978e6e7a12 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-13T10:05:50-08:00 summary: bpo-35730: Disable IDLE test_reload assertion. (GH-11543) IDLE's test_squeezer.SqueezerTest.test_reload, added for issue 35196, failed on both Gentoo buildbots. (cherry picked from commit 5bb146aaea1484bcc117ab6cb38dda39ceb5df0f) Co-authored-by: Terry Jan Reedy files: M Lib/idlelib/idle_test/test_squeezer.py diff --git a/Lib/idlelib/idle_test/test_squeezer.py b/Lib/idlelib/idle_test/test_squeezer.py index 7c28a107a90f..71eccd3693f0 100644 --- a/Lib/idlelib/idle_test/test_squeezer.py +++ b/Lib/idlelib/idle_test/test_squeezer.py @@ -307,7 +307,9 @@ def test_reload(self): str(new_auto_squeeze_min_lines)) Squeezer.reload() - self.assertGreater(squeezer.zero_char_width, orig_zero_char_width) + # The following failed on Gentoo buildbots. Issue title will be + # IDLE: Fix squeezer test_reload. 
+ #self.assertGreater(squeezer.zero_char_width, orig_zero_char_width) self.assertEqual(squeezer.auto_squeeze_min_lines, new_auto_squeeze_min_lines) From solipsis at pitrou.net Mon Jan 14 04:02:37 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 14 Jan 2019 09:02:37 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=-2 Message-ID: <20190114090237.1.8AC5209D43BB26B0@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [0, -7, 1] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [-2, 3, -1] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflog6iJVef', '--timeout', '7200'] From webhook-mailer at python.org Mon Jan 14 05:23:49 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Mon, 14 Jan 2019 10:23:49 -0000 Subject: [Python-checkins] bpo-35066: _dateime.datetime.strftime copies trailing '%' (GH-10692) Message-ID: https://github.com/python/cpython/commit/454b3d4ea246e8751534e105548d141ed7b0b032 commit: 454b3d4ea246e8751534e105548d141ed7b0b032 branch: master author: MichaelSaah committer: Victor Stinner date: 2019-01-14T11:23:39+01:00 summary: bpo-35066: _dateime.datetime.strftime copies trailing '%' (GH-10692) Previously, calling the strftime() method on a datetime object with a trailing '%' in the format string would result in an exception. However, this only occured when the datetime C module was being used; the python implementation did not match this behavior. Datetime is now PEP-399 compliant, and will not throw an exception on a trailing '%'. files: A Misc/NEWS.d/next/Library/2018-11-29-09-38-40.bpo-35066.Nwej2s.rst M Lib/test/datetimetester.py M Modules/_datetimemodule.c diff --git a/Lib/test/datetimetester.py b/Lib/test/datetimetester.py index 2f838c445555..d729c7efd52f 100644 --- a/Lib/test/datetimetester.py +++ b/Lib/test/datetimetester.py @@ -1351,6 +1351,17 @@ def test_strftime(self): #check that this standard extension works t.strftime("%f") + def test_strftime_trailing_percent(self): + # bpo-35066: make sure trailing '%' doesn't cause + # datetime's strftime to complain + t = self.theclass(2005, 3, 2) + try: + _time.strftime('%') + except ValueError: + self.skipTest('time module does not support trailing %') + self.assertEqual(t.strftime('%'), '%') + self.assertEqual(t.strftime("m:%m d:%d y:%y %"), "m:03 d:02 y:05 %") + def test_format(self): dt = self.theclass(2007, 9, 10) self.assertEqual(dt.__format__(''), str(dt)) diff --git a/Misc/NEWS.d/next/Library/2018-11-29-09-38-40.bpo-35066.Nwej2s.rst b/Misc/NEWS.d/next/Library/2018-11-29-09-38-40.bpo-35066.Nwej2s.rst new file mode 100644 index 000000000000..b0c39bd86383 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-11-29-09-38-40.bpo-35066.Nwej2s.rst @@ -0,0 +1,5 @@ +Previously, calling the strftime() method on a datetime object with a +trailing '%' in the format string would result in an exception. However, +this only occured when the datetime C module was being used; the python +implementation did not match this behavior. Datetime is now PEP-399 +compliant, and will not throw an exception on a trailing '%'. 
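The new behaviour matches the pure Python implementation and the test added above. A quick
illustration (not part of the commit; on platforms whose C library strftime() rejects a lone
trailing '%', time.strftime('%') itself may still raise ValueError, which is why the new test
skips in that case):

    from datetime import datetime

    d = datetime(2005, 3, 2)
    print(d.strftime('%'))                 # -> '%'  (previously ValueError with the C module)
    print(d.strftime("m:%m d:%d y:%y %"))  # -> 'm:03 d:02 y:05 %'
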
diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c index 16c7bffda8f6..7997758908bb 100644 --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -1528,10 +1528,13 @@ wrap_strftime(PyObject *object, PyObject *format, PyObject *timetuple, ntoappend = 1; } else if ((ch = *pin++) == '\0') { - /* There's a lone trailing %; doesn't make sense. */ - PyErr_SetString(PyExc_ValueError, "strftime format " - "ends with raw %"); - goto Done; + /* Null byte follows %, copy only '%'. + * + * Back the pin up one char so that we catch the null check + * the next time through the loop.*/ + pin--; + ptoappend = pin - 1; + ntoappend = 1; } /* A % has been seen and ch is the character after it. */ else if (ch == 'z') { @@ -1616,7 +1619,7 @@ wrap_strftime(PyObject *object, PyObject *format, PyObject *timetuple, usednew += ntoappend; assert(usednew <= totalnew); } /* end while() */ - + if (_PyBytes_Resize(&newfmt, usednew) < 0) goto Done; { From webhook-mailer at python.org Mon Jan 14 05:41:38 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Mon, 14 Jan 2019 10:41:38 -0000 Subject: [Python-checkins] bpo-35066: _dateime.datetime.strftime copies trailing '%' (GH-10692) Message-ID: https://github.com/python/cpython/commit/26122de1a80d1618ee80862cf3b8f73f8ec7d9cf commit: 26122de1a80d1618ee80862cf3b8f73f8ec7d9cf branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-14T02:41:33-08:00 summary: bpo-35066: _dateime.datetime.strftime copies trailing '%' (GH-10692) Previously, calling the strftime() method on a datetime object with a trailing '%' in the format string would result in an exception. However, this only occured when the datetime C module was being used; the python implementation did not match this behavior. Datetime is now PEP-399 compliant, and will not throw an exception on a trailing '%'. (cherry picked from commit 454b3d4ea246e8751534e105548d141ed7b0b032) Co-authored-by: MichaelSaah files: A Misc/NEWS.d/next/Library/2018-11-29-09-38-40.bpo-35066.Nwej2s.rst M Lib/test/datetimetester.py M Modules/_datetimemodule.c diff --git a/Lib/test/datetimetester.py b/Lib/test/datetimetester.py index 2f838c445555..d729c7efd52f 100644 --- a/Lib/test/datetimetester.py +++ b/Lib/test/datetimetester.py @@ -1351,6 +1351,17 @@ def test_strftime(self): #check that this standard extension works t.strftime("%f") + def test_strftime_trailing_percent(self): + # bpo-35066: make sure trailing '%' doesn't cause + # datetime's strftime to complain + t = self.theclass(2005, 3, 2) + try: + _time.strftime('%') + except ValueError: + self.skipTest('time module does not support trailing %') + self.assertEqual(t.strftime('%'), '%') + self.assertEqual(t.strftime("m:%m d:%d y:%y %"), "m:03 d:02 y:05 %") + def test_format(self): dt = self.theclass(2007, 9, 10) self.assertEqual(dt.__format__(''), str(dt)) diff --git a/Misc/NEWS.d/next/Library/2018-11-29-09-38-40.bpo-35066.Nwej2s.rst b/Misc/NEWS.d/next/Library/2018-11-29-09-38-40.bpo-35066.Nwej2s.rst new file mode 100644 index 000000000000..b0c39bd86383 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-11-29-09-38-40.bpo-35066.Nwej2s.rst @@ -0,0 +1,5 @@ +Previously, calling the strftime() method on a datetime object with a +trailing '%' in the format string would result in an exception. However, +this only occured when the datetime C module was being used; the python +implementation did not match this behavior. 
Datetime is now PEP-399 +compliant, and will not throw an exception on a trailing '%'. diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c index 5afeeea4881d..9405b4610dfc 100644 --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -1514,10 +1514,13 @@ wrap_strftime(PyObject *object, PyObject *format, PyObject *timetuple, ntoappend = 1; } else if ((ch = *pin++) == '\0') { - /* There's a lone trailing %; doesn't make sense. */ - PyErr_SetString(PyExc_ValueError, "strftime format " - "ends with raw %"); - goto Done; + /* Null byte follows %, copy only '%'. + * + * Back the pin up one char so that we catch the null check + * the next time through the loop.*/ + pin--; + ptoappend = pin - 1; + ntoappend = 1; } /* A % has been seen and ch is the character after it. */ else if (ch == 'z') { @@ -1602,7 +1605,7 @@ wrap_strftime(PyObject *object, PyObject *format, PyObject *timetuple, usednew += ntoappend; assert(usednew <= totalnew); } /* end while() */ - + if (_PyBytes_Resize(&newfmt, usednew) < 0) goto Done; { From webhook-mailer at python.org Mon Jan 14 05:58:43 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Mon, 14 Jan 2019 10:58:43 -0000 Subject: [Python-checkins] bpo-34756: Silence only ImportError and AttributeError in sys.breakpointhook(). (GH-9457) Message-ID: https://github.com/python/cpython/commit/6fe9c446f8302553952f63fc6d96be4dfa48ceba commit: 6fe9c446f8302553952f63fc6d96be4dfa48ceba branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-14T12:58:37+02:00 summary: bpo-34756: Silence only ImportError and AttributeError in sys.breakpointhook(). (GH-9457) files: M Python/sysmodule.c diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 10707fd23fc6..869834b92432 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -170,6 +170,12 @@ sys_breakpointhook(PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyOb return retval; error: + if (!PyErr_ExceptionMatches(PyExc_ImportError) + && !PyErr_ExceptionMatches(PyExc_AttributeError)) + { + PyMem_RawFree(envar); + return NULL; + } /* If any of the imports went wrong, then warn and ignore. */ PyErr_Clear(); int status = PyErr_WarnFormat( From webhook-mailer at python.org Mon Jan 14 06:17:11 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Mon, 14 Jan 2019 11:17:11 -0000 Subject: [Python-checkins] bpo-34756: Silence only ImportError and AttributeError in sys.breakpointhook(). (GH-9457) Message-ID: https://github.com/python/cpython/commit/6d0254bae4d739b487fcaa76705a2d309bce8e75 commit: 6d0254bae4d739b487fcaa76705a2d309bce8e75 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-14T03:17:06-08:00 summary: bpo-34756: Silence only ImportError and AttributeError in sys.breakpointhook(). (GH-9457) (cherry picked from commit 6fe9c446f8302553952f63fc6d96be4dfa48ceba) Co-authored-by: Serhiy Storchaka files: M Python/sysmodule.c diff --git a/Python/sysmodule.c b/Python/sysmodule.c index efe5b29ef33c..75e4f4bf294f 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -171,6 +171,12 @@ sys_breakpointhook(PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyOb return retval; error: + if (!PyErr_ExceptionMatches(PyExc_ImportError) + && !PyErr_ExceptionMatches(PyExc_AttributeError)) + { + PyMem_RawFree(envar); + return NULL; + } /* If any of the imports went wrong, then warn and ignore. 
*/ PyErr_Clear(); int status = PyErr_WarnFormat( From webhook-mailer at python.org Tue Jan 15 03:53:22 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Tue, 15 Jan 2019 08:53:22 -0000 Subject: [Python-checkins] bpo-35619: Improve support of custom data descriptors in help() and pydoc. (GH-11366) Message-ID: https://github.com/python/cpython/commit/efcf82f94572abcdbd70336e0b2c3d0f4df280bc commit: efcf82f94572abcdbd70336e0b2c3d0f4df280bc branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-15T10:53:18+02:00 summary: bpo-35619: Improve support of custom data descriptors in help() and pydoc. (GH-11366) files: A Misc/NEWS.d/next/Library/2018-12-30-19-50-36.bpo-35619.ZRXdhy.rst M Lib/pydoc.py M Lib/test/test_pydoc.py diff --git a/Lib/pydoc.py b/Lib/pydoc.py index 59f6e3935135..daa7205bd74e 100644 --- a/Lib/pydoc.py +++ b/Lib/pydoc.py @@ -137,12 +137,6 @@ def stripid(text): # The behaviour of %p is implementation-dependent in terms of case. return _re_stripid.sub(r'\1', text) -def _is_some_method(obj): - return (inspect.isfunction(obj) or - inspect.ismethod(obj) or - inspect.isbuiltin(obj) or - inspect.ismethoddescriptor(obj)) - def _is_bound_method(fn): """ Returns True if fn is a bound method, regardless of whether @@ -158,7 +152,7 @@ def _is_bound_method(fn): def allmethods(cl): methods = {} - for key, value in inspect.getmembers(cl, _is_some_method): + for key, value in inspect.getmembers(cl, inspect.isroutine): methods[key] = 1 for base in cl.__bases__: methods.update(allmethods(base)) # all your base are belong to us @@ -379,15 +373,13 @@ def document(self, object, name=None, *args): # identifies something in a way that pydoc itself has issues handling; # think 'super' and how it is a descriptor (which raises the exception # by lacking a __name__ attribute) and an instance. - if inspect.isgetsetdescriptor(object): return self.docdata(*args) - if inspect.ismemberdescriptor(object): return self.docdata(*args) try: if inspect.ismodule(object): return self.docmodule(*args) if inspect.isclass(object): return self.docclass(*args) if inspect.isroutine(object): return self.docroutine(*args) except AttributeError: pass - if isinstance(object, property): return self.docproperty(*args) + if inspect.isdatadescriptor(object): return self.docdata(*args) return self.docother(*args) def fail(self, object, name=None, *args): @@ -809,7 +801,7 @@ def spill(msg, attrs, predicate): except Exception: # Some descriptors may meet a failure in their __get__. # (bug #1785) - push(self._docdescriptor(name, value, mod)) + push(self.docdata(value, name, mod)) else: push(self.document(value, name, mod, funcs, classes, mdict, object)) @@ -822,7 +814,7 @@ def spilldescriptors(msg, attrs, predicate): hr.maybe() push(msg) for name, kind, homecls, value in ok: - push(self._docdescriptor(name, value, mod)) + push(self.docdata(value, name, mod)) return attrs def spilldata(msg, attrs, predicate): @@ -994,32 +986,27 @@ def docroutine(self, object, name=None, mod=None, doc = doc and '
<dd><tt>%s</tt></dd>' % doc
         return '<dl><dt>%s</dt>%s</dl>\n' % (decl, doc)
 
-    def _docdescriptor(self, name, value, mod):
+    def docdata(self, object, name=None, mod=None, cl=None):
+        """Produce html documentation for a data descriptor."""
         results = []
         push = results.append
 
         if name:
             push('<dl><dt><strong>%s</strong></dt>\n' % name)
-        if value.__doc__ is not None:
-            doc = self.markup(getdoc(value), self.preformat)
+        if object.__doc__ is not None:
+            doc = self.markup(getdoc(object), self.preformat)
             push('<dd><tt>%s</tt></dd>\n' % doc)
         push('</dl>\n')
 
         return ''.join(results)
 
-    def docproperty(self, object, name=None, mod=None, cl=None):
-        """Produce html documentation for a property."""
-        return self._docdescriptor(name, object, mod)
+    docproperty = docdata
 
     def docother(self, object, name=None, mod=None, *ignored):
         """Produce HTML documentation for a data object."""
         lhs = name and '<strong>%s</strong> = ' % name or ''
         return lhs + self.repr(object)
 
-    def docdata(self, object, name=None, mod=None, cl=None):
-        """Produce html documentation for a data descriptor."""
-        return self._docdescriptor(name, object, mod)
-
     def index(self, dir, shadowed=None):
         """Generate an HTML index for a directory of modules."""
         modpkgs = []
@@ -1292,7 +1279,7 @@ def spill(msg, attrs, predicate):
                     except Exception:
                         # Some descriptors may meet a failure in their __get__.
                         # (bug #1785)
-                        push(self._docdescriptor(name, value, mod))
+                        push(self.docdata(value, name, mod))
                     else:
                         push(self.document(value, name, mod, object))
@@ -1304,7 +1291,7 @@ def spilldescriptors(msg, attrs, predicate):
                 hr.maybe()
                 push(msg)
                 for name, kind, homecls, value in ok:
-                    push(self._docdescriptor(name, value, mod))
+                    push(self.docdata(value, name, mod))
             return attrs
 
         def spilldata(msg, attrs, predicate):
@@ -1420,26 +1407,21 @@ def docroutine(self, object, name=None, mod=None, cl=None):
         doc = getdoc(object) or ''
         return decl + '\n' + (doc and self.indent(doc).rstrip() + '\n')
 
-    def _docdescriptor(self, name, value, mod):
+    def docdata(self, object, name=None, mod=None, cl=None):
+        """Produce text documentation for a data descriptor."""
         results = []
         push = results.append
 
         if name:
             push(self.bold(name))
             push('\n')
-        doc = getdoc(value) or ''
+        doc = getdoc(object) or ''
         if doc:
             push(self.indent(doc))
             push('\n')
         return ''.join(results)
 
-    def docproperty(self, object, name=None, mod=None, cl=None):
-        """Produce text documentation for a property."""
-        return self._docdescriptor(name, object, mod)
-
-    def docdata(self, object, name=None, mod=None, cl=None):
-        """Produce text documentation for a data descriptor."""
-        return self._docdescriptor(name, object, mod)
+    docproperty = docdata
 
     def docother(self, object, name=None, mod=None, parent=None, maxlen=None, doc=None):
         """Produce text documentation for a data object."""
@@ -1673,9 +1655,7 @@ def render_doc(thing, title='Python Library Documentation: %s', forceload=0,
     if not (inspect.ismodule(object) or
               inspect.isclass(object) or
               inspect.isroutine(object) or
-              inspect.isgetsetdescriptor(object) or
-              inspect.ismemberdescriptor(object) or
-              isinstance(object, property)):
+              inspect.isdatadescriptor(object)):
         # If the passed object is a piece of data or an instance,
         # document its available methods instead of its value.
object = type(object) diff --git a/Lib/test/test_pydoc.py b/Lib/test/test_pydoc.py index ffe80fc06fc6..c2bd9f3012c1 100644 --- a/Lib/test/test_pydoc.py +++ b/Lib/test/test_pydoc.py @@ -743,15 +743,6 @@ def test_splitdoc_with_description(self): self.assertEqual(pydoc.splitdoc(example_string), ('I Am A Doc', '\nHere is my description')) - def test_is_object_or_method(self): - doc = pydoc.Doc() - # Bound Method - self.assertTrue(pydoc._is_some_method(doc.fail)) - # Method Descriptor - self.assertTrue(pydoc._is_some_method(int.__add__)) - # String - self.assertFalse(pydoc._is_some_method("I am not a method")) - def test_is_package_when_not_package(self): with test.support.temp_cwd() as test_dir: self.assertFalse(pydoc.ispackage(test_dir)) @@ -1093,6 +1084,12 @@ def _get_summary_line(o): assert len(lines) >= 2 return lines[2] + @staticmethod + def _get_summary_lines(o): + text = pydoc.plain(pydoc.render_doc(o)) + lines = text.split('\n') + return '\n'.join(lines[2:]) + # these should include "self" def test_unbound_python_method(self): self.assertEqual(self._get_summary_line(textwrap.TextWrapper.wrap), @@ -1108,7 +1105,6 @@ def test_bound_python_method(self): t = textwrap.TextWrapper() self.assertEqual(self._get_summary_line(t.wrap), "wrap(text) method of textwrap.TextWrapper instance") - def test_field_order_for_named_tuples(self): Person = namedtuple('Person', ['nickname', 'firstname', 'agegroup']) s = pydoc.render_doc(Person) @@ -1138,6 +1134,164 @@ def test_module_level_callable(self): self.assertEqual(self._get_summary_line(os.stat), "stat(path, *, dir_fd=None, follow_symlinks=True)") + @requires_docstrings + def test_staticmethod(self): + class X: + @staticmethod + def sm(x, y): + '''A static method''' + ... + self.assertEqual(self._get_summary_lines(X.__dict__['sm']), + "") + self.assertEqual(self._get_summary_lines(X.sm), """\ +sm(x, y) + A static method +""") + self.assertIn(""" + | Static methods defined here: + |\x20\x20 + | sm(x, y) + | A static method +""", pydoc.plain(pydoc.render_doc(X))) + + @requires_docstrings + def test_classmethod(self): + class X: + @classmethod + def cm(cls, x): + '''A class method''' + ... + self.assertEqual(self._get_summary_lines(X.__dict__['cm']), + "") + self.assertEqual(self._get_summary_lines(X.cm), """\ +cm(x) method of builtins.type instance + A class method +""") + self.assertIn(""" + | Class methods defined here: + |\x20\x20 + | cm(x) from builtins.type + | A class method +""", pydoc.plain(pydoc.render_doc(X))) + + @requires_docstrings + def test_getset_descriptor(self): + # Currently these attributes are implemented as getset descriptors + # in CPython. + self.assertEqual(self._get_summary_line(int.numerator), "numerator") + self.assertEqual(self._get_summary_line(float.real), "real") + self.assertEqual(self._get_summary_line(Exception.args), "args") + self.assertEqual(self._get_summary_line(memoryview.obj), "obj") + + @requires_docstrings + def test_member_descriptor(self): + # Currently these attributes are implemented as member descriptors + # in CPython. 
+ self.assertEqual(self._get_summary_line(complex.real), "real") + self.assertEqual(self._get_summary_line(range.start), "start") + self.assertEqual(self._get_summary_line(slice.start), "start") + self.assertEqual(self._get_summary_line(property.fget), "fget") + self.assertEqual(self._get_summary_line(StopIteration.value), "value") + + @requires_docstrings + def test_slot_descriptor(self): + class Point: + __slots__ = 'x', 'y' + self.assertEqual(self._get_summary_line(Point.x), "x") + + @requires_docstrings + def test_dict_attr_descriptor(self): + class NS: + pass + self.assertEqual(self._get_summary_line(NS.__dict__['__dict__']), + "__dict__") + + @requires_docstrings + def test_structseq_member_descriptor(self): + self.assertEqual(self._get_summary_line(type(sys.hash_info).width), + "width") + self.assertEqual(self._get_summary_line(type(sys.flags).debug), + "debug") + self.assertEqual(self._get_summary_line(type(sys.version_info).major), + "major") + self.assertEqual(self._get_summary_line(type(sys.float_info).max), + "max") + + @requires_docstrings + def test_namedtuple_field_descriptor(self): + Box = namedtuple('Box', ('width', 'height')) + self.assertEqual(self._get_summary_lines(Box.width), """\ + Alias for field number 0 +""") + + @requires_docstrings + def test_property(self): + class Rect: + @property + def area(self): + '''Area of the rect''' + return self.w * self.h + + self.assertEqual(self._get_summary_lines(Rect.area), """\ + Area of the rect +""") + self.assertIn(""" + | area + | Area of the rect +""", pydoc.plain(pydoc.render_doc(Rect))) + + @requires_docstrings + def test_custom_non_data_descriptor(self): + class Descr: + def __get__(self, obj, cls): + if obj is None: + return self + return 42 + class X: + attr = Descr() + + text = pydoc.plain(pydoc.render_doc(X.attr)) + self.assertEqual(self._get_summary_lines(X.attr), """\ +.Descr object>""") + + X.attr.__doc__ = 'Custom descriptor' + self.assertEqual(self._get_summary_lines(X.attr), """\ +.Descr object>""") + + X.attr.__name__ = 'foo' + self.assertEqual(self._get_summary_lines(X.attr), """\ +foo(...) + Custom descriptor +""") + + @requires_docstrings + def test_custom_data_descriptor(self): + class Descr: + def __get__(self, obj, cls): + if obj is None: + return self + return 42 + def __set__(self, obj, cls): + 1/0 + class X: + attr = Descr() + + text = pydoc.plain(pydoc.render_doc(X.attr)) + self.assertEqual(self._get_summary_lines(X.attr), "") + + X.attr.__doc__ = 'Custom descriptor' + text = pydoc.plain(pydoc.render_doc(X.attr)) + self.assertEqual(self._get_summary_lines(X.attr), """\ + Custom descriptor +""") + + X.attr.__name__ = 'foo' + text = pydoc.plain(pydoc.render_doc(X.attr)) + self.assertEqual(self._get_summary_lines(X.attr), """\ +foo + Custom descriptor +""") + class PydocServerTest(unittest.TestCase): """Tests for pydoc._start_server""" diff --git a/Misc/NEWS.d/next/Library/2018-12-30-19-50-36.bpo-35619.ZRXdhy.rst b/Misc/NEWS.d/next/Library/2018-12-30-19-50-36.bpo-35619.ZRXdhy.rst new file mode 100644 index 000000000000..fe278e63dd86 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-12-30-19-50-36.bpo-35619.ZRXdhy.rst @@ -0,0 +1,2 @@ +Improved support of custom data descriptors in :func:`help` and +:mod:`pydoc`. From webhook-mailer at python.org Tue Jan 15 03:55:46 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Tue, 15 Jan 2019 08:55:46 -0000 Subject: [Python-checkins] bpo-29707: Document that os.path.ismount() is not able to reliable detect bind mounts. 
(GH-11238) Message-ID: https://github.com/python/cpython/commit/32ebd8508d4807a7c85d2ed8e9c3b44ecd6de591 commit: 32ebd8508d4807a7c85d2ed8e9c3b44ecd6de591 branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-15T10:55:40+02:00 summary: bpo-29707: Document that os.path.ismount() is not able to reliable detect bind mounts. (GH-11238) files: M Doc/library/os.path.rst diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst index 23194aee66d5..ebbf63cc3548 100644 --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -283,10 +283,11 @@ the :mod:`glob` module.) Return ``True`` if pathname *path* is a :dfn:`mount point`: a point in a file system where a different file system has been mounted. On POSIX, the - function checks whether *path*'s parent, :file:`path/..`, is on a different - device than *path*, or whether :file:`path/..` and *path* point to the same + function checks whether *path*'s parent, :file:`{path}/..`, is on a different + device than *path*, or whether :file:`{path}/..` and *path* point to the same i-node on the same device --- this should detect mount points for all Unix - and POSIX variants. On Windows, a drive letter root and a share UNC are + and POSIX variants. It is not able to reliably detect bind mounts on the + same filesystem. On Windows, a drive letter root and a share UNC are always mount points, and for any other path ``GetVolumePathName`` is called to see if it is different from the input path. From webhook-mailer at python.org Tue Jan 15 04:01:19 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 15 Jan 2019 09:01:19 -0000 Subject: [Python-checkins] bpo-29707: Document that os.path.ismount() is not able to reliable detect bind mounts. (GH-11238) Message-ID: https://github.com/python/cpython/commit/a4aade2cf82dfa889c2bdad9fa0aa874f43c0bf8 commit: a4aade2cf82dfa889c2bdad9fa0aa874f43c0bf8 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-15T01:01:15-08:00 summary: bpo-29707: Document that os.path.ismount() is not able to reliable detect bind mounts. (GH-11238) (cherry picked from commit 32ebd8508d4807a7c85d2ed8e9c3b44ecd6de591) Co-authored-by: Serhiy Storchaka files: M Doc/library/os.path.rst diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst index f6ff01097fe3..d78ab068a31e 100644 --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -277,10 +277,11 @@ the :mod:`glob` module.) Return ``True`` if pathname *path* is a :dfn:`mount point`: a point in a file system where a different file system has been mounted. On POSIX, the - function checks whether *path*'s parent, :file:`path/..`, is on a different - device than *path*, or whether :file:`path/..` and *path* point to the same + function checks whether *path*'s parent, :file:`{path}/..`, is on a different + device than *path*, or whether :file:`{path}/..` and *path* point to the same i-node on the same device --- this should detect mount points for all Unix - and POSIX variants. On Windows, a drive letter root and a share UNC are + and POSIX variants. It is not able to reliably detect bind mounts on the + same filesystem. On Windows, a drive letter root and a share UNC are always mount points, and for any other path ``GetVolumePathName`` is called to see if it is different from the input path. 
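As a rough sketch of the documented behaviour (the paths below are placeholders and the results depend entirely on the local mount table):

    import os.path

    # The root directory is always a mount point.
    print(os.path.ismount('/'))

    # True only if /tmp is itself a mount point: its parent is on a
    # different device, or /tmp/.. and /tmp refer to the same i-node.
    print(os.path.ismount('/tmp'))

    # A bind mount on the same filesystem, e.g. created with
    # "mount --bind /srv/data /mnt/data", may still report False,
    # which is the limitation the added sentence documents.
    print(os.path.ismount('/mnt/data'))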
From solipsis at pitrou.net Tue Jan 15 04:09:40 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 15 Jan 2019 09:09:40 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=10 Message-ID: <20190115090940.1.B1BA486285507393@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [0, 0, 7] memory blocks, sum=7 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_forkserver leaked [-2, 2, -1] memory blocks, sum=-1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogC7DhXU', '--timeout', '7200'] From webhook-mailer at python.org Tue Jan 15 05:29:24 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Tue, 15 Jan 2019 10:29:24 -0000 Subject: [Python-checkins] bpo-35738: Update the example for timer.Timer.repeat(). (GH-11559) Message-ID: https://github.com/python/cpython/commit/06f8b57212b2e2cd2e63af36cecdfa3075b324a2 commit: 06f8b57212b2e2cd2e63af36cecdfa3075b324a2 branch: master author: Henry Chen committer: Serhiy Storchaka date: 2019-01-15T12:29:21+02:00 summary: bpo-35738: Update the example for timer.Timer.repeat(). (GH-11559) Show correct number of repeats. files: M Doc/library/timeit.rst diff --git a/Doc/library/timeit.rst b/Doc/library/timeit.rst index 93ca940ef5bd..197b8a76fc38 100644 --- a/Doc/library/timeit.rst +++ b/Doc/library/timeit.rst @@ -292,7 +292,7 @@ The same can be done using the :class:`Timer` class and its methods:: >>> t.timeit() 0.3955516149999312 >>> t.repeat() - [0.40193588800002544, 0.3960157959998014, 0.39594301399984033] + [0.40183617287970225, 0.37027556854118704, 0.38344867356679524, 0.3712595970846668, 0.37866875250654886] The following examples show how to time expressions that contain multiple lines. From webhook-mailer at python.org Tue Jan 15 05:48:03 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 15 Jan 2019 10:48:03 -0000 Subject: [Python-checkins] bpo-34323: Enhance IocpProactor.close() log (GH-11555) Message-ID: https://github.com/python/cpython/commit/b1e45739d832e1e402a563c6727defda92e193b7 commit: b1e45739d832e1e402a563c6727defda92e193b7 branch: master author: Victor Stinner committer: GitHub date: 2019-01-15T11:48:00+01:00 summary: bpo-34323: Enhance IocpProactor.close() log (GH-11555) IocpProactor.close() now uses time to decide when to log: wait 1 second before the first log, then log every second. Log also the number of seconds since close() was called. files: A Misc/NEWS.d/next/Library/2019-01-14-17-34-36.bpo-34323.CRErrt.rst M Lib/asyncio/windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index bdb9a6e28a86..7f264e6f9a07 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -7,6 +7,7 @@ import msvcrt import socket import struct +import time import weakref from . import events @@ -802,10 +803,21 @@ def close(self): context['source_traceback'] = fut._source_traceback self._loop.call_exception_handler(context) - # wait until all cancelled overlapped future complete + # Wait until all cancelled overlapped complete: don't exit with running + # overlapped to prevent a crash. Display progress every second if the + # loop is still running. 
+ msg_update = 1.0 + start_time = time.monotonic() + next_msg = start_time + msg_update while self._cache: - if not self._poll(1): - logger.debug('taking long time to close proactor') + if next_msg <= time.monotonic(): + logger.debug('IocpProactor.close(): ' + 'loop is running after closing for %.1f seconds', + time.monotonic() - start_time) + next_msg = time.monotonic() + msg_update + + # handle a few events, or timeout + self._poll(msg_update) self._results = [] diff --git a/Misc/NEWS.d/next/Library/2019-01-14-17-34-36.bpo-34323.CRErrt.rst b/Misc/NEWS.d/next/Library/2019-01-14-17-34-36.bpo-34323.CRErrt.rst new file mode 100644 index 000000000000..59269244cc47 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-14-17-34-36.bpo-34323.CRErrt.rst @@ -0,0 +1,3 @@ +:mod:`asyncio`: Enhance ``IocpProactor.close()`` log: wait 1 second before +the first log, then log every second. Log also the number of seconds since +``close()`` was called. From webhook-mailer at python.org Tue Jan 15 05:49:20 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 15 Jan 2019 10:49:20 -0000 Subject: [Python-checkins] bpo-35738: Update the example for timer.Timer.repeat(). (GH-11559) Message-ID: https://github.com/python/cpython/commit/0bb6b891154b5718c2d7604fc4aa7a51a2f9fe70 commit: 0bb6b891154b5718c2d7604fc4aa7a51a2f9fe70 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-15T02:49:16-08:00 summary: bpo-35738: Update the example for timer.Timer.repeat(). (GH-11559) Show correct number of repeats. (cherry picked from commit 06f8b57212b2e2cd2e63af36cecdfa3075b324a2) Co-authored-by: Henry Chen files: M Doc/library/timeit.rst diff --git a/Doc/library/timeit.rst b/Doc/library/timeit.rst index 93ca940ef5bd..197b8a76fc38 100644 --- a/Doc/library/timeit.rst +++ b/Doc/library/timeit.rst @@ -292,7 +292,7 @@ The same can be done using the :class:`Timer` class and its methods:: >>> t.timeit() 0.3955516149999312 >>> t.repeat() - [0.40193588800002544, 0.3960157959998014, 0.39594301399984033] + [0.40183617287970225, 0.37027556854118704, 0.38344867356679524, 0.3712595970846668, 0.37866875250654886] The following examples show how to time expressions that contain multiple lines. From webhook-mailer at python.org Tue Jan 15 06:13:51 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 15 Jan 2019 11:13:51 -0000 Subject: [Python-checkins] bpo-11555: Enhance IocpProactor.close() log again (GH-11563) Message-ID: https://github.com/python/cpython/commit/b91140fdb17472d03a7b7971f143c08a40fde923 commit: b91140fdb17472d03a7b7971f143c08a40fde923 branch: master author: Victor Stinner committer: GitHub date: 2019-01-15T12:13:48+01:00 summary: bpo-11555: Enhance IocpProactor.close() log again (GH-11563) Add repr(self) to the log to display the number of pending overlapped in the log. 
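The timing logic in these close() changes reduces to a small rate-limited progress log. As a rough standalone sketch of the same pattern (drain() and poll() are made-up names for illustration, not asyncio APIs):

    import logging
    import time

    logger = logging.getLogger(__name__)

    def drain(pending, poll, msg_update=1.0):
        # Wait until all pending operations complete, logging progress
        # at most once per msg_update seconds, with the elapsed time.
        start_time = time.monotonic()
        next_msg = start_time + msg_update
        while pending:
            if next_msg <= time.monotonic():
                logger.debug('still draining after %.1f seconds',
                             time.monotonic() - start_time)
                next_msg = time.monotonic() + msg_update
            # handle a few events, or give up after the timeout
            poll(msg_update)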
files: M Lib/asyncio/windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index 7f264e6f9a07..29750f18d80c 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -811,9 +811,8 @@ def close(self): next_msg = start_time + msg_update while self._cache: if next_msg <= time.monotonic(): - logger.debug('IocpProactor.close(): ' - 'loop is running after closing for %.1f seconds', - time.monotonic() - start_time) + logger.debug('%r is running after closing for %.1f seconds', + self, time.monotonic() - start_time) next_msg = time.monotonic() + msg_update # handle a few events, or timeout From webhook-mailer at python.org Tue Jan 15 06:26:50 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Tue, 15 Jan 2019 11:26:50 -0000 Subject: [Python-checkins] bpo-35742: Fix test_envar_unimportable in test_builtin. (GH-11561) Message-ID: https://github.com/python/cpython/commit/3607ef43c4a1a24d44f39ff54a77fc0af5bfa09a commit: 3607ef43c4a1a24d44f39ff54a77fc0af5bfa09a branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-15T13:26:38+02:00 summary: bpo-35742: Fix test_envar_unimportable in test_builtin. (GH-11561) Handle the case of an empty module name in PYTHONBREAKPOINT. Fixes a regression introduced in bpo-34756. files: M Lib/test/test_builtin.py M Python/sysmodule.c diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index e2a4f2fa6d2f..5674ea89b1c4 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -1608,6 +1608,7 @@ def test_envar_good_path_empty_string(self): def test_envar_unimportable(self): for envar in ( '.', '..', '.foo', 'foo.', '.int', 'int.', + '.foo.bar', '..foo.bar', '/./', 'nosuchbuiltin', 'nosuchmodule.nosuchcallable', ): diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 869834b92432..5ea3772efded 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -141,11 +141,14 @@ sys_breakpointhook(PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyOb modulepath = PyUnicode_FromString("builtins"); attrname = envar; } - else { + else if (last_dot != envar) { /* Split on the last dot; */ modulepath = PyUnicode_FromStringAndSize(envar, last_dot - envar); attrname = last_dot + 1; } + else { + goto warn; + } if (modulepath == NULL) { PyMem_RawFree(envar); return NULL; @@ -155,27 +158,29 @@ sys_breakpointhook(PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyOb Py_DECREF(modulepath); if (module == NULL) { - goto error; + if (PyErr_ExceptionMatches(PyExc_ImportError)) { + goto warn; + } + PyMem_RawFree(envar); + return NULL; } PyObject *hook = PyObject_GetAttrString(module, attrname); Py_DECREF(module); if (hook == NULL) { - goto error; + if (PyErr_ExceptionMatches(PyExc_AttributeError)) { + goto warn; + } + PyMem_RawFree(envar); + return NULL; } PyMem_RawFree(envar); PyObject *retval = _PyObject_FastCallKeywords(hook, args, nargs, keywords); Py_DECREF(hook); return retval; - error: - if (!PyErr_ExceptionMatches(PyExc_ImportError) - && !PyErr_ExceptionMatches(PyExc_AttributeError)) - { - PyMem_RawFree(envar); - return NULL; - } + warn: /* If any of the imports went wrong, then warn and ignore. */ PyErr_Clear(); int status = PyErr_WarnFormat( From webhook-mailer at python.org Tue Jan 15 06:46:02 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 15 Jan 2019 11:46:02 -0000 Subject: [Python-checkins] bpo-35742: Fix test_envar_unimportable in test_builtin. 
(GH-11561) Message-ID: https://github.com/python/cpython/commit/97d6a56d9d169f35cf2a24d62bf15adfc42fc672 commit: 97d6a56d9d169f35cf2a24d62bf15adfc42fc672 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-15T03:45:57-08:00 summary: bpo-35742: Fix test_envar_unimportable in test_builtin. (GH-11561) Handle the case of an empty module name in PYTHONBREAKPOINT. Fixes a regression introduced in bpo-34756. (cherry picked from commit 3607ef43c4a1a24d44f39ff54a77fc0af5bfa09a) Co-authored-by: Serhiy Storchaka files: M Lib/test/test_builtin.py M Python/sysmodule.c diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index 7c9768a337bb..c3f04ec7447c 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -1601,6 +1601,7 @@ def test_envar_good_path_empty_string(self): def test_envar_unimportable(self): for envar in ( '.', '..', '.foo', 'foo.', '.int', 'int.', + '.foo.bar', '..foo.bar', '/./', 'nosuchbuiltin', 'nosuchmodule.nosuchcallable', ): diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 75e4f4bf294f..cdc2edf038a1 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -134,11 +134,14 @@ sys_breakpointhook(PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyOb modulepath = PyUnicode_FromString("builtins"); attrname = envar; } - else { + else if (last_dot != envar) { /* Split on the last dot; */ modulepath = PyUnicode_FromStringAndSize(envar, last_dot - envar); attrname = last_dot + 1; } + else { + goto warn; + } if (modulepath == NULL) { PyMem_RawFree(envar); return NULL; @@ -156,27 +159,29 @@ sys_breakpointhook(PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyOb Py_DECREF(fromlist); if (module == NULL) { - goto error; + if (PyErr_ExceptionMatches(PyExc_ImportError)) { + goto warn; + } + PyMem_RawFree(envar); + return NULL; } PyObject *hook = PyObject_GetAttrString(module, attrname); Py_DECREF(module); if (hook == NULL) { - goto error; + if (PyErr_ExceptionMatches(PyExc_AttributeError)) { + goto warn; + } + PyMem_RawFree(envar); + return NULL; } PyMem_RawFree(envar); PyObject *retval = _PyObject_FastCallKeywords(hook, args, nargs, keywords); Py_DECREF(hook); return retval; - error: - if (!PyErr_ExceptionMatches(PyExc_ImportError) - && !PyErr_ExceptionMatches(PyExc_AttributeError)) - { - PyMem_RawFree(envar); - return NULL; - } + warn: /* If any of the imports went wrong, then warn and ignore. */ PyErr_Clear(); int status = PyErr_WarnFormat( From webhook-mailer at python.org Tue Jan 15 07:05:35 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 15 Jan 2019 12:05:35 -0000 Subject: [Python-checkins] [3.7] bpo-34323: Enhance IocpProactor.close() log (GH-11565) Message-ID: https://github.com/python/cpython/commit/d5a6adf6285ec8892b977a32c22143ebd1025b50 commit: d5a6adf6285ec8892b977a32c22143ebd1025b50 branch: 3.7 author: Victor Stinner committer: GitHub date: 2019-01-15T13:05:28+01:00 summary: [3.7] bpo-34323: Enhance IocpProactor.close() log (GH-11565) * IocpProactor: prevent modification if closed (GH-11494) * _wait_for_handle(), _register() and _unregister() methods of IocpProactor now raise an exception if closed * Add "closed" to IocpProactor.__repr__() * Simplify IocpProactor.close() (cherry picked from commit 9b07681c09182d4b9d23cd52566a4992b8afecbb) * bpo-34323: Enhance IocpProactor.close() log (GH-11555) IocpProactor.close() now uses time to decide when to log: wait 1 second before the first log, then log every second. 
Log also the number of seconds since close() was called. (cherry picked from commit b1e45739d832e1e402a563c6727defda92e193b7) * bpo-34323: Enhance IocpProactor.close() log again (GH-11563) Add repr(self) to the log to display the number of pending overlapped in the log. (cherry picked from commit b91140fdb17472d03a7b7971f143c08a40fde923) files: A Misc/NEWS.d/next/Library/2019-01-14-17-34-36.bpo-34323.CRErrt.rst M Lib/asyncio/windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index 2ec542764375..e3778688ab5b 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -7,6 +7,7 @@ import msvcrt import socket import struct +import time import weakref from . import events @@ -392,10 +393,16 @@ def __init__(self, concurrency=0xffffffff): self._unregistered = [] self._stopped_serving = weakref.WeakSet() + def _check_closed(self): + if self._iocp is None: + raise RuntimeError('IocpProactor is closed') + def __repr__(self): - return ('<%s overlapped#=%s result#=%s>' - % (self.__class__.__name__, len(self._cache), - len(self._results))) + info = ['overlapped#=%s' % len(self._cache), + 'result#=%s' % len(self._results)] + if self._iocp is None: + info.append('closed') + return '<%s %s>' % (self.__class__.__name__, " ".join(info)) def set_loop(self, loop): self._loop = loop @@ -602,6 +609,8 @@ def _wait_cancel(self, event, done_callback): return fut def _wait_for_handle(self, handle, timeout, _is_cancel): + self._check_closed() + if timeout is None: ms = _winapi.INFINITE else: @@ -644,6 +653,8 @@ def _register_with_iocp(self, obj): # that succeed immediately. def _register(self, ov, obj, callback): + self._check_closed() + # Return a future which will be set with the result of the # operation when it completes. The future's value is actually # the value returned by callback(). @@ -680,6 +691,7 @@ def _unregister(self, ov): already be signalled (pending in the proactor event queue). It is also safe if the event is never signalled (because it was cancelled). """ + self._check_closed() self._unregistered.append(ov) def _get_accept_socket(self, family): @@ -749,6 +761,10 @@ def _stop_serving(self, obj): self._stopped_serving.add(obj) def close(self): + if self._iocp is None: + # already closed + return + # Cancel remaining registered operations. for address, (fut, ov, obj, callback) in list(self._cache.items()): if fut.cancelled(): @@ -771,14 +787,25 @@ def close(self): context['source_traceback'] = fut._source_traceback self._loop.call_exception_handler(context) + # Wait until all cancelled overlapped complete: don't exit with running + # overlapped to prevent a crash. Display progress every second if the + # loop is still running. 
+ msg_update = 1.0 + start_time = time.monotonic() + next_msg = start_time + msg_update while self._cache: - if not self._poll(1): - logger.debug('taking long time to close proactor') + if next_msg <= time.monotonic(): + logger.debug('%r is running after closing for %.1f seconds', + self, time.monotonic() - start_time) + next_msg = time.monotonic() + msg_update + + # handle a few events, or timeout + self._poll(msg_update) self._results = [] - if self._iocp is not None: - _winapi.CloseHandle(self._iocp) - self._iocp = None + + _winapi.CloseHandle(self._iocp) + self._iocp = None def __del__(self): self.close() diff --git a/Misc/NEWS.d/next/Library/2019-01-14-17-34-36.bpo-34323.CRErrt.rst b/Misc/NEWS.d/next/Library/2019-01-14-17-34-36.bpo-34323.CRErrt.rst new file mode 100644 index 000000000000..59269244cc47 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-14-17-34-36.bpo-34323.CRErrt.rst @@ -0,0 +1,3 @@ +:mod:`asyncio`: Enhance ``IocpProactor.close()`` log: wait 1 second before +the first log, then log every second. Log also the number of seconds since +``close()`` was called. From webhook-mailer at python.org Tue Jan 15 07:34:52 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Tue, 15 Jan 2019 12:34:52 -0000 Subject: [Python-checkins] [2.7] bpo-8765: Deprecate writing unicode to binary streams in Py3k mode. (GH-11127) Message-ID: https://github.com/python/cpython/commit/1462234baf7398a6b00c0f51905e26caa17d3c60 commit: 1462234baf7398a6b00c0f51905e26caa17d3c60 branch: 2.7 author: Serhiy Storchaka committer: GitHub date: 2019-01-15T14:34:48+02:00 summary: [2.7] bpo-8765: Deprecate writing unicode to binary streams in Py3k mode. (GH-11127) files: A Misc/NEWS.d/next/Library/2018-12-12-09-59-41.bpo-8765.IFupT2.rst M Lib/test/test_fileio.py M Modules/_io/bufferedio.c M Modules/_io/fileio.c diff --git a/Lib/test/test_fileio.py b/Lib/test/test_fileio.py index 2825a87024da..57c9f14f47ea 100644 --- a/Lib/test/test_fileio.py +++ b/Lib/test/test_fileio.py @@ -12,7 +12,7 @@ from UserList import UserList from test.test_support import TESTFN, check_warnings, run_unittest, make_bad_fd -from test.test_support import py3k_bytes as bytes, cpython_only +from test.test_support import py3k_bytes as bytes, cpython_only, check_py3k_warnings from test.script_helper import run_python from _io import FileIO as _FileIO @@ -101,6 +101,10 @@ def test_none_args(self): self.assertEqual(self.f.readline(None), b"hi\n") self.assertEqual(self.f.readlines(None), [b"bye\n", b"abc"]) + def testWriteUnicode(self): + with check_py3k_warnings(): + self.f.write(u'') + def testRepr(self): self.assertEqual(repr(self.f), "<_io.FileIO name=%r mode='%s'>" % (self.f.name, self.f.mode)) @@ -210,7 +214,7 @@ def testErrnoOnClose(self, f): @ClosedFDRaises def testErrnoOnClosedWrite(self, f): - f.write('a') + f.write(b'a') @ClosedFDRaises def testErrnoOnClosedSeek(self, f): diff --git a/Misc/NEWS.d/next/Library/2018-12-12-09-59-41.bpo-8765.IFupT2.rst b/Misc/NEWS.d/next/Library/2018-12-12-09-59-41.bpo-8765.IFupT2.rst new file mode 100644 index 000000000000..24a8efc2a94f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-12-12-09-59-41.bpo-8765.IFupT2.rst @@ -0,0 +1,2 @@ +The write() method of buffered and unbuffered binary streams in the io +module emits now a DeprecationWarning in Py3k mode for unicode argument. 
diff --git a/Modules/_io/bufferedio.c b/Modules/_io/bufferedio.c index 5bef7463e7cb..b8c98a4d0d04 100644 --- a/Modules/_io/bufferedio.c +++ b/Modules/_io/bufferedio.c @@ -1812,6 +1812,13 @@ bufferedwriter_write(buffered *self, PyObject *args) if (!PyArg_ParseTuple(args, "s*:write", &buf)) { return NULL; } + if (PyUnicode_Check(PyTuple_GET_ITEM(args, 0)) && + PyErr_WarnPy3k("write() argument must be string or buffer, " + "not 'unicode'", 1) < 0) + { + PyBuffer_Release(&buf); + return NULL; + } if (IS_CLOSED(self)) { PyErr_SetString(PyExc_ValueError, "write to closed file"); diff --git a/Modules/_io/fileio.c b/Modules/_io/fileio.c index 2b40ada195a1..ed5b918e76a7 100644 --- a/Modules/_io/fileio.c +++ b/Modules/_io/fileio.c @@ -716,8 +716,16 @@ fileio_write(fileio *self, PyObject *args) if (!self->writable) return err_mode("writing"); - if (!PyArg_ParseTuple(args, "s*", &pbuf)) + if (!PyArg_ParseTuple(args, "s*:write", &pbuf)) { return NULL; + } + if (PyUnicode_Check(PyTuple_GET_ITEM(args, 0)) && + PyErr_WarnPy3k("write() argument must be string or buffer, " + "not 'unicode'", 1) < 0) + { + PyBuffer_Release(&pbuf); + return NULL; + } if (_PyVerify_fd(self->fd)) { Py_BEGIN_ALLOW_THREADS From webhook-mailer at python.org Tue Jan 15 07:58:44 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 15 Jan 2019 12:58:44 -0000 Subject: [Python-checkins] bpo-23846: Fix ProactorEventLoop._write_to_self() (GH-11566) Message-ID: https://github.com/python/cpython/commit/c9f872b0bdce5888f1879fa74e098bf4a05430c5 commit: c9f872b0bdce5888f1879fa74e098bf4a05430c5 branch: master author: Victor Stinner committer: GitHub date: 2019-01-15T13:58:38+01:00 summary: bpo-23846: Fix ProactorEventLoop._write_to_self() (GH-11566) asyncio.ProactorEventLoop now catchs and logs send errors when the self-pipe is full: BaseProactorEventLoop._write_to_self() now catchs and logs OSError exceptions, as done by BaseSelectorEventLoop._write_to_self(). files: A Misc/NEWS.d/next/Library/2019-01-15-13-31-30.bpo-23846.LT_qL8.rst M Lib/asyncio/proactor_events.py diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py index 3a1826e2c0d2..a849be1cc147 100644 --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -636,7 +636,13 @@ def _loop_self_reading(self, f=None): f.add_done_callback(self._loop_self_reading) def _write_to_self(self): - self._csock.send(b'\0') + try: + self._csock.send(b'\0') + except OSError: + if self._debug: + logger.debug("Fail to write a null byte into the " + "self-pipe socket", + exc_info=True) def _start_serving(self, protocol_factory, sock, sslcontext=None, server=None, backlog=100, diff --git a/Misc/NEWS.d/next/Library/2019-01-15-13-31-30.bpo-23846.LT_qL8.rst b/Misc/NEWS.d/next/Library/2019-01-15-13-31-30.bpo-23846.LT_qL8.rst new file mode 100644 index 000000000000..788f092df9c1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-15-13-31-30.bpo-23846.LT_qL8.rst @@ -0,0 +1,2 @@ +:class:`asyncio.ProactorEventLoop` now catchs and logs send errors when the +self-pipe is full. 
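Viewed in isolation, the pattern applied here is: a wake-up write to the self-pipe socket is best effort, so a full send buffer must not raise out of the event loop. A minimal sketch with made-up names (wake_up, csock), not the actual loop internals:

    import logging
    import socket

    logger = logging.getLogger(__name__)

    def wake_up(csock: socket.socket, debug: bool = False) -> None:
        # One null byte is enough to wake the loop.  If the non-blocking
        # socket buffer is already full, there is unread data that will
        # wake it anyway, so the failure is only logged in debug mode.
        try:
            csock.send(b'\0')
        except OSError:
            if debug:
                logger.debug("Fail to write a null byte into the "
                             "self-pipe socket", exc_info=True)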
From webhook-mailer at python.org Tue Jan 15 08:17:14 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 15 Jan 2019 13:17:14 -0000 Subject: [Python-checkins] bpo-23846: Fix ProactorEventLoop._write_to_self() (GH-11566) Message-ID: https://github.com/python/cpython/commit/c9f26714d511a338ba2fdd926e3dc62636f31815 commit: c9f26714d511a338ba2fdd926e3dc62636f31815 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-15T05:17:05-08:00 summary: bpo-23846: Fix ProactorEventLoop._write_to_self() (GH-11566) asyncio.ProactorEventLoop now catchs and logs send errors when the self-pipe is full: BaseProactorEventLoop._write_to_self() now catchs and logs OSError exceptions, as done by BaseSelectorEventLoop._write_to_self(). (cherry picked from commit c9f872b0bdce5888f1879fa74e098bf4a05430c5) Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Library/2019-01-15-13-31-30.bpo-23846.LT_qL8.rst M Lib/asyncio/proactor_events.py diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py index 782c86106bce..a638cceda5e4 100644 --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -634,7 +634,13 @@ def _loop_self_reading(self, f=None): f.add_done_callback(self._loop_self_reading) def _write_to_self(self): - self._csock.send(b'\0') + try: + self._csock.send(b'\0') + except OSError: + if self._debug: + logger.debug("Fail to write a null byte into the " + "self-pipe socket", + exc_info=True) def _start_serving(self, protocol_factory, sock, sslcontext=None, server=None, backlog=100, diff --git a/Misc/NEWS.d/next/Library/2019-01-15-13-31-30.bpo-23846.LT_qL8.rst b/Misc/NEWS.d/next/Library/2019-01-15-13-31-30.bpo-23846.LT_qL8.rst new file mode 100644 index 000000000000..788f092df9c1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-15-13-31-30.bpo-23846.LT_qL8.rst @@ -0,0 +1,2 @@ +:class:`asyncio.ProactorEventLoop` now catchs and logs send errors when the +self-pipe is full. From webhook-mailer at python.org Tue Jan 15 17:47:51 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 15 Jan 2019 22:47:51 -0000 Subject: [Python-checkins] bpo-35746: Fix segfault in ssl's cert parser (GH-11569) Message-ID: https://github.com/python/cpython/commit/a37f52436f9aa4b9292878b72f3ff1480e2606c3 commit: a37f52436f9aa4b9292878b72f3ff1480e2606c3 branch: master author: Christian Heimes committer: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> date: 2019-01-15T14:47:42-08:00 summary: bpo-35746: Fix segfault in ssl's cert parser (GH-11569) Fix a NULL pointer deref in ssl module. The cert parser did not handle CRL distribution points with empty DP or URI correctly. A malicious or buggy certificate can result into segfault. 
Signed-off-by: Christian Heimes https://bugs.python.org/issue35746 files: A Lib/test/talos-2019-0758.pem A Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst M Lib/test/test_ssl.py M Modules/_ssl.c diff --git a/Lib/test/talos-2019-0758.pem b/Lib/test/talos-2019-0758.pem new file mode 100644 index 000000000000..13b95a77fd8a --- /dev/null +++ b/Lib/test/talos-2019-0758.pem @@ -0,0 +1,22 @@ +-----BEGIN CERTIFICATE----- +MIIDqDCCApKgAwIBAgIBAjALBgkqhkiG9w0BAQswHzELMAkGA1UEBhMCVUsxEDAO +BgNVBAMTB2NvZHktY2EwHhcNMTgwNjE4MTgwMDU4WhcNMjgwNjE0MTgwMDU4WjA7 +MQswCQYDVQQGEwJVSzEsMCoGA1UEAxMjY29kZW5vbWljb24tdm0tMi50ZXN0Lmxh +bC5jaXNjby5jb20wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC63fGB +J80A9Av1GB0bptslKRIUtJm8EeEu34HkDWbL6AJY0P8WfDtlXjlPaLqFa6sqH6ES +V48prSm1ZUbDSVL8R6BYVYpOlK8/48xk4pGTgRzv69gf5SGtQLwHy8UPBKgjSZoD +5a5k5wJXGswhKFFNqyyxqCvWmMnJWxXTt2XDCiWc4g4YAWi4O4+6SeeHVAV9rV7C +1wxqjzKovVe2uZOHjKEzJbbIU6JBPb6TRfMdRdYOw98n1VXDcKVgdX2DuuqjCzHP +WhU4Tw050M9NaK3eXp4Mh69VuiKoBGOLSOcS8reqHIU46Reg0hqeL8LIL6OhFHIF +j7HR6V1X6F+BfRS/AgMBAAGjgdYwgdMwCQYDVR0TBAIwADAdBgNVHQ4EFgQUOktp +HQjxDXXUg8prleY9jeLKeQ4wTwYDVR0jBEgwRoAUx6zgPygZ0ZErF9sPC4+5e2Io +UU+hI6QhMB8xCzAJBgNVBAYTAlVLMRAwDgYDVQQDEwdjb2R5LWNhggkA1QEAuwb7 +2s0wCQYDVR0SBAIwADAuBgNVHREEJzAlgiNjb2Rlbm9taWNvbi12bS0yLnRlc3Qu +bGFsLmNpc2NvLmNvbTAOBgNVHQ8BAf8EBAMCBaAwCwYDVR0fBAQwAjAAMAsGCSqG +SIb3DQEBCwOCAQEAvqantx2yBlM11RoFiCfi+AfSblXPdrIrHvccepV4pYc/yO6p +t1f2dxHQb8rWH3i6cWag/EgIZx+HJQvo0rgPY1BFJsX1WnYf1/znZpkUBGbVmlJr +t/dW1gSkNS6sPsM0Q+7HPgEv8CPDNK5eo7vU2seE0iWOkxSyVUuiCEY9ZVGaLVit +p0C78nZ35Pdv4I+1cosmHl28+es1WI22rrnmdBpH8J1eY6WvUw2xuZHLeNVN0TzV +Q3qq53AaCWuLOD1AjESWuUCxMZTK9DPS4JKXTK8RLyDeqOvJGjsSWp3kL0y3GaQ+ +10T1rfkKJub2+m9A9duin1fn6tHc2wSvB7m3DA== +-----END CERTIFICATE----- diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index 7f6b93148f45..1fc657f4d867 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -115,6 +115,7 @@ def data_file(*name): BADKEY = data_file("badkey.pem") NOKIACERT = data_file("nokia.pem") NULLBYTECERT = data_file("nullbytecert.pem") +TALOS_INVALID_CRLDP = data_file("talos-2019-0758.pem") DHFILE = data_file("ffdh3072.pem") BYTES_DHFILE = os.fsencode(DHFILE) @@ -348,6 +349,27 @@ def test_parse_cert(self): self.assertEqual(p['crlDistributionPoints'], ('http://SVRIntl-G3-crl.verisign.com/SVRIntlG3.crl',)) + def test_parse_cert_CVE_2019_5010(self): + p = ssl._ssl._test_decode_cert(TALOS_INVALID_CRLDP) + if support.verbose: + sys.stdout.write("\n" + pprint.pformat(p) + "\n") + self.assertEqual( + p, + { + 'issuer': ( + (('countryName', 'UK'),), (('commonName', 'cody-ca'),)), + 'notAfter': 'Jun 14 18:00:58 2028 GMT', + 'notBefore': 'Jun 18 18:00:58 2018 GMT', + 'serialNumber': '02', + 'subject': ((('countryName', 'UK'),), + (('commonName', + 'codenomicon-vm-2.test.lal.cisco.com'),)), + 'subjectAltName': ( + ('DNS', 'codenomicon-vm-2.test.lal.cisco.com'),), + 'version': 3 + } + ) + def test_parse_cert_CVE_2013_4238(self): p = ssl._ssl._test_decode_cert(NULLBYTECERT) if support.verbose: diff --git a/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst b/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst new file mode 100644 index 000000000000..dffe347eec84 --- /dev/null +++ b/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst @@ -0,0 +1,3 @@ +[CVE-2019-5010] Fix a NULL pointer deref in ssl module. The cert parser did +not handle CRL distribution points with empty DP or URI correctly. A +malicious or buggy certificate can result into segfault. 
diff --git a/Modules/_ssl.c b/Modules/_ssl.c index 4e3352d9e661..0e720e268d93 100644 --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -1515,6 +1515,10 @@ _get_crl_dp(X509 *certificate) { STACK_OF(GENERAL_NAME) *gns; dp = sk_DIST_POINT_value(dps, i); + if (dp->distpoint == NULL) { + /* Ignore empty DP value, CVE-2019-5010 */ + continue; + } gns = dp->distpoint->name.fullname; for (j=0; j < sk_GENERAL_NAME_num(gns); j++) { From webhook-mailer at python.org Tue Jan 15 18:02:39 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 15 Jan 2019 23:02:39 -0000 Subject: [Python-checkins] bpo-35537: subprocess uses os.posix_spawn in some cases (GH-11452) Message-ID: https://github.com/python/cpython/commit/9daecf37a571e98aaf43a387bcc9e41a7132f477 commit: 9daecf37a571e98aaf43a387bcc9e41a7132f477 branch: master author: Victor Stinner committer: GitHub date: 2019-01-16T00:02:35+01:00 summary: bpo-35537: subprocess uses os.posix_spawn in some cases (GH-11452) The subprocess module can now use the os.posix_spawn() function in some cases for better performance. Currently, it is only used on macOS and Linux (using glibc 2.24 or newer) if all these conditions are met: * executable path contains a directory * close_fds=False * preexec_fn, pass_fds, cwd, stdin, stdout, stderr and start_new_session parameters are not set Co-authored-by: Joannah Nanjekye files: A Misc/NEWS.d/next/Library/2018-12-20-16-24-51.bpo-35537.z4E7aA.rst M Doc/whatsnew/3.8.rst M Lib/subprocess.py M Lib/test/pythoninfo.py diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index 370ef4604834..053fe902c481 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -275,6 +275,15 @@ xml Optimizations ============= +* The :mod:`subprocess` module can now use the :func:`os.posix_spawn` function + in some cases for better performance. Currently, it is only used on macOS + and Linux (using glibc 2.24 or newer) if all these conditions are met: + + * *close_fds* is false; + * *preexec_fn*, *pass_fds*, *cwd*, *stdin*, *stdout*, *stderr* and + *start_new_session* parameters are not set; + * the *executable* path contains a directory. + * :func:`shutil.copyfile`, :func:`shutil.copy`, :func:`shutil.copy2`, :func:`shutil.copytree` and :func:`shutil.move` use platform-specific "fast-copy" syscalls on Linux, macOS and Solaris in order to copy the file diff --git a/Lib/subprocess.py b/Lib/subprocess.py index 696617697047..b94575b8401e 100644 --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -606,6 +606,57 @@ def getoutput(cmd): return getstatusoutput(cmd)[1] +def _use_posix_spawn(): + """Check is posix_spawn() can be used for subprocess. + + subprocess requires a posix_spawn() implementation that reports properly + errors to the parent process, set errno on the following failures: + + * process attribute actions failed + * file actions failed + * exec() failed + + Prefer an implementation which can use vfork in some cases for best + performances. 
+ """ + if _mswindows or not hasattr(os, 'posix_spawn'): + # os.posix_spawn() is not available + return False + + if sys.platform == 'darwin': + # posix_spawn() is a syscall on macOS and properly reports errors + return True + + # Check libc name and runtime libc version + try: + ver = os.confstr('CS_GNU_LIBC_VERSION') + # parse 'glibc 2.28' as ('glibc', (2, 28)) + parts = ver.split(maxsplit=1) + if len(parts) != 2: + # reject unknown format + raise ValueError + libc = parts[0] + version = tuple(map(int, parts[1].split('.'))) + + if sys.platform == 'linux' and libc == 'glibc' and version >= (2, 24): + # glibc 2.24 has a new Linux posix_spawn implementation using vfork + # which properly reports errors to the parent process. + return True + # Note: Don't use the POSIX implementation of glibc because it doesn't + # use vfork (even if glibc 2.26 added a pipe to properly report errors + # to the parent process). + except (AttributeError, ValueError, OSError): + # os.confstr() or CS_GNU_LIBC_VERSION value not available + pass + + # By default, consider that the implementation does not properly report + # errors. + return False + + +_USE_POSIX_SPAWN = _use_posix_spawn() + + class Popen(object): """ Execute a child program in a new process. @@ -1390,6 +1441,23 @@ def _get_handles(self, stdin, stdout, stderr): errread, errwrite) + def _posix_spawn(self, args, executable, env, restore_signals): + """Execute program using os.posix_spawn().""" + if env is None: + env = os.environ + + kwargs = {} + if restore_signals: + # See _Py_RestoreSignals() in Python/pylifecycle.c + sigset = [] + for signame in ('SIGPIPE', 'SIGXFZ', 'SIGXFSZ'): + signum = getattr(signal, signame, None) + if signum is not None: + sigset.append(signum) + kwargs['setsigdef'] = sigset + + self.pid = os.posix_spawn(executable, args, env, **kwargs) + def _execute_child(self, args, executable, preexec_fn, close_fds, pass_fds, cwd, env, startupinfo, creationflags, shell, @@ -1414,6 +1482,20 @@ def _execute_child(self, args, executable, preexec_fn, close_fds, if executable is None: executable = args[0] + + if (_USE_POSIX_SPAWN + and os.path.dirname(executable) + and preexec_fn is None + and not close_fds + and not pass_fds + and cwd is None + and p2cread == p2cwrite == -1 + and c2pread == c2pwrite == -1 + and errread == errwrite == -1 + and not start_new_session): + self._posix_spawn(args, executable, env, restore_signals) + return + orig_executable = executable # For transferring possible exec failure from child to parent. diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 7ce6bf7b1ab9..7e94a31cecea 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -610,6 +610,11 @@ def collect_get_config(info_add): info_add('%s[%s]' % (prefix, key), repr(config[key])) +def collect_subprocess(info_add): + import subprocess + copy_attributes(info_add, subprocess, 'subprocess.%s', ('_USE_POSIX_SPAWN',)) + + def collect_info(info): error = False info_add = info.add @@ -639,6 +644,7 @@ def collect_info(info): collect_cc, collect_gdbm, collect_get_config, + collect_subprocess, # Collecting from tests should be last as they have side effects. 
collect_test_socket, diff --git a/Misc/NEWS.d/next/Library/2018-12-20-16-24-51.bpo-35537.z4E7aA.rst b/Misc/NEWS.d/next/Library/2018-12-20-16-24-51.bpo-35537.z4E7aA.rst new file mode 100644 index 000000000000..b14d7493bc60 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-12-20-16-24-51.bpo-35537.z4E7aA.rst @@ -0,0 +1,2 @@ +The :mod:`subprocess` module can now use the :func:`os.posix_spawn` function in +some cases for better performance. From webhook-mailer at python.org Tue Jan 15 18:03:39 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 15 Jan 2019 23:03:39 -0000 Subject: [Python-checkins] bpo-35746: Fix segfault in ssl's cert parser (GH-11569) Message-ID: https://github.com/python/cpython/commit/be5de958e9052e322b0087c6dba81cdad0c3e031 commit: be5de958e9052e322b0087c6dba81cdad0c3e031 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-15T15:03:36-08:00 summary: bpo-35746: Fix segfault in ssl's cert parser (GH-11569) Fix a NULL pointer deref in ssl module. The cert parser did not handle CRL distribution points with empty DP or URI correctly. A malicious or buggy certificate can result into segfault. Signed-off-by: Christian Heimes https://bugs.python.org/issue35746 (cherry picked from commit a37f52436f9aa4b9292878b72f3ff1480e2606c3) Co-authored-by: Christian Heimes files: A Lib/test/talos-2019-0758.pem A Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst M Lib/test/test_ssl.py M Modules/_ssl.c diff --git a/Lib/test/talos-2019-0758.pem b/Lib/test/talos-2019-0758.pem new file mode 100644 index 000000000000..13b95a77fd8a --- /dev/null +++ b/Lib/test/talos-2019-0758.pem @@ -0,0 +1,22 @@ +-----BEGIN CERTIFICATE----- +MIIDqDCCApKgAwIBAgIBAjALBgkqhkiG9w0BAQswHzELMAkGA1UEBhMCVUsxEDAO +BgNVBAMTB2NvZHktY2EwHhcNMTgwNjE4MTgwMDU4WhcNMjgwNjE0MTgwMDU4WjA7 +MQswCQYDVQQGEwJVSzEsMCoGA1UEAxMjY29kZW5vbWljb24tdm0tMi50ZXN0Lmxh +bC5jaXNjby5jb20wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC63fGB +J80A9Av1GB0bptslKRIUtJm8EeEu34HkDWbL6AJY0P8WfDtlXjlPaLqFa6sqH6ES +V48prSm1ZUbDSVL8R6BYVYpOlK8/48xk4pGTgRzv69gf5SGtQLwHy8UPBKgjSZoD +5a5k5wJXGswhKFFNqyyxqCvWmMnJWxXTt2XDCiWc4g4YAWi4O4+6SeeHVAV9rV7C +1wxqjzKovVe2uZOHjKEzJbbIU6JBPb6TRfMdRdYOw98n1VXDcKVgdX2DuuqjCzHP +WhU4Tw050M9NaK3eXp4Mh69VuiKoBGOLSOcS8reqHIU46Reg0hqeL8LIL6OhFHIF +j7HR6V1X6F+BfRS/AgMBAAGjgdYwgdMwCQYDVR0TBAIwADAdBgNVHQ4EFgQUOktp +HQjxDXXUg8prleY9jeLKeQ4wTwYDVR0jBEgwRoAUx6zgPygZ0ZErF9sPC4+5e2Io +UU+hI6QhMB8xCzAJBgNVBAYTAlVLMRAwDgYDVQQDEwdjb2R5LWNhggkA1QEAuwb7 +2s0wCQYDVR0SBAIwADAuBgNVHREEJzAlgiNjb2Rlbm9taWNvbi12bS0yLnRlc3Qu +bGFsLmNpc2NvLmNvbTAOBgNVHQ8BAf8EBAMCBaAwCwYDVR0fBAQwAjAAMAsGCSqG +SIb3DQEBCwOCAQEAvqantx2yBlM11RoFiCfi+AfSblXPdrIrHvccepV4pYc/yO6p +t1f2dxHQb8rWH3i6cWag/EgIZx+HJQvo0rgPY1BFJsX1WnYf1/znZpkUBGbVmlJr +t/dW1gSkNS6sPsM0Q+7HPgEv8CPDNK5eo7vU2seE0iWOkxSyVUuiCEY9ZVGaLVit +p0C78nZ35Pdv4I+1cosmHl28+es1WI22rrnmdBpH8J1eY6WvUw2xuZHLeNVN0TzV +Q3qq53AaCWuLOD1AjESWuUCxMZTK9DPS4JKXTK8RLyDeqOvJGjsSWp3kL0y3GaQ+ +10T1rfkKJub2+m9A9duin1fn6tHc2wSvB7m3DA== +-----END CERTIFICATE----- diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index f1b9565c8d91..b6794ce3a817 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -116,6 +116,7 @@ def data_file(*name): BADKEY = data_file("badkey.pem") NOKIACERT = data_file("nokia.pem") NULLBYTECERT = data_file("nullbytecert.pem") +TALOS_INVALID_CRLDP = data_file("talos-2019-0758.pem") DHFILE = data_file("ffdh3072.pem") BYTES_DHFILE = os.fsencode(DHFILE) @@ -365,6 +366,27 @@ def 
test_parse_cert(self): self.assertEqual(p['crlDistributionPoints'], ('http://SVRIntl-G3-crl.verisign.com/SVRIntlG3.crl',)) + def test_parse_cert_CVE_2019_5010(self): + p = ssl._ssl._test_decode_cert(TALOS_INVALID_CRLDP) + if support.verbose: + sys.stdout.write("\n" + pprint.pformat(p) + "\n") + self.assertEqual( + p, + { + 'issuer': ( + (('countryName', 'UK'),), (('commonName', 'cody-ca'),)), + 'notAfter': 'Jun 14 18:00:58 2028 GMT', + 'notBefore': 'Jun 18 18:00:58 2018 GMT', + 'serialNumber': '02', + 'subject': ((('countryName', 'UK'),), + (('commonName', + 'codenomicon-vm-2.test.lal.cisco.com'),)), + 'subjectAltName': ( + ('DNS', 'codenomicon-vm-2.test.lal.cisco.com'),), + 'version': 3 + } + ) + def test_parse_cert_CVE_2013_4238(self): p = ssl._ssl._test_decode_cert(NULLBYTECERT) if support.verbose: diff --git a/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst b/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst new file mode 100644 index 000000000000..dffe347eec84 --- /dev/null +++ b/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst @@ -0,0 +1,3 @@ +[CVE-2019-5010] Fix a NULL pointer deref in ssl module. The cert parser did +not handle CRL distribution points with empty DP or URI correctly. A +malicious or buggy certificate can result into segfault. diff --git a/Modules/_ssl.c b/Modules/_ssl.c index 9894ad821d63..9baec8a9bc84 100644 --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -1516,6 +1516,10 @@ _get_crl_dp(X509 *certificate) { STACK_OF(GENERAL_NAME) *gns; dp = sk_DIST_POINT_value(dps, i); + if (dp->distpoint == NULL) { + /* Ignore empty DP value, CVE-2019-5010 */ + continue; + } gns = dp->distpoint->name.fullname; for (j=0; j < sk_GENERAL_NAME_num(gns); j++) { From webhook-mailer at python.org Tue Jan 15 18:11:56 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 15 Jan 2019 23:11:56 -0000 Subject: [Python-checkins] bpo-35746: Fix segfault in ssl's cert parser (GH-11569) Message-ID: https://github.com/python/cpython/commit/06b15424b0dcacb1c551b2a36e739fffa8d0c595 commit: 06b15424b0dcacb1c551b2a36e739fffa8d0c595 branch: 2.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-15T15:11:52-08:00 summary: bpo-35746: Fix segfault in ssl's cert parser (GH-11569) Fix a NULL pointer deref in ssl module. The cert parser did not handle CRL distribution points with empty DP or URI correctly. A malicious or buggy certificate can result into segfault. 
Signed-off-by: Christian Heimes https://bugs.python.org/issue35746 (cherry picked from commit a37f52436f9aa4b9292878b72f3ff1480e2606c3) Co-authored-by: Christian Heimes files: A Lib/test/talos-2019-0758.pem A Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst M Lib/test/test_ssl.py M Modules/_ssl.c diff --git a/Lib/test/talos-2019-0758.pem b/Lib/test/talos-2019-0758.pem new file mode 100644 index 000000000000..13b95a77fd8a --- /dev/null +++ b/Lib/test/talos-2019-0758.pem @@ -0,0 +1,22 @@ +-----BEGIN CERTIFICATE----- +MIIDqDCCApKgAwIBAgIBAjALBgkqhkiG9w0BAQswHzELMAkGA1UEBhMCVUsxEDAO +BgNVBAMTB2NvZHktY2EwHhcNMTgwNjE4MTgwMDU4WhcNMjgwNjE0MTgwMDU4WjA7 +MQswCQYDVQQGEwJVSzEsMCoGA1UEAxMjY29kZW5vbWljb24tdm0tMi50ZXN0Lmxh +bC5jaXNjby5jb20wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC63fGB +J80A9Av1GB0bptslKRIUtJm8EeEu34HkDWbL6AJY0P8WfDtlXjlPaLqFa6sqH6ES +V48prSm1ZUbDSVL8R6BYVYpOlK8/48xk4pGTgRzv69gf5SGtQLwHy8UPBKgjSZoD +5a5k5wJXGswhKFFNqyyxqCvWmMnJWxXTt2XDCiWc4g4YAWi4O4+6SeeHVAV9rV7C +1wxqjzKovVe2uZOHjKEzJbbIU6JBPb6TRfMdRdYOw98n1VXDcKVgdX2DuuqjCzHP +WhU4Tw050M9NaK3eXp4Mh69VuiKoBGOLSOcS8reqHIU46Reg0hqeL8LIL6OhFHIF +j7HR6V1X6F+BfRS/AgMBAAGjgdYwgdMwCQYDVR0TBAIwADAdBgNVHQ4EFgQUOktp +HQjxDXXUg8prleY9jeLKeQ4wTwYDVR0jBEgwRoAUx6zgPygZ0ZErF9sPC4+5e2Io +UU+hI6QhMB8xCzAJBgNVBAYTAlVLMRAwDgYDVQQDEwdjb2R5LWNhggkA1QEAuwb7 +2s0wCQYDVR0SBAIwADAuBgNVHREEJzAlgiNjb2Rlbm9taWNvbi12bS0yLnRlc3Qu +bGFsLmNpc2NvLmNvbTAOBgNVHQ8BAf8EBAMCBaAwCwYDVR0fBAQwAjAAMAsGCSqG +SIb3DQEBCwOCAQEAvqantx2yBlM11RoFiCfi+AfSblXPdrIrHvccepV4pYc/yO6p +t1f2dxHQb8rWH3i6cWag/EgIZx+HJQvo0rgPY1BFJsX1WnYf1/znZpkUBGbVmlJr +t/dW1gSkNS6sPsM0Q+7HPgEv8CPDNK5eo7vU2seE0iWOkxSyVUuiCEY9ZVGaLVit +p0C78nZ35Pdv4I+1cosmHl28+es1WI22rrnmdBpH8J1eY6WvUw2xuZHLeNVN0TzV +Q3qq53AaCWuLOD1AjESWuUCxMZTK9DPS4JKXTK8RLyDeqOvJGjsSWp3kL0y3GaQ+ +10T1rfkKJub2+m9A9duin1fn6tHc2wSvB7m3DA== +-----END CERTIFICATE----- diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index e47603170253..9240184d98a4 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -72,6 +72,7 @@ def data_file(*name): BADKEY = data_file("badkey.pem") NOKIACERT = data_file("nokia.pem") NULLBYTECERT = data_file("nullbytecert.pem") +TALOS_INVALID_CRLDP = data_file("talos-2019-0758.pem") DHFILE = data_file("ffdh3072.pem") BYTES_DHFILE = DHFILE.encode(sys.getfilesystemencoding()) @@ -227,6 +228,27 @@ def test_parse_cert(self): self.assertEqual(p['crlDistributionPoints'], ('http://SVRIntl-G3-crl.verisign.com/SVRIntlG3.crl',)) + def test_parse_cert_CVE_2019_5010(self): + p = ssl._ssl._test_decode_cert(TALOS_INVALID_CRLDP) + if support.verbose: + sys.stdout.write("\n" + pprint.pformat(p) + "\n") + self.assertEqual( + p, + { + 'issuer': ( + (('countryName', 'UK'),), (('commonName', 'cody-ca'),)), + 'notAfter': 'Jun 14 18:00:58 2028 GMT', + 'notBefore': 'Jun 18 18:00:58 2018 GMT', + 'serialNumber': '02', + 'subject': ((('countryName', 'UK'),), + (('commonName', + 'codenomicon-vm-2.test.lal.cisco.com'),)), + 'subjectAltName': ( + ('DNS', 'codenomicon-vm-2.test.lal.cisco.com'),), + 'version': 3 + } + ) + def test_parse_cert_CVE_2013_4238(self): p = ssl._ssl._test_decode_cert(NULLBYTECERT) if support.verbose: diff --git a/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst b/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst new file mode 100644 index 000000000000..dffe347eec84 --- /dev/null +++ b/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst @@ -0,0 +1,3 @@ +[CVE-2019-5010] Fix a NULL pointer deref in ssl module. 
The cert parser did +not handle CRL distribution points with empty DP or URI correctly. A +malicious or buggy certificate can result into segfault. diff --git a/Modules/_ssl.c b/Modules/_ssl.c index a96c41926036..19bb1207b48f 100644 --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -1223,6 +1223,10 @@ _get_crl_dp(X509 *certificate) { STACK_OF(GENERAL_NAME) *gns; dp = sk_DIST_POINT_value(dps, i); + if (dp->distpoint == NULL) { + /* Ignore empty DP value, CVE-2019-5010 */ + continue; + } gns = dp->distpoint->name.fullname; for (j=0; j < sk_GENERAL_NAME_num(gns); j++) { From webhook-mailer at python.org Tue Jan 15 20:16:41 2019 From: webhook-mailer at python.org (Ned Deily) Date: Wed, 16 Jan 2019 01:16:41 -0000 Subject: [Python-checkins] bpo-35746: Fix segfault in ssl's cert parser (GH-11569) (GH-11573) Message-ID: https://github.com/python/cpython/commit/216a4d83c3b72f4fdcd81b588dc3f42cc461739a commit: 216a4d83c3b72f4fdcd81b588dc3f42cc461739a branch: 3.6 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Ned Deily date: 2019-01-15T20:16:36-05:00 summary: bpo-35746: Fix segfault in ssl's cert parser (GH-11569) (GH-11573) Fix a NULL pointer deref in ssl module. The cert parser did not handle CRL distribution points with empty DP or URI correctly. A malicious or buggy certificate can result into segfault. Signed-off-by: Christian Heimes https://bugs.python.org/issue35746 (cherry picked from commit a37f52436f9aa4b9292878b72f3ff1480e2606c3) Co-authored-by: Christian Heimes files: A Lib/test/talos-2019-0758.pem A Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst M Lib/test/test_ssl.py M Modules/_ssl.c diff --git a/Lib/test/talos-2019-0758.pem b/Lib/test/talos-2019-0758.pem new file mode 100644 index 000000000000..13b95a77fd8a --- /dev/null +++ b/Lib/test/talos-2019-0758.pem @@ -0,0 +1,22 @@ +-----BEGIN CERTIFICATE----- +MIIDqDCCApKgAwIBAgIBAjALBgkqhkiG9w0BAQswHzELMAkGA1UEBhMCVUsxEDAO +BgNVBAMTB2NvZHktY2EwHhcNMTgwNjE4MTgwMDU4WhcNMjgwNjE0MTgwMDU4WjA7 +MQswCQYDVQQGEwJVSzEsMCoGA1UEAxMjY29kZW5vbWljb24tdm0tMi50ZXN0Lmxh +bC5jaXNjby5jb20wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC63fGB +J80A9Av1GB0bptslKRIUtJm8EeEu34HkDWbL6AJY0P8WfDtlXjlPaLqFa6sqH6ES +V48prSm1ZUbDSVL8R6BYVYpOlK8/48xk4pGTgRzv69gf5SGtQLwHy8UPBKgjSZoD +5a5k5wJXGswhKFFNqyyxqCvWmMnJWxXTt2XDCiWc4g4YAWi4O4+6SeeHVAV9rV7C +1wxqjzKovVe2uZOHjKEzJbbIU6JBPb6TRfMdRdYOw98n1VXDcKVgdX2DuuqjCzHP +WhU4Tw050M9NaK3eXp4Mh69VuiKoBGOLSOcS8reqHIU46Reg0hqeL8LIL6OhFHIF +j7HR6V1X6F+BfRS/AgMBAAGjgdYwgdMwCQYDVR0TBAIwADAdBgNVHQ4EFgQUOktp +HQjxDXXUg8prleY9jeLKeQ4wTwYDVR0jBEgwRoAUx6zgPygZ0ZErF9sPC4+5e2Io +UU+hI6QhMB8xCzAJBgNVBAYTAlVLMRAwDgYDVQQDEwdjb2R5LWNhggkA1QEAuwb7 +2s0wCQYDVR0SBAIwADAuBgNVHREEJzAlgiNjb2Rlbm9taWNvbi12bS0yLnRlc3Qu +bGFsLmNpc2NvLmNvbTAOBgNVHQ8BAf8EBAMCBaAwCwYDVR0fBAQwAjAAMAsGCSqG +SIb3DQEBCwOCAQEAvqantx2yBlM11RoFiCfi+AfSblXPdrIrHvccepV4pYc/yO6p +t1f2dxHQb8rWH3i6cWag/EgIZx+HJQvo0rgPY1BFJsX1WnYf1/znZpkUBGbVmlJr +t/dW1gSkNS6sPsM0Q+7HPgEv8CPDNK5eo7vU2seE0iWOkxSyVUuiCEY9ZVGaLVit +p0C78nZ35Pdv4I+1cosmHl28+es1WI22rrnmdBpH8J1eY6WvUw2xuZHLeNVN0TzV +Q3qq53AaCWuLOD1AjESWuUCxMZTK9DPS4JKXTK8RLyDeqOvJGjsSWp3kL0y3GaQ+ +10T1rfkKJub2+m9A9duin1fn6tHc2wSvB7m3DA== +-----END CERTIFICATE----- diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index 705f1d3245b6..0aeabc10f2a9 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -79,6 +79,7 @@ def data_file(*name): BADKEY = data_file("badkey.pem") NOKIACERT = data_file("nokia.pem") NULLBYTECERT = data_file("nullbytecert.pem") +TALOS_INVALID_CRLDP = 
data_file("talos-2019-0758.pem") DHFILE = data_file("ffdh3072.pem") BYTES_DHFILE = os.fsencode(DHFILE) @@ -293,6 +294,27 @@ def test_parse_cert(self): self.assertEqual(p['crlDistributionPoints'], ('http://SVRIntl-G3-crl.verisign.com/SVRIntlG3.crl',)) + def test_parse_cert_CVE_2019_5010(self): + p = ssl._ssl._test_decode_cert(TALOS_INVALID_CRLDP) + if support.verbose: + sys.stdout.write("\n" + pprint.pformat(p) + "\n") + self.assertEqual( + p, + { + 'issuer': ( + (('countryName', 'UK'),), (('commonName', 'cody-ca'),)), + 'notAfter': 'Jun 14 18:00:58 2028 GMT', + 'notBefore': 'Jun 18 18:00:58 2018 GMT', + 'serialNumber': '02', + 'subject': ((('countryName', 'UK'),), + (('commonName', + 'codenomicon-vm-2.test.lal.cisco.com'),)), + 'subjectAltName': ( + ('DNS', 'codenomicon-vm-2.test.lal.cisco.com'),), + 'version': 3 + } + ) + def test_parse_cert_CVE_2013_4238(self): p = ssl._ssl._test_decode_cert(NULLBYTECERT) if support.verbose: diff --git a/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst b/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst new file mode 100644 index 000000000000..dffe347eec84 --- /dev/null +++ b/Misc/NEWS.d/next/Security/2019-01-15-18-16-05.bpo-35746.nMSd0j.rst @@ -0,0 +1,3 @@ +[CVE-2019-5010] Fix a NULL pointer deref in ssl module. The cert parser did +not handle CRL distribution points with empty DP or URI correctly. A +malicious or buggy certificate can result into segfault. diff --git a/Modules/_ssl.c b/Modules/_ssl.c index a188d6a7291a..7365630a5eaf 100644 --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -1338,6 +1338,10 @@ _get_crl_dp(X509 *certificate) { STACK_OF(GENERAL_NAME) *gns; dp = sk_DIST_POINT_value(dps, i); + if (dp->distpoint == NULL) { + /* Ignore empty DP value, CVE-2019-5010 */ + continue; + } gns = dp->distpoint->name.fullname; for (j=0; j < sk_GENERAL_NAME_num(gns); j++) { From solipsis at pitrou.net Wed Jan 16 04:10:23 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 16 Jan 2019 09:10:23 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=-3 Message-ID: <20190116091023.1.E59990008DCDFA26@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [-7, 1, 0] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [1, 0, -2] memory blocks, sum=-1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogGcPBfk', '--timeout', '7200'] From webhook-mailer at python.org Wed Jan 16 08:29:32 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Wed, 16 Jan 2019 13:29:32 -0000 Subject: [Python-checkins] bpo-35674: Add os.posix_spawnp() (GH-11554) Message-ID: https://github.com/python/cpython/commit/92b8322e7ea04b239cb1cb87b78d952f13ddfebb commit: 92b8322e7ea04b239cb1cb87b78d952f13ddfebb branch: master author: Joannah Nanjekye <33177550+nanjekyejoannah at users.noreply.github.com> committer: Victor Stinner date: 2019-01-16T14:29:26+01:00 summary: bpo-35674: Add os.posix_spawnp() (GH-11554) Add a new os.posix_spawnp() function. 
files: A Misc/NEWS.d/next/Library/2019-01-14-14-13-08.bpo-35674.kamWqz.rst M Doc/library/os.rst M Lib/test/test_posix.py M Modules/clinic/posixmodule.c.h M Modules/posixmodule.c M aclocal.m4 M configure M configure.ac M pyconfig.h.in diff --git a/Doc/library/os.rst b/Doc/library/os.rst index 16250e29774f..2aa51f875d9e 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -3396,6 +3396,10 @@ written in Python, such as a mail server's external command delivery program. The positional-only arguments *path*, *args*, and *env* are similar to :func:`execve`. + The *path* parameter is the path to the executable file.The *path* should + contain a directory.Use :func:`posix_spawnp` to pass an executable file + without directory. + The *file_actions* argument may be a sequence of tuples describing actions to take on specific file descriptors in the child process between the C library implementation's :c:func:`fork` and :c:func:`exec` steps. @@ -3459,6 +3463,19 @@ written in Python, such as a mail server's external command delivery program. .. versionadded:: 3.7 +.. function:: posix_spawnp(path, argv, env, *, file_actions=None, \ + setpgroup=None, resetids=False, setsigmask=(), \ + setsigdef=(), scheduler=None) + + Wraps the :c:func:`posix_spawnp` C library API for use from Python. + + Similar to :func:`posix_spawn` except that the system searches + for the *executable* file in the list of directories specified by the + :envvar:`PATH` environment variable (in the same way as for ``execvp(3)``). + + .. versionadded:: 3.8 + + .. function:: register_at_fork(*, before=None, after_in_parent=None, \ after_in_child=None) diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index d7e512c99f03..79fc3c264418 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -1489,10 +1489,10 @@ def test_setgroups(self): self.assertListEqual(groups, posix.getgroups()) - at unittest.skipUnless(hasattr(os, 'posix_spawn'), "test needs os.posix_spawn") -class TestPosixSpawn(unittest.TestCase): - # Program which does nothing and exit with status 0 (success) +class _PosixSpawnMixin: + # Program which does nothing and exits with status 0 (success) NOOP_PROGRAM = (sys.executable, '-I', '-S', '-c', 'pass') + spawn_func = None def python_args(self, *args): # Disable site module to avoid side effects. 
For example, @@ -1511,7 +1511,7 @@ def test_returns_pid(self): pidfile.write(str(os.getpid())) """ args = self.python_args('-c', script) - pid = posix.posix_spawn(args[0], args, os.environ) + pid = self.spawn_func(args[0], args, os.environ) self.assertEqual(os.waitpid(pid, 0), (pid, 0)) with open(pidfile) as f: self.assertEqual(f.read(), str(pid)) @@ -1519,9 +1519,9 @@ def test_returns_pid(self): def test_no_such_executable(self): no_such_executable = 'no_such_executable' try: - pid = posix.posix_spawn(no_such_executable, - [no_such_executable], - os.environ) + pid = self.spawn_func(no_such_executable, + [no_such_executable], + os.environ) except FileNotFoundError as exc: self.assertEqual(exc.filename, no_such_executable) else: @@ -1538,14 +1538,14 @@ def test_specify_environment(self): envfile.write(os.environ['foo']) """ args = self.python_args('-c', script) - pid = posix.posix_spawn(args[0], args, - {**os.environ, 'foo': 'bar'}) + pid = self.spawn_func(args[0], args, + {**os.environ, 'foo': 'bar'}) self.assertEqual(os.waitpid(pid, 0), (pid, 0)) with open(envfile) as f: self.assertEqual(f.read(), 'bar') def test_empty_file_actions(self): - pid = posix.posix_spawn( + pid = self.spawn_func( self.NOOP_PROGRAM[0], self.NOOP_PROGRAM, os.environ, @@ -1554,7 +1554,7 @@ def test_empty_file_actions(self): self.assertEqual(os.waitpid(pid, 0), (pid, 0)) def test_resetids_explicit_default(self): - pid = posix.posix_spawn( + pid = self.spawn_func( sys.executable, [sys.executable, '-c', 'pass'], os.environ, @@ -1563,7 +1563,7 @@ def test_resetids_explicit_default(self): self.assertEqual(os.waitpid(pid, 0), (pid, 0)) def test_resetids(self): - pid = posix.posix_spawn( + pid = self.spawn_func( sys.executable, [sys.executable, '-c', 'pass'], os.environ, @@ -1573,12 +1573,12 @@ def test_resetids(self): def test_resetids_wrong_type(self): with self.assertRaises(TypeError): - posix.posix_spawn(sys.executable, - [sys.executable, "-c", "pass"], - os.environ, resetids=None) + self.spawn_func(sys.executable, + [sys.executable, "-c", "pass"], + os.environ, resetids=None) def test_setpgroup(self): - pid = posix.posix_spawn( + pid = self.spawn_func( sys.executable, [sys.executable, '-c', 'pass'], os.environ, @@ -1588,9 +1588,9 @@ def test_setpgroup(self): def test_setpgroup_wrong_type(self): with self.assertRaises(TypeError): - posix.posix_spawn(sys.executable, - [sys.executable, "-c", "pass"], - os.environ, setpgroup="023") + self.spawn_func(sys.executable, + [sys.executable, "-c", "pass"], + os.environ, setpgroup="023") @unittest.skipUnless(hasattr(signal, 'pthread_sigmask'), 'need signal.pthread_sigmask()') @@ -1599,7 +1599,7 @@ def test_setsigmask(self): import signal signal.raise_signal(signal.SIGUSR1)""") - pid = posix.posix_spawn( + pid = self.spawn_func( sys.executable, [sys.executable, '-c', code], os.environ, @@ -1609,18 +1609,18 @@ def test_setsigmask(self): def test_setsigmask_wrong_type(self): with self.assertRaises(TypeError): - posix.posix_spawn(sys.executable, - [sys.executable, "-c", "pass"], - os.environ, setsigmask=34) + self.spawn_func(sys.executable, + [sys.executable, "-c", "pass"], + os.environ, setsigmask=34) with self.assertRaises(TypeError): - posix.posix_spawn(sys.executable, - [sys.executable, "-c", "pass"], - os.environ, setsigmask=["j"]) + self.spawn_func(sys.executable, + [sys.executable, "-c", "pass"], + os.environ, setsigmask=["j"]) with self.assertRaises(ValueError): - posix.posix_spawn(sys.executable, - [sys.executable, "-c", "pass"], - os.environ, setsigmask=[signal.NSIG, - 
signal.NSIG+1]) + self.spawn_func(sys.executable, + [sys.executable, "-c", "pass"], + os.environ, setsigmask=[signal.NSIG, + signal.NSIG+1]) @unittest.skipUnless(hasattr(signal, 'pthread_sigmask'), 'need signal.pthread_sigmask()') @@ -1630,7 +1630,7 @@ def test_setsigdef(self): import signal signal.raise_signal(signal.SIGUSR1)""") try: - pid = posix.posix_spawn( + pid = self.spawn_func( sys.executable, [sys.executable, '-c', code], os.environ, @@ -1646,17 +1646,17 @@ def test_setsigdef(self): def test_setsigdef_wrong_type(self): with self.assertRaises(TypeError): - posix.posix_spawn(sys.executable, - [sys.executable, "-c", "pass"], - os.environ, setsigdef=34) + self.spawn_func(sys.executable, + [sys.executable, "-c", "pass"], + os.environ, setsigdef=34) with self.assertRaises(TypeError): - posix.posix_spawn(sys.executable, - [sys.executable, "-c", "pass"], - os.environ, setsigdef=["j"]) + self.spawn_func(sys.executable, + [sys.executable, "-c", "pass"], + os.environ, setsigdef=["j"]) with self.assertRaises(ValueError): - posix.posix_spawn(sys.executable, - [sys.executable, "-c", "pass"], - os.environ, setsigdef=[signal.NSIG, signal.NSIG+1]) + self.spawn_func(sys.executable, + [sys.executable, "-c", "pass"], + os.environ, setsigdef=[signal.NSIG, signal.NSIG+1]) @requires_sched @unittest.skipIf(sys.platform.startswith(('freebsd', 'netbsd')), @@ -1670,7 +1670,7 @@ def test_setscheduler_only_param(self): sys.exit(101) if os.sched_getparam(0).sched_priority != {priority}: sys.exit(102)""") - pid = posix.posix_spawn( + pid = self.spawn_func( sys.executable, [sys.executable, '-c', code], os.environ, @@ -1690,7 +1690,7 @@ def test_setscheduler_with_policy(self): sys.exit(101) if os.sched_getparam(0).sched_priority != {priority}: sys.exit(102)""") - pid = posix.posix_spawn( + pid = self.spawn_func( sys.executable, [sys.executable, '-c', code], os.environ, @@ -1704,40 +1704,40 @@ def test_multiple_file_actions(self): (os.POSIX_SPAWN_CLOSE, 0), (os.POSIX_SPAWN_DUP2, 1, 4), ] - pid = posix.posix_spawn(self.NOOP_PROGRAM[0], - self.NOOP_PROGRAM, - os.environ, - file_actions=file_actions) + pid = self.spawn_func(self.NOOP_PROGRAM[0], + self.NOOP_PROGRAM, + os.environ, + file_actions=file_actions) self.assertEqual(os.waitpid(pid, 0), (pid, 0)) def test_bad_file_actions(self): args = self.NOOP_PROGRAM with self.assertRaises(TypeError): - posix.posix_spawn(args[0], args, os.environ, - file_actions=[None]) + self.spawn_func(args[0], args, os.environ, + file_actions=[None]) with self.assertRaises(TypeError): - posix.posix_spawn(args[0], args, os.environ, - file_actions=[()]) + self.spawn_func(args[0], args, os.environ, + file_actions=[()]) with self.assertRaises(TypeError): - posix.posix_spawn(args[0], args, os.environ, - file_actions=[(None,)]) + self.spawn_func(args[0], args, os.environ, + file_actions=[(None,)]) with self.assertRaises(TypeError): - posix.posix_spawn(args[0], args, os.environ, - file_actions=[(12345,)]) + self.spawn_func(args[0], args, os.environ, + file_actions=[(12345,)]) with self.assertRaises(TypeError): - posix.posix_spawn(args[0], args, os.environ, - file_actions=[(os.POSIX_SPAWN_CLOSE,)]) + self.spawn_func(args[0], args, os.environ, + file_actions=[(os.POSIX_SPAWN_CLOSE,)]) with self.assertRaises(TypeError): - posix.posix_spawn(args[0], args, os.environ, - file_actions=[(os.POSIX_SPAWN_CLOSE, 1, 2)]) + self.spawn_func(args[0], args, os.environ, + file_actions=[(os.POSIX_SPAWN_CLOSE, 1, 2)]) with self.assertRaises(TypeError): - posix.posix_spawn(args[0], args, os.environ, - 
file_actions=[(os.POSIX_SPAWN_CLOSE, None)]) + self.spawn_func(args[0], args, os.environ, + file_actions=[(os.POSIX_SPAWN_CLOSE, None)]) with self.assertRaises(ValueError): - posix.posix_spawn(args[0], args, os.environ, - file_actions=[(os.POSIX_SPAWN_OPEN, - 3, __file__ + '\0', - os.O_RDONLY, 0)]) + self.spawn_func(args[0], args, os.environ, + file_actions=[(os.POSIX_SPAWN_OPEN, + 3, __file__ + '\0', + os.O_RDONLY, 0)]) def test_open_file(self): outfile = support.TESTFN @@ -1752,8 +1752,8 @@ def test_open_file(self): stat.S_IRUSR | stat.S_IWUSR), ] args = self.python_args('-c', script) - pid = posix.posix_spawn(args[0], args, os.environ, - file_actions=file_actions) + pid = self.spawn_func(args[0], args, os.environ, + file_actions=file_actions) self.assertEqual(os.waitpid(pid, 0), (pid, 0)) with open(outfile) as f: self.assertEqual(f.read(), 'hello') @@ -1770,8 +1770,8 @@ def test_close_file(self): closefile.write('is closed %d' % e.errno) """ args = self.python_args('-c', script) - pid = posix.posix_spawn(args[0], args, os.environ, - file_actions=[(os.POSIX_SPAWN_CLOSE, 0),]) + pid = self.spawn_func(args[0], args, os.environ, + file_actions=[(os.POSIX_SPAWN_CLOSE, 0)]) self.assertEqual(os.waitpid(pid, 0), (pid, 0)) with open(closefile) as f: self.assertEqual(f.read(), 'is closed %d' % errno.EBADF) @@ -1788,16 +1788,64 @@ def test_dup2(self): (os.POSIX_SPAWN_DUP2, childfile.fileno(), 1), ] args = self.python_args('-c', script) - pid = posix.posix_spawn(args[0], args, os.environ, - file_actions=file_actions) + pid = self.spawn_func(args[0], args, os.environ, + file_actions=file_actions) self.assertEqual(os.waitpid(pid, 0), (pid, 0)) with open(dupfile) as f: self.assertEqual(f.read(), 'hello') + at unittest.skipUnless(hasattr(os, 'posix_spawn'), "test needs os.posix_spawn") +class TestPosixSpawn(unittest.TestCase, _PosixSpawnMixin): + spawn_func = getattr(posix, 'posix_spawn', None) + + + at unittest.skipUnless(hasattr(os, 'posix_spawnp'), "test needs os.posix_spawnp") +class TestPosixSpawnP(unittest.TestCase, _PosixSpawnMixin): + spawn_func = getattr(posix, 'posix_spawnp', None) + + @support.skip_unless_symlink + def test_posix_spawnp(self): + # Use a symlink to create a program in its own temporary directory + temp_dir = tempfile.mkdtemp() + self.addCleanup(support.rmtree, temp_dir) + + program = 'posix_spawnp_test_program.exe' + program_fullpath = os.path.join(temp_dir, program) + os.symlink(sys.executable, program_fullpath) + + try: + path = os.pathsep.join((temp_dir, os.environ['PATH'])) + except KeyError: + path = temp_dir # PATH is not set + + spawn_args = (program, '-I', '-S', '-c', 'pass') + code = textwrap.dedent(""" + import os + args = %a + pid = os.posix_spawnp(args[0], args, os.environ) + pid2, status = os.waitpid(pid, 0) + if pid2 != pid: + raise Exception(f"pid {pid2} != {pid}") + if status != 0: + raise Exception(f"status {status} != 0") + """ % (spawn_args,)) + + # Use a subprocess to test os.posix_spawnp() with a modified PATH + # environment variable: posix_spawnp() uses the current environment + # to locate the program, not its environment argument. 
+ args = ('-c', code) + assert_python_ok(*args, PATH=path) + + def test_main(): try: - support.run_unittest(PosixTester, PosixGroupsTester, TestPosixSpawn) + support.run_unittest( + PosixTester, + PosixGroupsTester, + TestPosixSpawn, + TestPosixSpawnP, + ) finally: support.reap_children() diff --git a/Misc/NEWS.d/next/Library/2019-01-14-14-13-08.bpo-35674.kamWqz.rst b/Misc/NEWS.d/next/Library/2019-01-14-14-13-08.bpo-35674.kamWqz.rst new file mode 100644 index 000000000000..02d170ecac6e --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-14-14-13-08.bpo-35674.kamWqz.rst @@ -0,0 +1,2 @@ +Add a new :func:`os.posix_spawnp` function. +Patch by Joannah Nanjekye. \ No newline at end of file diff --git a/Modules/clinic/posixmodule.c.h b/Modules/clinic/posixmodule.c.h index 2c1ee97faf71..ce17709c38ba 100644 --- a/Modules/clinic/posixmodule.c.h +++ b/Modules/clinic/posixmodule.c.h @@ -1791,6 +1791,75 @@ os_posix_spawn(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObje #endif /* defined(HAVE_POSIX_SPAWN) */ +#if defined(HAVE_POSIX_SPAWNP) + +PyDoc_STRVAR(os_posix_spawnp__doc__, +"posix_spawnp($module, path, argv, env, /, *, file_actions=(),\n" +" setpgroup=None, resetids=False, setsigmask=(),\n" +" setsigdef=(), scheduler=None)\n" +"--\n" +"\n" +"Execute the program specified by path in a new process.\n" +"\n" +" path\n" +" Path of executable file.\n" +" argv\n" +" Tuple or list of strings.\n" +" env\n" +" Dictionary of strings mapping to strings.\n" +" file_actions\n" +" A sequence of file action tuples.\n" +" setpgroup\n" +" The pgroup to use with the POSIX_SPAWN_SETPGROUP flag.\n" +" resetids\n" +" If the value is `True` the POSIX_SPAWN_RESETIDS will be activated.\n" +" setsigmask\n" +" The sigmask to use with the POSIX_SPAWN_SETSIGMASK flag.\n" +" setsigdef\n" +" The sigmask to use with the POSIX_SPAWN_SETSIGDEF flag.\n" +" scheduler\n" +" A tuple with the scheduler policy (optional) and parameters."); + +#define OS_POSIX_SPAWNP_METHODDEF \ + {"posix_spawnp", (PyCFunction)(void(*)(void))os_posix_spawnp, METH_FASTCALL|METH_KEYWORDS, os_posix_spawnp__doc__}, + +static PyObject * +os_posix_spawnp_impl(PyObject *module, path_t *path, PyObject *argv, + PyObject *env, PyObject *file_actions, + PyObject *setpgroup, int resetids, PyObject *setsigmask, + PyObject *setsigdef, PyObject *scheduler); + +static PyObject * +os_posix_spawnp(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames) +{ + PyObject *return_value = NULL; + static const char * const _keywords[] = {"", "", "", "file_actions", "setpgroup", "resetids", "setsigmask", "setsigdef", "scheduler", NULL}; + static _PyArg_Parser _parser = {"O&OO|$OOiOOO:posix_spawnp", _keywords, 0}; + path_t path = PATH_T_INITIALIZE("posix_spawnp", "path", 0, 0); + PyObject *argv; + PyObject *env; + PyObject *file_actions = NULL; + PyObject *setpgroup = NULL; + int resetids = 0; + PyObject *setsigmask = NULL; + PyObject *setsigdef = NULL; + PyObject *scheduler = NULL; + + if (!_PyArg_ParseStackAndKeywords(args, nargs, kwnames, &_parser, + path_converter, &path, &argv, &env, &file_actions, &setpgroup, &resetids, &setsigmask, &setsigdef, &scheduler)) { + goto exit; + } + return_value = os_posix_spawnp_impl(module, &path, argv, env, file_actions, setpgroup, resetids, setsigmask, setsigdef, scheduler); + +exit: + /* Cleanup for path */ + path_cleanup(&path); + + return return_value; +} + +#endif /* defined(HAVE_POSIX_SPAWNP) */ + #if (defined(HAVE_SPAWNV) || defined(HAVE_WSPAWNV)) PyDoc_STRVAR(os_spawnv__doc__, @@ -6851,6 +6920,10 
@@ os_getrandom(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject #define OS_POSIX_SPAWN_METHODDEF #endif /* !defined(OS_POSIX_SPAWN_METHODDEF) */ +#ifndef OS_POSIX_SPAWNP_METHODDEF + #define OS_POSIX_SPAWNP_METHODDEF +#endif /* !defined(OS_POSIX_SPAWNP_METHODDEF) */ + #ifndef OS_SPAWNV_METHODDEF #define OS_SPAWNV_METHODDEF #endif /* !defined(OS_SPAWNV_METHODDEF) */ @@ -7258,4 +7331,4 @@ os_getrandom(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject #ifndef OS_GETRANDOM_METHODDEF #define OS_GETRANDOM_METHODDEF #endif /* !defined(OS_GETRANDOM_METHODDEF) */ -/*[clinic end generated code: output=febc1e16c9024e40 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=dabd0fa27bf87044 input=a9049054013a1b77]*/ diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c index e5c2a9cfc1ec..b25b5220cdb3 100644 --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -5381,39 +5381,12 @@ parse_file_actions(PyObject *file_actions, return -1; } -/*[clinic input] - -os.posix_spawn - path: path_t - Path of executable file. - argv: object - Tuple or list of strings. - env: object - Dictionary of strings mapping to strings. - / - * - file_actions: object(c_default='NULL') = () - A sequence of file action tuples. - setpgroup: object = NULL - The pgroup to use with the POSIX_SPAWN_SETPGROUP flag. - resetids: bool(accept={int}) = False - If the value is `True` the POSIX_SPAWN_RESETIDS will be activated. - setsigmask: object(c_default='NULL') = () - The sigmask to use with the POSIX_SPAWN_SETSIGMASK flag. - setsigdef: object(c_default='NULL') = () - The sigmask to use with the POSIX_SPAWN_SETSIGDEF flag. - scheduler: object = NULL - A tuple with the scheduler policy (optional) and parameters. - -Execute the program specified by path in a new process. -[clinic start generated code]*/ static PyObject * -os_posix_spawn_impl(PyObject *module, path_t *path, PyObject *argv, - PyObject *env, PyObject *file_actions, - PyObject *setpgroup, int resetids, PyObject *setsigmask, - PyObject *setsigdef, PyObject *scheduler) -/*[clinic end generated code: output=45dfa4c515d09f2c input=2891c2f1d457e39b]*/ +py_posix_spawn(int use_posix_spawnp, PyObject *module, path_t *path, PyObject *argv, + PyObject *env, PyObject *file_actions, + PyObject *setpgroup, int resetids, PyObject *setsigmask, + PyObject *setsigdef, PyObject *scheduler) { EXECV_CHAR **argvlist = NULL; EXECV_CHAR **envlist = NULL; @@ -5489,9 +5462,19 @@ os_posix_spawn_impl(PyObject *module, path_t *path, PyObject *argv, attrp = &attr; _Py_BEGIN_SUPPRESS_IPH - err_code = posix_spawn(&pid, path->narrow, - file_actionsp, attrp, argvlist, envlist); +#ifdef HAVE_POSIX_SPAWNP + if (use_posix_spawnp) { + err_code = posix_spawnp(&pid, path->narrow, + file_actionsp, attrp, argvlist, envlist); + } + else +#endif /* HAVE_POSIX_SPAWNP */ + { + err_code = posix_spawn(&pid, path->narrow, + file_actionsp, attrp, argvlist, envlist); + } _Py_END_SUPPRESS_IPH + if (err_code) { errno = err_code; PyErr_SetFromErrnoWithFilenameObject(PyExc_OSError, path->object); @@ -5518,7 +5501,90 @@ os_posix_spawn_impl(PyObject *module, path_t *path, PyObject *argv, Py_XDECREF(temp_buffer); return result; } -#endif /* HAVE_POSIX_SPAWN */ + + +/*[clinic input] + +os.posix_spawn + path: path_t + Path of executable file. + argv: object + Tuple or list of strings. + env: object + Dictionary of strings mapping to strings. + / + * + file_actions: object(c_default='NULL') = () + A sequence of file action tuples. 
+ setpgroup: object = NULL + The pgroup to use with the POSIX_SPAWN_SETPGROUP flag. + resetids: bool(accept={int}) = False + If the value is `True` the POSIX_SPAWN_RESETIDS will be activated. + setsigmask: object(c_default='NULL') = () + The sigmask to use with the POSIX_SPAWN_SETSIGMASK flag. + setsigdef: object(c_default='NULL') = () + The sigmask to use with the POSIX_SPAWN_SETSIGDEF flag. + scheduler: object = NULL + A tuple with the scheduler policy (optional) and parameters. + +Execute the program specified by path in a new process. +[clinic start generated code]*/ + +static PyObject * +os_posix_spawn_impl(PyObject *module, path_t *path, PyObject *argv, + PyObject *env, PyObject *file_actions, + PyObject *setpgroup, int resetids, PyObject *setsigmask, + PyObject *setsigdef, PyObject *scheduler) +/*[clinic end generated code: output=45dfa4c515d09f2c input=2891c2f1d457e39b]*/ +{ + return py_posix_spawn(0, module, path, argv, env, file_actions, + setpgroup, resetids, setsigmask, setsigdef, + scheduler); +} + #endif /* HAVE_POSIX_SPAWN */ + + + +#ifdef HAVE_POSIX_SPAWNP +/*[clinic input] + +os.posix_spawnp + path: path_t + Path of executable file. + argv: object + Tuple or list of strings. + env: object + Dictionary of strings mapping to strings. + / + * + file_actions: object(c_default='NULL') = () + A sequence of file action tuples. + setpgroup: object = NULL + The pgroup to use with the POSIX_SPAWN_SETPGROUP flag. + resetids: bool(accept={int}) = False + If the value is `True` the POSIX_SPAWN_RESETIDS will be activated. + setsigmask: object(c_default='NULL') = () + The sigmask to use with the POSIX_SPAWN_SETSIGMASK flag. + setsigdef: object(c_default='NULL') = () + The sigmask to use with the POSIX_SPAWN_SETSIGDEF flag. + scheduler: object = NULL + A tuple with the scheduler policy (optional) and parameters. + +Execute the program specified by path in a new process. +[clinic start generated code]*/ + +static PyObject * +os_posix_spawnp_impl(PyObject *module, path_t *path, PyObject *argv, + PyObject *env, PyObject *file_actions, + PyObject *setpgroup, int resetids, PyObject *setsigmask, + PyObject *setsigdef, PyObject *scheduler) +/*[clinic end generated code: output=7955dc0edc82b8c3 input=b7576eb25b1ed9eb]*/ +{ + return py_posix_spawn(1, module, path, argv, env, file_actions, + setpgroup, resetids, setsigmask, setsigdef, + scheduler); +} +#endif /* HAVE_POSIX_SPAWNP */ #if defined(HAVE_SPAWNV) || defined(HAVE_WSPAWNV) @@ -13084,6 +13150,7 @@ static PyMethodDef posix_methods[] = { OS_GETPRIORITY_METHODDEF OS_SETPRIORITY_METHODDEF OS_POSIX_SPAWN_METHODDEF + OS_POSIX_SPAWNP_METHODDEF OS_READLINK_METHODDEF OS_RENAME_METHODDEF OS_REPLACE_METHODDEF diff --git a/aclocal.m4 b/aclocal.m4 index 94a2dd6d3ebd..f98db73656d3 100644 --- a/aclocal.m4 +++ b/aclocal.m4 @@ -1,6 +1,6 @@ -# generated automatically by aclocal 1.15.1 -*- Autoconf -*- +# generated automatically by aclocal 1.15 -*- Autoconf -*- -# Copyright (C) 1996-2017 Free Software Foundation, Inc. +# Copyright (C) 1996-2014 Free Software Foundation, Inc. # This file is free software; the Free Software Foundation # gives unlimited permission to copy and/or distribute it, @@ -13,7 +13,7 @@ m4_ifndef([AC_CONFIG_MACRO_DIRS], [m4_defun([_AM_CONFIG_MACRO_DIRS], [])m4_defun([AC_CONFIG_MACRO_DIRS], [_AM_CONFIG_MACRO_DIRS($@)])]) dnl pkg.m4 - Macros to locate and utilise pkg-config. -*- Autoconf -*- -dnl serial 11 (pkg-config-0.29) +dnl serial 11 (pkg-config-0.29.1) dnl dnl Copyright ? 2004 Scott James Remnant . dnl Copyright ? 
2012-2015 Dan Nicholson @@ -55,7 +55,7 @@ dnl dnl See the "Since" comment for each macro you use to see what version dnl of the macros you require. m4_defun([PKG_PREREQ], -[m4_define([PKG_MACROS_VERSION], [0.29]) +[m4_define([PKG_MACROS_VERSION], [0.29.1]) m4_if(m4_version_compare(PKG_MACROS_VERSION, [$1]), -1, [m4_fatal([pkg.m4 version $1 or higher is required but ]PKG_MACROS_VERSION[ found])]) ])dnl PKG_PREREQ diff --git a/configure b/configure index 13677b9d96b6..b32481dca03f 100755 --- a/configure +++ b/configure @@ -11447,7 +11447,7 @@ for ac_func in alarm accept4 setitimer getitimer bind_textdomain_codeset chown \ initgroups kill killpg lchown lockf linkat lstat lutimes mmap \ memrchr mbrtowc mkdirat mkfifo \ mkfifoat mknod mknodat mktime mremap nice openat pathconf pause pipe2 plock poll \ - posix_fallocate posix_fadvise posix_spawn pread preadv preadv2 \ + posix_fallocate posix_fadvise posix_spawn posix_spawnp pread preadv preadv2 \ pthread_init pthread_kill putenv pwrite pwritev pwritev2 readlink readlinkat readv realpath renameat \ sem_open sem_timedwait sem_getvalue sem_unlink sendfile setegid seteuid \ setgid sethostname \ diff --git a/configure.ac b/configure.ac index bd09a9c9e1e0..262c72668a95 100644 --- a/configure.ac +++ b/configure.ac @@ -3505,7 +3505,7 @@ AC_CHECK_FUNCS(alarm accept4 setitimer getitimer bind_textdomain_codeset chown \ initgroups kill killpg lchown lockf linkat lstat lutimes mmap \ memrchr mbrtowc mkdirat mkfifo \ mkfifoat mknod mknodat mktime mremap nice openat pathconf pause pipe2 plock poll \ - posix_fallocate posix_fadvise posix_spawn pread preadv preadv2 \ + posix_fallocate posix_fadvise posix_spawn posix_spawnp pread preadv preadv2 \ pthread_init pthread_kill putenv pwrite pwritev pwritev2 readlink readlinkat readv realpath renameat \ sem_open sem_timedwait sem_getvalue sem_unlink sendfile setegid seteuid \ setgid sethostname \ diff --git a/pyconfig.h.in b/pyconfig.h.in index f37ca3615025..a2a56230fc13 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -732,6 +732,9 @@ /* Define to 1 if you have the `posix_spawn' function. */ #undef HAVE_POSIX_SPAWN +/* Define to 1 if you have the `posix_spawnp' function. */ +#undef HAVE_POSIX_SPAWNP + /* Define to 1 if you have the `pread' function. */ #undef HAVE_PREAD From webhook-mailer at python.org Wed Jan 16 09:26:26 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Wed, 16 Jan 2019 14:26:26 -0000 Subject: [Python-checkins] bpo-35537: subprocess can now use os.posix_spawnp (GH-11579) Message-ID: https://github.com/python/cpython/commit/07858894689047c77f9c12ddc061d30681368d19 commit: 07858894689047c77f9c12ddc061d30681368d19 branch: master author: Victor Stinner committer: GitHub date: 2019-01-16T15:26:20+01:00 summary: bpo-35537: subprocess can now use os.posix_spawnp (GH-11579) The subprocess module can now use the os.posix_spawnp() function, if it is available, to locate the program in the PATH. files: M Doc/whatsnew/3.8.rst M Lib/subprocess.py M Lib/test/pythoninfo.py diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index 053fe902c481..05fb4ffe03bb 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -281,8 +281,7 @@ Optimizations * *close_fds* is false; * *preexec_fn*, *pass_fds*, *cwd*, *stdin*, *stdout*, *stderr* and - *start_new_session* parameters are not set; - * the *executable* path contains a directory. + *start_new_session* parameters are not set. 
* :func:`shutil.copyfile`, :func:`shutil.copy`, :func:`shutil.copy2`, :func:`shutil.copytree` and :func:`shutil.move` use platform-specific diff --git a/Lib/subprocess.py b/Lib/subprocess.py index b94575b8401e..d63cf2050634 100644 --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -655,6 +655,7 @@ def _use_posix_spawn(): _USE_POSIX_SPAWN = _use_posix_spawn() +_HAVE_POSIX_SPAWNP = hasattr(os, 'posix_spawnp') class Popen(object): @@ -1442,7 +1443,10 @@ def _get_handles(self, stdin, stdout, stderr): def _posix_spawn(self, args, executable, env, restore_signals): - """Execute program using os.posix_spawn().""" + """Execute program using os.posix_spawnp(). + + Or use os.posix_spawn() if os.posix_spawnp() is not available. + """ if env is None: env = os.environ @@ -1456,7 +1460,10 @@ def _posix_spawn(self, args, executable, env, restore_signals): sigset.append(signum) kwargs['setsigdef'] = sigset - self.pid = os.posix_spawn(executable, args, env, **kwargs) + if _HAVE_POSIX_SPAWNP: + self.pid = os.posix_spawnp(executable, args, env, **kwargs) + else: + self.pid = os.posix_spawn(executable, args, env, **kwargs) def _execute_child(self, args, executable, preexec_fn, close_fds, pass_fds, cwd, env, @@ -1484,7 +1491,7 @@ def _execute_child(self, args, executable, preexec_fn, close_fds, executable = args[0] if (_USE_POSIX_SPAWN - and os.path.dirname(executable) + and (_HAVE_POSIX_SPAWNP or os.path.dirname(executable)) and preexec_fn is None and not close_fds and not pass_fds diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 7e94a31cecea..93d87be41588 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -612,7 +612,8 @@ def collect_get_config(info_add): def collect_subprocess(info_add): import subprocess - copy_attributes(info_add, subprocess, 'subprocess.%s', ('_USE_POSIX_SPAWN',)) + attrs = ('_USE_POSIX_SPAWN', '_HAVE_POSIX_SPAWNP') + copy_attributes(info_add, subprocess, 'subprocess.%s', attrs) def collect_info(info): From webhook-mailer at python.org Wed Jan 16 17:38:12 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Wed, 16 Jan 2019 22:38:12 -0000 Subject: [Python-checkins] Revert "bpo-35537: subprocess can now use os.posix_spawnp (GH-11579)" (GH-11582) Message-ID: https://github.com/python/cpython/commit/8c349565e8a442e17f1a954d1a9996847749d778 commit: 8c349565e8a442e17f1a954d1a9996847749d778 branch: master author: Victor Stinner committer: GitHub date: 2019-01-16T23:38:06+01:00 summary: Revert "bpo-35537: subprocess can now use os.posix_spawnp (GH-11579)" (GH-11582) This reverts commit 07858894689047c77f9c12ddc061d30681368d19. files: M Doc/whatsnew/3.8.rst M Lib/subprocess.py M Lib/test/pythoninfo.py diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index 05fb4ffe03bb..053fe902c481 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -281,7 +281,8 @@ Optimizations * *close_fds* is false; * *preexec_fn*, *pass_fds*, *cwd*, *stdin*, *stdout*, *stderr* and - *start_new_session* parameters are not set. + *start_new_session* parameters are not set; + * the *executable* path contains a directory. 
* :func:`shutil.copyfile`, :func:`shutil.copy`, :func:`shutil.copy2`, :func:`shutil.copytree` and :func:`shutil.move` use platform-specific diff --git a/Lib/subprocess.py b/Lib/subprocess.py index d63cf2050634..b94575b8401e 100644 --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -655,7 +655,6 @@ def _use_posix_spawn(): _USE_POSIX_SPAWN = _use_posix_spawn() -_HAVE_POSIX_SPAWNP = hasattr(os, 'posix_spawnp') class Popen(object): @@ -1443,10 +1442,7 @@ def _get_handles(self, stdin, stdout, stderr): def _posix_spawn(self, args, executable, env, restore_signals): - """Execute program using os.posix_spawnp(). - - Or use os.posix_spawn() if os.posix_spawnp() is not available. - """ + """Execute program using os.posix_spawn().""" if env is None: env = os.environ @@ -1460,10 +1456,7 @@ def _posix_spawn(self, args, executable, env, restore_signals): sigset.append(signum) kwargs['setsigdef'] = sigset - if _HAVE_POSIX_SPAWNP: - self.pid = os.posix_spawnp(executable, args, env, **kwargs) - else: - self.pid = os.posix_spawn(executable, args, env, **kwargs) + self.pid = os.posix_spawn(executable, args, env, **kwargs) def _execute_child(self, args, executable, preexec_fn, close_fds, pass_fds, cwd, env, @@ -1491,7 +1484,7 @@ def _execute_child(self, args, executable, preexec_fn, close_fds, executable = args[0] if (_USE_POSIX_SPAWN - and (_HAVE_POSIX_SPAWNP or os.path.dirname(executable)) + and os.path.dirname(executable) and preexec_fn is None and not close_fds and not pass_fds diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 93d87be41588..7e94a31cecea 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -612,8 +612,7 @@ def collect_get_config(info_add): def collect_subprocess(info_add): import subprocess - attrs = ('_USE_POSIX_SPAWN', '_HAVE_POSIX_SPAWNP') - copy_attributes(info_add, subprocess, 'subprocess.%s', attrs) + copy_attributes(info_add, subprocess, 'subprocess.%s', ('_USE_POSIX_SPAWN',)) def collect_info(info): From solipsis at pitrou.net Thu Jan 17 04:09:04 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 17 Jan 2019 09:09:04 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=2 Message-ID: <20190117090904.1.AAEFC5AB14FC3B4D@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [-7, 1, 0] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [2, 0, 0] memory blocks, sum=2 test_multiprocessing_forkserver leaked [-2, 2, 0] memory blocks, sum=0 test_multiprocessing_spawn leaked [2, -2, 2] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogd1PTFQ', '--timeout', '7200'] From webhook-mailer at python.org Thu Jan 17 05:41:32 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 17 Jan 2019 10:41:32 -0000 Subject: [Python-checkins] bpo-35486: Note Py3.6 import system API requirement change (GH-11540) Message-ID: https://github.com/python/cpython/commit/cee29b46a19116261b083dc803217aa754c7df40 commit: cee29b46a19116261b083dc803217aa754c7df40 branch: master author: Nick Coghlan committer: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> date: 2019-01-17T02:41:29-08:00 summary: bpo-35486: Note Py3.6 import system API requirement change (GH-11540) While the introduction of ModuleNotFoundError was fully backwards compatible on the import API consumer side, folks providing alternative 
implementations of `__import__` need to make an update to be forward compatible with clients that start relying on the new subclass. https://bugs.python.org/issue35486 files: M Doc/library/importlib.rst M Doc/whatsnew/3.6.rst diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst index 3c9a99abf9a2..23831c75842f 100644 --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -1737,7 +1737,8 @@ Python 3.6 and newer for other parts of the code). if spec is not None: break else: - raise ImportError(f'No module named {absolute_name!r}') + msg = f'No module named {absolute_name!r}' + raise ModuleNotFoundError(msg, name=absolute_name) module = importlib.util.module_from_spec(spec) spec.loader.exec_module(module) sys.modules[absolute_name] = module diff --git a/Doc/whatsnew/3.6.rst b/Doc/whatsnew/3.6.rst index 936ea2dc321e..3f5f5200f122 100644 --- a/Doc/whatsnew/3.6.rst +++ b/Doc/whatsnew/3.6.rst @@ -2316,6 +2316,17 @@ Changes in the Python API a :exc:`DeprecationWarning` in Python 3.6 and a :exc:`RuntimeError` in Python 3.8. +* With the introduction of :exc:`ModuleNotFoundError`, import system consumers + may start expecting import system replacements to raise that more specific + exception when appropriate, rather than the less-specific :exc:`ImportError`. + To provide future compatibility with such consumers, implementors of + alternative import systems that completely replace :func:`__import__` will + need to update their implementations to raise the new subclass when a module + can't be found at all. Implementors of compliant plugins to the default + import system shouldn't need to make any changes, as the default import + system will raise the new subclass when appropriate. + + Changes in the C API -------------------- From webhook-mailer at python.org Thu Jan 17 05:48:18 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 17 Jan 2019 10:48:18 -0000 Subject: [Python-checkins] bpo-35486: Note Py3.6 import system API requirement change (GH-11540) Message-ID: https://github.com/python/cpython/commit/422db3777874f4f31fc8f4e718f440a2abc59347 commit: 422db3777874f4f31fc8f4e718f440a2abc59347 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-17T02:48:15-08:00 summary: bpo-35486: Note Py3.6 import system API requirement change (GH-11540) While the introduction of ModuleNotFoundError was fully backwards compatible on the import API consumer side, folks providing alternative implementations of `__import__` need to make an update to be forward compatible with clients that start relying on the new subclass. https://bugs.python.org/issue35486 (cherry picked from commit cee29b46a19116261b083dc803217aa754c7df40) Co-authored-by: Nick Coghlan files: M Doc/library/importlib.rst M Doc/whatsnew/3.6.rst diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst index 3c9a99abf9a2..23831c75842f 100644 --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -1737,7 +1737,8 @@ Python 3.6 and newer for other parts of the code). 
if spec is not None: break else: - raise ImportError(f'No module named {absolute_name!r}') + msg = f'No module named {absolute_name!r}' + raise ModuleNotFoundError(msg, name=absolute_name) module = importlib.util.module_from_spec(spec) spec.loader.exec_module(module) sys.modules[absolute_name] = module diff --git a/Doc/whatsnew/3.6.rst b/Doc/whatsnew/3.6.rst index 936ea2dc321e..3f5f5200f122 100644 --- a/Doc/whatsnew/3.6.rst +++ b/Doc/whatsnew/3.6.rst @@ -2316,6 +2316,17 @@ Changes in the Python API a :exc:`DeprecationWarning` in Python 3.6 and a :exc:`RuntimeError` in Python 3.8. +* With the introduction of :exc:`ModuleNotFoundError`, import system consumers + may start expecting import system replacements to raise that more specific + exception when appropriate, rather than the less-specific :exc:`ImportError`. + To provide future compatibility with such consumers, implementors of + alternative import systems that completely replace :func:`__import__` will + need to update their implementations to raise the new subclass when a module + can't be found at all. Implementors of compliant plugins to the default + import system shouldn't need to make any changes, as the default import + system will raise the new subclass when appropriate. + + Changes in the C API -------------------- From webhook-mailer at python.org Thu Jan 17 06:52:27 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 17 Jan 2019 11:52:27 -0000 Subject: [Python-checkins] Fixes typo in asyncio.queue doc (GH-11581) Message-ID: https://github.com/python/cpython/commit/97e12996f31f6ada4173e2cd4b6807c98ba379a4 commit: 97e12996f31f6ada4173e2cd4b6807c98ba379a4 branch: master author: Slam <3lnc.slam at gmail.com> committer: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> date: 2019-01-17T03:52:17-08:00 summary: Fixes typo in asyncio.queue doc (GH-11581) Typo fix for method doc, I'm pretty sure coro is meant, because there's no consumer threads for thread-unsafe queue. Most probably this piece of doc was copied from `queue.Queue` There's not BPO bug for this, afaik. files: M Doc/library/asyncio-queue.rst diff --git a/Doc/library/asyncio-queue.rst b/Doc/library/asyncio-queue.rst index bd0e70c0d9fc..7be1023c80cc 100644 --- a/Doc/library/asyncio-queue.rst +++ b/Doc/library/asyncio-queue.rst @@ -64,7 +64,7 @@ Queue Block until all items in the queue have been received and processed. The count of unfinished tasks goes up whenever an item is added - to the queue. The count goes down whenever a consumer thread calls + to the queue. The count goes down whenever a consumer coroutine calls :meth:`task_done` to indicate that the item was retrieved and all work on it is complete. When the count of unfinished tasks drops to zero, :meth:`join` unblocks. From webhook-mailer at python.org Thu Jan 17 06:58:41 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 17 Jan 2019 11:58:41 -0000 Subject: [Python-checkins] Fixes typo in asyncio.queue doc (GH-11581) Message-ID: https://github.com/python/cpython/commit/6d84071514346efd5eddee1a2963e7ea25ceb901 commit: 6d84071514346efd5eddee1a2963e7ea25ceb901 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-17T03:58:37-08:00 summary: Fixes typo in asyncio.queue doc (GH-11581) Typo fix for method doc, I'm pretty sure coro is meant, because there's no consumer threads for thread-unsafe queue. 
Most probably this piece of doc was copied from `queue.Queue` There's not BPO bug for this, afaik. (cherry picked from commit 97e12996f31f6ada4173e2cd4b6807c98ba379a4) Co-authored-by: Slam <3lnc.slam at gmail.com> files: M Doc/library/asyncio-queue.rst diff --git a/Doc/library/asyncio-queue.rst b/Doc/library/asyncio-queue.rst index bd0e70c0d9fc..7be1023c80cc 100644 --- a/Doc/library/asyncio-queue.rst +++ b/Doc/library/asyncio-queue.rst @@ -64,7 +64,7 @@ Queue Block until all items in the queue have been received and processed. The count of unfinished tasks goes up whenever an item is added - to the queue. The count goes down whenever a consumer thread calls + to the queue. The count goes down whenever a consumer coroutine calls :meth:`task_done` to indicate that the item was retrieved and all work on it is complete. When the count of unfinished tasks drops to zero, :meth:`join` unblocks. From webhook-mailer at python.org Thu Jan 17 07:14:51 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Thu, 17 Jan 2019 12:14:51 -0000 Subject: [Python-checkins] bpo-35283: Add deprecation warning for Thread.isAlive (GH-11454) Message-ID: https://github.com/python/cpython/commit/89669ffe10a9db6343f6ee42239e412c8ad96bde commit: 89669ffe10a9db6343f6ee42239e412c8ad96bde branch: master author: Dong-hee Na committer: Victor Stinner date: 2019-01-17T13:14:45+01:00 summary: bpo-35283: Add deprecation warning for Thread.isAlive (GH-11454) Add a deprecated warning for the threading.Thread.isAlive() method. files: A Misc/NEWS.d/next/Library/2019-01-07-17-17-16.bpo-35283.WClosC.rst M Doc/whatsnew/3.8.rst M Lib/test/support/__init__.py M Lib/test/test_threading.py M Lib/threading.py diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index 053fe902c481..bf8d8f15434b 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -394,6 +394,8 @@ Deprecated (Contributed by Serhiy Storchaka in :issue:`33710`.) +* The :meth:`~threading.Thread.isAlive()` method of :class:`threading.Thread` has been deprecated. + (Contributed by Dong-hee Na in :issue:`35283`.) 
API and Feature Removals ======================== diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index dd1790d592b1..697182ea775f 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -2264,14 +2264,14 @@ def start_threads(threads, unlock=None): endtime += 60 for t in started: t.join(max(endtime - time.monotonic(), 0.01)) - started = [t for t in started if t.isAlive()] + started = [t for t in started if t.is_alive()] if not started: break if verbose: print('Unable to join %d threads during a period of ' '%d minutes' % (len(started), timeout)) finally: - started = [t for t in started if t.isAlive()] + started = [t for t in started if t.is_alive()] if started: faulthandler.dump_traceback(sys.stdout) raise AssertionError('Unable to join %d threads' % len(started)) diff --git a/Lib/test/test_threading.py b/Lib/test/test_threading.py index 8160a5af0064..af2d6b59b971 100644 --- a/Lib/test/test_threading.py +++ b/Lib/test/test_threading.py @@ -415,7 +415,8 @@ def test_old_threading_api(self): t.setDaemon(True) t.getName() t.setName("name") - t.isAlive() + with self.assertWarnsRegex(DeprecationWarning, 'use is_alive()'): + t.isAlive() e = threading.Event() e.isSet() threading.activeCount() diff --git a/Lib/threading.py b/Lib/threading.py index bb41456fb141..7bc8a8573c72 100644 --- a/Lib/threading.py +++ b/Lib/threading.py @@ -1091,7 +1091,15 @@ def is_alive(self): self._wait_for_tstate_lock(False) return not self._is_stopped - isAlive = is_alive + def isAlive(self): + """Return whether the thread is alive. + + This method is deprecated, use is_alive() instead. + """ + import warnings + warnings.warn('isAlive() is deprecated, use is_alive() instead', + DeprecationWarning, stacklevel=2) + return self.is_alive() @property def daemon(self): diff --git a/Misc/NEWS.d/next/Library/2019-01-07-17-17-16.bpo-35283.WClosC.rst b/Misc/NEWS.d/next/Library/2019-01-07-17-17-16.bpo-35283.WClosC.rst new file mode 100644 index 000000000000..711865281b45 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-07-17-17-16.bpo-35283.WClosC.rst @@ -0,0 +1,2 @@ +Add a deprecated warning for the :meth:`threading.Thread.isAlive` method. +Patch by Dong-hee Na. From webhook-mailer at python.org Thu Jan 17 07:16:56 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Thu, 17 Jan 2019 12:16:56 -0000 Subject: [Python-checkins] bpo-35701: Added __weakref__ slot to uuid.UUID (GH-11570) Message-ID: https://github.com/python/cpython/commit/f1d8e7cf17a010d2657822e06a41b30c9542a8c7 commit: f1d8e7cf17a010d2657822e06a41b30c9542a8c7 branch: master author: David H committer: Victor Stinner date: 2019-01-17T13:16:51+01:00 summary: bpo-35701: Added __weakref__ slot to uuid.UUID (GH-11570) Added test for weakreferencing a uuid.UUID object. 
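For illustration only (not part of the patch): a minimal sketch of what the new
``__weakref__`` slot allows, assuming an interpreter that includes this change:

    import uuid
    import weakref

    u = uuid.uuid4()
    ref = weakref.ref(u)    # raised TypeError before the slot was added
    assert ref() is u

    del u                   # drop the only strong reference
    # On CPython the UUID object is reclaimed immediately by reference
    # counting, so the weak reference is cleared.
    assert ref() is None
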
files: M Lib/test/test_uuid.py M Lib/uuid.py diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py index 757bf3cc4193..992ef0cbf804 100644 --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -9,6 +9,7 @@ import shutil import subprocess import sys +import weakref from unittest import mock py_uuid = support.import_fresh_module('uuid', blocked=['_uuid']) @@ -657,6 +658,11 @@ def testIssue8621(self): self.assertNotEqual(parent_value, child_value) + def test_uuid_weakref(self): + # bpo-35701: check that weak referencing to a UUID object can be created + strong = self.uuid.uuid4() + weak = weakref.ref(strong) + self.assertIs(strong, weak()) class TestUUIDWithoutExtModule(BaseTestUUID, unittest.TestCase): uuid = py_uuid diff --git a/Lib/uuid.py b/Lib/uuid.py index 4468d4a6c1f9..ddc63ccd082c 100644 --- a/Lib/uuid.py +++ b/Lib/uuid.py @@ -118,7 +118,7 @@ class UUID: uuid_generate_time_safe(3). """ - __slots__ = ('int', 'is_safe') + __slots__ = ('int', 'is_safe', '__weakref__') def __init__(self, hex=None, bytes=None, bytes_le=None, fields=None, int=None, version=None, From webhook-mailer at python.org Thu Jan 17 09:16:02 2019 From: webhook-mailer at python.org (Berker Peksag) Date: Thu, 17 Jan 2019 14:16:02 -0000 Subject: [Python-checkins] bpo-33687: Fix call to os.chmod() in uu.decode() (GH-7282) Message-ID: https://github.com/python/cpython/commit/17f05bbc78dbcd1db308266c31370da9ec1b1d47 commit: 17f05bbc78dbcd1db308266c31370da9ec1b1d47 branch: master author: Timo Furrer committer: Berker Peksag date: 2019-01-17T17:15:53+03:00 summary: bpo-33687: Fix call to os.chmod() in uu.decode() (GH-7282) files: A Misc/NEWS.d/next/Library/2018-06-10-14-08-52.bpo-33687.1zZdnA.rst M Lib/test/test_uu.py M Lib/uu.py diff --git a/Lib/test/test_uu.py b/Lib/test/test_uu.py index 1147205a3b53..c9f05e5b760d 100644 --- a/Lib/test/test_uu.py +++ b/Lib/test/test_uu.py @@ -6,6 +6,8 @@ import unittest from test import support +import os +import stat import sys import uu import io @@ -218,6 +220,23 @@ def test_decodetwice(self): with open(self.tmpin, 'rb') as f: self.assertRaises(uu.Error, uu.decode, f) + def test_decode_mode(self): + # Verify that decode() will set the given mode for the out_file + expected_mode = 0o444 + with open(self.tmpin, 'wb') as f: + f.write(encodedtextwrapped(expected_mode, self.tmpout)) + + # make file writable again, so it can be removed (Windows only) + self.addCleanup(os.chmod, self.tmpout, expected_mode | stat.S_IWRITE) + + with open(self.tmpin, 'rb') as f: + uu.decode(f) + + self.assertEqual( + stat.S_IMODE(os.stat(self.tmpout).st_mode), + expected_mode + ) + if __name__=="__main__": unittest.main() diff --git a/Lib/uu.py b/Lib/uu.py index 8333e864d8f9..9b1e5e607207 100755 --- a/Lib/uu.py +++ b/Lib/uu.py @@ -133,10 +133,7 @@ def decode(in_file, out_file=None, mode=None, quiet=False): out_file = sys.stdout.buffer elif isinstance(out_file, str): fp = open(out_file, 'wb') - try: - os.path.chmod(out_file, mode) - except AttributeError: - pass + os.chmod(out_file, mode) out_file = fp opened_files.append(out_file) # diff --git a/Misc/NEWS.d/next/Library/2018-06-10-14-08-52.bpo-33687.1zZdnA.rst b/Misc/NEWS.d/next/Library/2018-06-10-14-08-52.bpo-33687.1zZdnA.rst new file mode 100644 index 000000000000..63c5bfcac474 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-06-10-14-08-52.bpo-33687.1zZdnA.rst @@ -0,0 +1,2 @@ +Fix the call to ``os.chmod()`` for ``uu.decode()`` if a mode is given or +decoded. Patch by Timo Furrer. 
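For illustration only (not part of the patch): a rough round-trip sketch of the
behaviour the fix restores. The file names are arbitrary and a build containing
this change is assumed:

    import os
    import stat
    import uu

    with open('payload.bin', 'wb') as f:
        f.write(b'hello')

    # Record an explicit mode in the uuencode "begin" header ...
    uu.encode('payload.bin', 'payload.uu', mode=0o444)
    # ... then decode; decode() now actually chmods the output file.
    uu.decode('payload.uu', 'decoded.bin')

    print(oct(stat.S_IMODE(os.stat('decoded.bin').st_mode)))  # 0o444 with the fix
    os.chmod('decoded.bin', 0o644)  # restore write access so the file can be removed
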
From webhook-mailer at python.org Thu Jan 17 09:33:03 2019 From: webhook-mailer at python.org (Berker Peksag) Date: Thu, 17 Jan 2019 14:33:03 -0000 Subject: [Python-checkins] bpo-33687: Fix call to os.chmod() in uu.decode() (GH-7282) Message-ID: https://github.com/python/cpython/commit/a261b737617ca8d52e04bf3ead346b1b8786a212 commit: a261b737617ca8d52e04bf3ead346b1b8786a212 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Berker Peksag date: 2019-01-17T17:32:59+03:00 summary: bpo-33687: Fix call to os.chmod() in uu.decode() (GH-7282) (cherry picked from commit 17f05bbc78dbcd1db308266c31370da9ec1b1d47) Co-authored-by: Timo Furrer files: A Misc/NEWS.d/next/Library/2018-06-10-14-08-52.bpo-33687.1zZdnA.rst M Lib/test/test_uu.py M Lib/uu.py diff --git a/Lib/test/test_uu.py b/Lib/test/test_uu.py index 1147205a3b53..c9f05e5b760d 100644 --- a/Lib/test/test_uu.py +++ b/Lib/test/test_uu.py @@ -6,6 +6,8 @@ import unittest from test import support +import os +import stat import sys import uu import io @@ -218,6 +220,23 @@ def test_decodetwice(self): with open(self.tmpin, 'rb') as f: self.assertRaises(uu.Error, uu.decode, f) + def test_decode_mode(self): + # Verify that decode() will set the given mode for the out_file + expected_mode = 0o444 + with open(self.tmpin, 'wb') as f: + f.write(encodedtextwrapped(expected_mode, self.tmpout)) + + # make file writable again, so it can be removed (Windows only) + self.addCleanup(os.chmod, self.tmpout, expected_mode | stat.S_IWRITE) + + with open(self.tmpin, 'rb') as f: + uu.decode(f) + + self.assertEqual( + stat.S_IMODE(os.stat(self.tmpout).st_mode), + expected_mode + ) + if __name__=="__main__": unittest.main() diff --git a/Lib/uu.py b/Lib/uu.py index 8333e864d8f9..9b1e5e607207 100755 --- a/Lib/uu.py +++ b/Lib/uu.py @@ -133,10 +133,7 @@ def decode(in_file, out_file=None, mode=None, quiet=False): out_file = sys.stdout.buffer elif isinstance(out_file, str): fp = open(out_file, 'wb') - try: - os.path.chmod(out_file, mode) - except AttributeError: - pass + os.chmod(out_file, mode) out_file = fp opened_files.append(out_file) # diff --git a/Misc/NEWS.d/next/Library/2018-06-10-14-08-52.bpo-33687.1zZdnA.rst b/Misc/NEWS.d/next/Library/2018-06-10-14-08-52.bpo-33687.1zZdnA.rst new file mode 100644 index 000000000000..63c5bfcac474 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-06-10-14-08-52.bpo-33687.1zZdnA.rst @@ -0,0 +1,2 @@ +Fix the call to ``os.chmod()`` for ``uu.decode()`` if a mode is given or +decoded. Patch by Timo Furrer. From webhook-mailer at python.org Thu Jan 17 18:44:23 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Thu, 17 Jan 2019 23:44:23 -0000 Subject: [Python-checkins] bpo-34161: Update idlelib/NEWS.txt to 2019 Jan 17 (GH-11597) Message-ID: https://github.com/python/cpython/commit/56c16057c639acc2fb89c6b783425320f23a5f6c commit: 56c16057c639acc2fb89c6b783425320f23a5f6c branch: master author: Terry Jan Reedy committer: GitHub date: 2019-01-17T18:44:13-05:00 summary: bpo-34161: Update idlelib/NEWS.txt to 2019 Jan 17 (GH-11597) files: M Lib/idlelib/NEWS.txt diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index d1748a21bce4..a458c395dda2 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -3,6 +3,21 @@ Released on 2019-10-20? ====================================== +bpo-35660: Fix imports in window module. + +bpo-35641: Properly format calltip for function without docstring. + +bpo-33987: Use ttk Frame for ttk widgets. 
+ +bpo-34055: Fix erroneous 'smart' indents and newlines in IDLE Shell. + +bpo-28097: Add Previous/Next History entries to Shell menu. + +bpo-35591: Find Selection now works when selection not found. + +bpo-35598: Update config_key: use PEP 8 names and ttk widgets, +make some objects global, and add tests. + bpo-35196: Speed up squeezer line counting. bpo-35208: Squeezer now counts wrapped lines before newlines. From webhook-mailer at python.org Thu Jan 17 19:00:55 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Fri, 18 Jan 2019 00:00:55 -0000 Subject: [Python-checkins] bpo-23156: Remove obsolete tix install directions (GH-11595) Message-ID: https://github.com/python/cpython/commit/cf27c06229eb4b8280bb5f2b93a57e33163411f4 commit: cf27c06229eb4b8280bb5f2b93a57e33163411f4 branch: master author: Terry Jan Reedy committer: GitHub date: 2019-01-17T19:00:51-05:00 summary: bpo-23156: Remove obsolete tix install directions (GH-11595) Tix was deprecated in 3.6 and the doc is wrong. New users should use ttk. files: M Doc/library/tkinter.tix.rst diff --git a/Doc/library/tkinter.tix.rst b/Doc/library/tkinter.tix.rst index 11ed75513778..88b936c47a6d 100644 --- a/Doc/library/tkinter.tix.rst +++ b/Doc/library/tkinter.tix.rst @@ -76,17 +76,6 @@ the following:: root = tix.Tk() root.tk.eval('package require Tix') -If this fails, you have a Tk installation problem which must be resolved before -proceeding. Use the environment variable :envvar:`TIX_LIBRARY` to point to the -installed Tix library directory, and make sure you have the dynamic -object library (:file:`tix8183.dll` or :file:`libtix8183.so`) in the same -directory that contains your Tk dynamic object library (:file:`tk8183.dll` or -:file:`libtk8183.so`). The directory with the dynamic object library should also -have a file called :file:`pkgIndex.tcl` (case sensitive), which contains the -line:: - - package ifneeded Tix 8.1 [list load "[file join $dir tix8183.dll]" Tix] - Tix Widgets ----------- From webhook-mailer at python.org Thu Jan 17 19:07:14 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 18 Jan 2019 00:07:14 -0000 Subject: [Python-checkins] bpo-23156: Remove obsolete tix install directions (GH-11595) Message-ID: https://github.com/python/cpython/commit/ebb08beb08461eb5f147aaca6f86cafa4ea15bff commit: ebb08beb08461eb5f147aaca6f86cafa4ea15bff branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-17T16:07:10-08:00 summary: bpo-23156: Remove obsolete tix install directions (GH-11595) Tix was deprecated in 3.6 and the doc is wrong. New users should use ttk. (cherry picked from commit cf27c06229eb4b8280bb5f2b93a57e33163411f4) Co-authored-by: Terry Jan Reedy files: M Doc/library/tkinter.tix.rst diff --git a/Doc/library/tkinter.tix.rst b/Doc/library/tkinter.tix.rst index 11ed75513778..88b936c47a6d 100644 --- a/Doc/library/tkinter.tix.rst +++ b/Doc/library/tkinter.tix.rst @@ -76,17 +76,6 @@ the following:: root = tix.Tk() root.tk.eval('package require Tix') -If this fails, you have a Tk installation problem which must be resolved before -proceeding. Use the environment variable :envvar:`TIX_LIBRARY` to point to the -installed Tix library directory, and make sure you have the dynamic -object library (:file:`tix8183.dll` or :file:`libtix8183.so`) in the same -directory that contains your Tk dynamic object library (:file:`tk8183.dll` or -:file:`libtk8183.so`). 
The directory with the dynamic object library should also -have a file called :file:`pkgIndex.tcl` (case sensitive), which contains the -line:: - - package ifneeded Tix 8.1 [list load "[file join $dir tix8183.dll]" Tix] - Tix Widgets ----------- From webhook-mailer at python.org Thu Jan 17 19:49:09 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Fri, 18 Jan 2019 00:49:09 -0000 Subject: [Python-checkins] bpo-34162: Update idlelib/NEWS.txt to 2019 Jan 17 (GH-11597) (GH-11598) Message-ID: https://github.com/python/cpython/commit/59d7bdb3386ab78ccf6edbbeba9669124515c707 commit: 59d7bdb3386ab78ccf6edbbeba9669124515c707 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Terry Jan Reedy date: 2019-01-17T19:49:04-05:00 summary: bpo-34162: Update idlelib/NEWS.txt to 2019 Jan 17 (GH-11597) (GH-11598) (cherry picked from commit 56c16057c639acc2fb89c6b783425320f23a5f6c) Co-authored-by: Terry Jan Reedy files: M Lib/idlelib/NEWS.txt diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index 8cd4011ac857..aa50145b9d95 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -3,6 +3,21 @@ Released on 2019-??-?? ====================================== +bpo-35660: Fix imports in window module. + +bpo-35641: Properly format calltip for function without docstring. + +bpo-33987: Use ttk Frame for ttk widgets. + +bpo-34055: Fix erroneous 'smart' indents and newlines in IDLE Shell. + +bpo-28097: Add Previous/Next History entries to Shell menu. + +bpo-35591: Find Selection now works when selection not found. + +bpo-35598: Update config_key: use PEP 8 names and ttk widgets, +make some objects global, and add tests. + bpo-35196: Speed up squeezer line counting. bpo-35208: Squeezer now counts wrapped lines before newlines. From webhook-mailer at python.org Thu Jan 17 20:00:51 2019 From: webhook-mailer at python.org (Ned Deily) Date: Fri, 18 Jan 2019 01:00:51 -0000 Subject: [Python-checkins] bpo-35601: Alleviate race condition when waiting for SIGALRM in test_asyncio (GH-11337) (GH-11348) Message-ID: https://github.com/python/cpython/commit/7eef540ab89e426b622373f43713521876447f2f commit: 7eef540ab89e426b622373f43713521876447f2f branch: 3.6 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Ned Deily date: 2019-01-17T20:00:46-05:00 summary: bpo-35601: Alleviate race condition when waiting for SIGALRM in test_asyncio (GH-11337) (GH-11348) There is a race condition regarding signal delivery in test_signal_handling_args for test_asyncio.test_events.KqueueEventLoopTests. The signal can be received at any moment outside the time window provided in the test. The fix is to wait for the signal to be received instead with a bigger timeout. (cherry picked from commit 5471420faa84519530f29b08f2b042b2288e3e96) Co-authored-by: Pablo Galindo files: M Lib/test/test_asyncio/test_events.py diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py index 91d8a964f162..b62c0f3b7912 100644 --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -557,6 +557,7 @@ def my_handler(): self.loop.add_signal_handler(signal.SIGALRM, my_handler) signal.setitimer(signal.ITIMER_REAL, 0.01, 0) # Send SIGALRM once. 
+ self.loop.call_later(60, self.loop.stop) self.loop.run_forever() self.assertEqual(caught, 1) @@ -569,11 +570,12 @@ def my_handler(*args): nonlocal caught caught += 1 self.assertEqual(args, some_args) + self.loop.stop() self.loop.add_signal_handler(signal.SIGALRM, my_handler, *some_args) signal.setitimer(signal.ITIMER_REAL, 0.1, 0) # Send SIGALRM once. - self.loop.call_later(0.5, self.loop.stop) + self.loop.call_later(60, self.loop.stop) self.loop.run_forever() self.assertEqual(caught, 1) From webhook-mailer at python.org Thu Jan 17 20:02:47 2019 From: webhook-mailer at python.org (Ned Deily) Date: Fri, 18 Jan 2019 01:02:47 -0000 Subject: [Python-checkins] Make sure file object is close if socket.create_connection fails (GH-11334) (GH-11351) Message-ID: https://github.com/python/cpython/commit/dc020cc9800ae85f3a241b89ff5fcbc35ba39406 commit: dc020cc9800ae85f3a241b89ff5fcbc35ba39406 branch: 3.6 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Ned Deily date: 2019-01-17T20:02:43-05:00 summary: Make sure file object is close if socket.create_connection fails (GH-11334) (GH-11351) The problem affects _testWithTimeoutTriggeredSend in test_socket.py. (cherry picked from commit 1f511e1af060e98fb789319a96076c06e7f98135) Co-authored-by: Pablo Galindo files: M Lib/test/test_socket.py diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py index 56adec18c636..95c3938ac234 100644 --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -5363,11 +5363,10 @@ def testWithTimeout(self): def _testWithTimeoutTriggeredSend(self): address = self.serv.getsockname() - file = open(support.TESTFN, 'rb') - with socket.create_connection(address, timeout=0.01) as sock, \ - file as file: - meth = self.meth_from_sock(sock) - self.assertRaises(socket.timeout, meth, file) + with open(support.TESTFN, 'rb') as file: + with socket.create_connection(address, timeout=0.01) as sock: + meth = self.meth_from_sock(sock) + self.assertRaises(socket.timeout, meth, file) def testWithTimeoutTriggeredSend(self): conn = self.accept_conn() From webhook-mailer at python.org Thu Jan 17 20:07:42 2019 From: webhook-mailer at python.org (Ned Deily) Date: Fri, 18 Jan 2019 01:07:42 -0000 Subject: [Python-checkins] bpo-35525: Correct the argument name for NNTP.starttls() (GH-11310) (GH-11417) Message-ID: https://github.com/python/cpython/commit/7887c02d3372ebe3b39379588364134521a36c4e commit: 7887c02d3372ebe3b39379588364134521a36c4e branch: 3.6 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Ned Deily date: 2019-01-17T20:07:39-05:00 summary: bpo-35525: Correct the argument name for NNTP.starttls() (GH-11310) (GH-11417) (cherry picked from commit e9a044ec16989bd4b39763c0588c17200a925350) Co-authored-by: Harmandeep Singh files: M Doc/library/nntplib.rst diff --git a/Doc/library/nntplib.rst b/Doc/library/nntplib.rst index d8ef8a692a95..56188c7ef538 100644 --- a/Doc/library/nntplib.rst +++ b/Doc/library/nntplib.rst @@ -232,10 +232,10 @@ tuples or objects that the method normally returns will be empty. .. versionadded:: 3.2 -.. method:: NNTP.starttls(ssl_context=None) +.. method:: NNTP.starttls(context=None) Send a ``STARTTLS`` command. This will enable encryption on the NNTP - connection. The *ssl_context* argument is optional and should be a + connection. The *context* argument is optional and should be a :class:`ssl.SSLContext` object. Please read :ref:`ssl-security` for best practices. 
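For illustration only (not part of the change): a minimal sketch using the
corrected keyword name. The host name below is a placeholder, and a reachable
NNTP server offering STARTTLS is assumed:

    import nntplib
    import ssl

    context = ssl.create_default_context()
    # 'news.example.net' is a placeholder, not a real server.
    with nntplib.NNTP('news.example.net', 119) as server:
        server.starttls(context=context)   # keyword is 'context', not 'ssl_context'
        resp, count, first, last, name = server.group('comp.lang.python')
        print(resp)
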
From webhook-mailer at python.org Thu Jan 17 20:11:14 2019 From: webhook-mailer at python.org (Ned Deily) Date: Fri, 18 Jan 2019 01:11:14 -0000 Subject: [Python-checkins] bpo-35486: Note Py3.6 import system API requirement change (GH-11540) (GH-11588) Message-ID: https://github.com/python/cpython/commit/1edb3dc6ff70db88a7e89586578e58a86ee0e75e commit: 1edb3dc6ff70db88a7e89586578e58a86ee0e75e branch: 3.6 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Ned Deily date: 2019-01-17T20:11:09-05:00 summary: bpo-35486: Note Py3.6 import system API requirement change (GH-11540) (GH-11588) While the introduction of ModuleNotFoundError was fully backwards compatible on the import API consumer side, folks providing alternative implementations of `__import__` need to make an update to be forward compatible with clients that start relying on the new subclass. https://bugs.python.org/issue35486 (cherry picked from commit cee29b46a19116261b083dc803217aa754c7df40) Co-authored-by: Nick Coghlan files: M Doc/library/importlib.rst M Doc/whatsnew/3.6.rst diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst index 25055f7ce39c..30293d4d985e 100644 --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -1488,7 +1488,8 @@ Python 3.6 and newer for other parts of the code). if spec is not None: break else: - raise ImportError(f'No module named {absolute_name!r}') + msg = f'No module named {absolute_name!r}' + raise ModuleNotFoundError(msg, name=absolute_name) module = importlib.util.module_from_spec(spec) spec.loader.exec_module(module) sys.modules[absolute_name] = module diff --git a/Doc/whatsnew/3.6.rst b/Doc/whatsnew/3.6.rst index 009584428fc1..bd5c6340130e 100644 --- a/Doc/whatsnew/3.6.rst +++ b/Doc/whatsnew/3.6.rst @@ -2328,6 +2328,17 @@ Changes in the Python API a :exc:`DeprecationWarning` in Python 3.6 and a :exc:`RuntimeError` in Python 3.8. +* With the introduction of :exc:`ModuleNotFoundError`, import system consumers + may start expecting import system replacements to raise that more specific + exception when appropriate, rather than the less-specific :exc:`ImportError`. + To provide future compatibility with such consumers, implementors of + alternative import systems that completely replace :func:`__import__` will + need to update their implementations to raise the new subclass when a module + can't be found at all. Implementors of compliant plugins to the default + import system shouldn't need to make any changes, as the default import + system will raise the new subclass when appropriate. 
+ + Changes in the C API -------------------- From webhook-mailer at python.org Thu Jan 17 21:26:16 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Fri, 18 Jan 2019 02:26:16 -0000 Subject: [Python-checkins] bpo-35730: IDLE - test squeezer reload() by checking load_font() (GH-11585) Message-ID: https://github.com/python/cpython/commit/e55cf024cae203f63b4f78f1b21c1375fe424441 commit: e55cf024cae203f63b4f78f1b21c1375fe424441 branch: master author: Tal Einat committer: Terry Jan Reedy date: 2019-01-17T21:26:06-05:00 summary: bpo-35730: IDLE - test squeezer reload() by checking load_font() (GH-11585) files: M Lib/idlelib/idle_test/test_squeezer.py diff --git a/Lib/idlelib/idle_test/test_squeezer.py b/Lib/idlelib/idle_test/test_squeezer.py index 71eccd3693f0..4e3da030a3ad 100644 --- a/Lib/idlelib/idle_test/test_squeezer.py +++ b/Lib/idlelib/idle_test/test_squeezer.py @@ -293,25 +293,21 @@ def test_squeeze_text_before_existing_squeezed_text(self): def test_reload(self): """Test the reload() class-method.""" editwin = self.make_mock_editor_window(with_text_widget=True) - text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) + squeezer.load_font = Mock() - orig_zero_char_width = squeezer.zero_char_width orig_auto_squeeze_min_lines = squeezer.auto_squeeze_min_lines - # Increase both font size and auto-squeeze-min-lines. - text_widget["font"] = ('Courier', 20) + # Increase auto-squeeze-min-lines. new_auto_squeeze_min_lines = orig_auto_squeeze_min_lines + 10 self.set_idleconf_option_with_cleanup( 'main', 'PyShell', 'auto-squeeze-min-lines', str(new_auto_squeeze_min_lines)) Squeezer.reload() - # The following failed on Gentoo buildbots. Issue title will be - # IDLE: Fix squeezer test_reload. - #self.assertGreater(squeezer.zero_char_width, orig_zero_char_width) self.assertEqual(squeezer.auto_squeeze_min_lines, new_auto_squeeze_min_lines) + squeezer.load_font.assert_called() def test_reload_no_squeezer_instances(self): """Test that Squeezer.reload() runs without any instances existing.""" From webhook-mailer at python.org Thu Jan 17 21:44:14 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 18 Jan 2019 02:44:14 -0000 Subject: [Python-checkins] bpo-35730: IDLE - test squeezer reload() by checking load_font() (GH-11585) Message-ID: https://github.com/python/cpython/commit/237f864c905531b2da211bebc5f6109b0b797bac commit: 237f864c905531b2da211bebc5f6109b0b797bac branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-17T18:44:09-08:00 summary: bpo-35730: IDLE - test squeezer reload() by checking load_font() (GH-11585) (cherry picked from commit e55cf024cae203f63b4f78f1b21c1375fe424441) Co-authored-by: Tal Einat files: M Lib/idlelib/idle_test/test_squeezer.py diff --git a/Lib/idlelib/idle_test/test_squeezer.py b/Lib/idlelib/idle_test/test_squeezer.py index 71eccd3693f0..4e3da030a3ad 100644 --- a/Lib/idlelib/idle_test/test_squeezer.py +++ b/Lib/idlelib/idle_test/test_squeezer.py @@ -293,25 +293,21 @@ def test_squeeze_text_before_existing_squeezed_text(self): def test_reload(self): """Test the reload() class-method.""" editwin = self.make_mock_editor_window(with_text_widget=True) - text_widget = editwin.text squeezer = self.make_squeezer_instance(editwin) + squeezer.load_font = Mock() - orig_zero_char_width = squeezer.zero_char_width orig_auto_squeeze_min_lines = squeezer.auto_squeeze_min_lines - # Increase both font size and auto-squeeze-min-lines. 
- text_widget["font"] = ('Courier', 20) + # Increase auto-squeeze-min-lines. new_auto_squeeze_min_lines = orig_auto_squeeze_min_lines + 10 self.set_idleconf_option_with_cleanup( 'main', 'PyShell', 'auto-squeeze-min-lines', str(new_auto_squeeze_min_lines)) Squeezer.reload() - # The following failed on Gentoo buildbots. Issue title will be - # IDLE: Fix squeezer test_reload. - #self.assertGreater(squeezer.zero_char_width, orig_zero_char_width) self.assertEqual(squeezer.auto_squeeze_min_lines, new_auto_squeeze_min_lines) + squeezer.load_font.assert_called() def test_reload_no_squeezer_instances(self): """Test that Squeezer.reload() runs without any instances existing.""" From webhook-mailer at python.org Fri Jan 18 00:47:53 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Fri, 18 Jan 2019 05:47:53 -0000 Subject: [Python-checkins] bpo-34850: Emit a warning for "is" and "is not" with a literal. (GH-9642) Message-ID: https://github.com/python/cpython/commit/3bcbedc9f1471d957a30a90f9d1251516b422416 commit: 3bcbedc9f1471d957a30a90f9d1251516b422416 branch: master author: Serhiy Storchaka committer: GitHub date: 2019-01-18T07:47:48+02:00 summary: bpo-34850: Emit a warning for "is" and "is not" with a literal. (GH-9642) files: A Misc/NEWS.d/next/Core and Builtins/2018-09-30-11-19-55.bpo-34850.CbgDwb.rst M Doc/whatsnew/3.8.rst M Lib/test/test_ast.py M Lib/test/test_grammar.py M Python/compile.c diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index bf8d8f15434b..0360be604f56 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -442,6 +442,13 @@ Changes in Python behavior in the leftmost :keyword:`!for` clause). (Contributed by Serhiy Storchaka in :issue:`10544`.) +* The compiler now produces a :exc:`SyntaxWarning` when identity checks + (``is`` and ``is not``) are used with certain types of literals + (e.g. strings, ints). These can often work by accident in CPython, + but are not guaranteed by the language spec. The warning advises users + to use equality tests (``==`` and ``!=``) instead. + (Contributed by Serhiy Storchaka in :issue:`34850`.) 
+ Changes in the Python API ------------------------- diff --git a/Lib/test/test_ast.py b/Lib/test/test_ast.py index 2c8d8ab7e3fe..897e705a42ca 100644 --- a/Lib/test/test_ast.py +++ b/Lib/test/test_ast.py @@ -1146,11 +1146,12 @@ def test_stdlib_validates(self): tests = [fn for fn in os.listdir(stdlib) if fn.endswith(".py")] tests.extend(["test/test_grammar.py", "test/test_unpack_ex.py"]) for module in tests: - fn = os.path.join(stdlib, module) - with open(fn, "r", encoding="utf-8") as fp: - source = fp.read() - mod = ast.parse(source, fn) - compile(mod, fn, "exec") + with self.subTest(module): + fn = os.path.join(stdlib, module) + with open(fn, "r", encoding="utf-8") as fp: + source = fp.read() + mod = ast.parse(source, fn) + compile(mod, fn, "exec") class ConstantTests(unittest.TestCase): diff --git a/Lib/test/test_grammar.py b/Lib/test/test_grammar.py index 3d8b1514f0cd..3ed19ff1cb04 100644 --- a/Lib/test/test_grammar.py +++ b/Lib/test/test_grammar.py @@ -1226,11 +1226,33 @@ def test_comparison(self): if 1 > 1: pass if 1 <= 1: pass if 1 >= 1: pass - if 1 is 1: pass - if 1 is not 1: pass + if x is x: pass + if x is not x: pass if 1 in (): pass if 1 not in (): pass - if 1 < 1 > 1 == 1 >= 1 <= 1 != 1 in 1 not in 1 is 1 is not 1: pass + if 1 < 1 > 1 == 1 >= 1 <= 1 != 1 in 1 not in x is x is not x: pass + + def test_comparison_is_literal(self): + def check(test, msg='"is" with a literal'): + with self.assertWarnsRegex(SyntaxWarning, msg): + compile(test, '', 'exec') + with warnings.catch_warnings(): + warnings.filterwarnings('error', category=SyntaxWarning) + with self.assertRaisesRegex(SyntaxError, msg): + compile(test, '', 'exec') + + check('x is 1') + check('x is "thing"') + check('1 is x') + check('x is y is 1') + check('x is not 1', '"is not" with a literal') + + with warnings.catch_warnings(): + warnings.filterwarnings('error', category=SyntaxWarning) + compile('x is None', '', 'exec') + compile('x is False', '', 'exec') + compile('x is True', '', 'exec') + compile('x is ...', '', 'exec') def test_binary_mask_ops(self): x = 1 & 1 @@ -1520,9 +1542,11 @@ def test_paren_evaluation(self): self.assertEqual(16 // (4 // 2), 8) self.assertEqual((16 // 4) // 2, 2) self.assertEqual(16 // 4 // 2, 2) - self.assertTrue(False is (2 is 3)) - self.assertFalse((False is 2) is 3) - self.assertFalse(False is 2 is 3) + x = 2 + y = 3 + self.assertTrue(False is (x is y)) + self.assertFalse((False is x) is y) + self.assertFalse(False is x is y) def test_matrix_mul(self): # This is not intended to be a comprehensive test, rather just to be few diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-09-30-11-19-55.bpo-34850.CbgDwb.rst b/Misc/NEWS.d/next/Core and Builtins/2018-09-30-11-19-55.bpo-34850.CbgDwb.rst new file mode 100644 index 000000000000..bc5d5d1d5808 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-09-30-11-19-55.bpo-34850.CbgDwb.rst @@ -0,0 +1,5 @@ +The compiler now produces a :exc:`SyntaxWarning` when identity checks +(``is`` and ``is not``) are used with certain types of literals +(e.g. strings, ints). These can often work by accident in CPython, +but are not guaranteed by the language spec. The warning advises users +to use equality tests (``==`` and ``!=``) instead. diff --git a/Python/compile.c b/Python/compile.c index 45e78cb22cd8..5aebda0da4d1 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -2284,6 +2284,47 @@ compiler_class(struct compiler *c, stmt_ty s) return 1; } +/* Return 0 if the expression is a constant value except named singletons. + Return 1 otherwise. 
*/ +static int +check_is_arg(expr_ty e) +{ + if (e->kind != Constant_kind) { + return 1; + } + PyObject *value = e->v.Constant.value; + return (value == Py_None + || value == Py_False + || value == Py_True + || value == Py_Ellipsis); +} + +/* Check operands of identity chacks ("is" and "is not"). + Emit a warning if any operand is a constant except named singletons. + Return 0 on error. + */ +static int +check_compare(struct compiler *c, expr_ty e) +{ + Py_ssize_t i, n; + int left = check_is_arg(e->v.Compare.left); + n = asdl_seq_LEN(e->v.Compare.ops); + for (i = 0; i < n; i++) { + cmpop_ty op = (cmpop_ty)asdl_seq_GET(e->v.Compare.ops, i); + int right = check_is_arg((expr_ty)asdl_seq_GET(e->v.Compare.comparators, i)); + if (op == Is || op == IsNot) { + if (!right || !left) { + const char *msg = (op == Is) + ? "\"is\" with a literal. Did you mean \"==\"?" + : "\"is not\" with a literal. Did you mean \"!=\"?"; + return compiler_warn(c, msg); + } + } + left = right; + } + return 1; +} + static int cmpop(cmpop_ty op) { @@ -2363,6 +2404,9 @@ compiler_jump_if(struct compiler *c, expr_ty e, basicblock *next, int cond) return 1; } case Compare_kind: { + if (!check_compare(c, e)) { + return 0; + } Py_ssize_t i, n = asdl_seq_LEN(e->v.Compare.ops) - 1; if (n > 0) { basicblock *cleanup = compiler_new_block(c); @@ -3670,6 +3714,9 @@ compiler_compare(struct compiler *c, expr_ty e) { Py_ssize_t i, n; + if (!check_compare(c, e)) { + return 0; + } VISIT(c, expr, e->v.Compare.left); assert(asdl_seq_LEN(e->v.Compare.ops) > 0); n = asdl_seq_LEN(e->v.Compare.ops) - 1; From webhook-mailer at python.org Fri Jan 18 02:10:01 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Fri, 18 Jan 2019 07:10:01 -0000 Subject: [Python-checkins] bpo-35769: Change IDLE's name for new files from 'Untitled' to 'untitled' (GH-11602) Message-ID: https://github.com/python/cpython/commit/a902239f22c322d8988c514dd1c724aade3e4ef3 commit: a902239f22c322d8988c514dd1c724aade3e4ef3 branch: master author: Terry Jan Reedy committer: GitHub date: 2019-01-18T02:09:53-05:00 summary: bpo-35769: Change IDLE's name for new files from 'Untitled' to 'untitled' (GH-11602) 'Untitled' violates the PEP 8 standard for .py files files: A Misc/NEWS.d/next/IDLE/2019-01-18-01-24-23.bpo-35769.GqsB34.rst M Lib/idlelib/NEWS.txt M Lib/idlelib/editor.py diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index a458c395dda2..222f18710a74 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -3,6 +3,8 @@ Released on 2019-10-20? ====================================== +bpo-35769: Change new file name from 'Untitled' to 'untitled'. + bpo-35660: Fix imports in window module. bpo-35641: Properly format calltip for function without docstring. 
diff --git a/Lib/idlelib/editor.py b/Lib/idlelib/editor.py index d13ac3786da4..e05b52a96dcc 100644 --- a/Lib/idlelib/editor.py +++ b/Lib/idlelib/editor.py @@ -943,7 +943,7 @@ def saved_change_hook(self): elif long: title = long else: - title = "Untitled" + title = "untitled" icon = short or long or title if not self.get_saved(): title = "*%s*" % title @@ -965,7 +965,7 @@ def short_title(self): if filename: filename = os.path.basename(filename) else: - filename = "Untitled" + filename = "untitled" # return unicode string to display non-ASCII chars correctly return self._filename_to_unicode(filename) diff --git a/Misc/NEWS.d/next/IDLE/2019-01-18-01-24-23.bpo-35769.GqsB34.rst b/Misc/NEWS.d/next/IDLE/2019-01-18-01-24-23.bpo-35769.GqsB34.rst new file mode 100644 index 000000000000..79003a984a9f --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2019-01-18-01-24-23.bpo-35769.GqsB34.rst @@ -0,0 +1 @@ +Change IDLE's new file name from 'Untitled' to 'untitled' From webhook-mailer at python.org Fri Jan 18 02:24:14 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 18 Jan 2019 07:24:14 -0000 Subject: [Python-checkins] bpo-35769: Change IDLE's name for new files from 'Untitled' to 'untitled' (GH-11602) Message-ID: https://github.com/python/cpython/commit/5f9a168a313485791d85250e5bf673b66bd51244 commit: 5f9a168a313485791d85250e5bf673b66bd51244 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-17T23:24:09-08:00 summary: bpo-35769: Change IDLE's name for new files from 'Untitled' to 'untitled' (GH-11602) 'Untitled' violates the PEP 8 standard for .py files (cherry picked from commit a902239f22c322d8988c514dd1c724aade3e4ef3) Co-authored-by: Terry Jan Reedy files: A Misc/NEWS.d/next/IDLE/2019-01-18-01-24-23.bpo-35769.GqsB34.rst M Lib/idlelib/NEWS.txt M Lib/idlelib/editor.py diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index aa50145b9d95..a1964def362b 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -3,6 +3,8 @@ Released on 2019-??-?? ====================================== +bpo-35769: Change new file name from 'Untitled' to 'untitled'. + bpo-35660: Fix imports in window module. bpo-35641: Properly format calltip for function without docstring. 
diff --git a/Lib/idlelib/editor.py b/Lib/idlelib/editor.py index d13ac3786da4..e05b52a96dcc 100644 --- a/Lib/idlelib/editor.py +++ b/Lib/idlelib/editor.py @@ -943,7 +943,7 @@ def saved_change_hook(self): elif long: title = long else: - title = "Untitled" + title = "untitled" icon = short or long or title if not self.get_saved(): title = "*%s*" % title @@ -965,7 +965,7 @@ def short_title(self): if filename: filename = os.path.basename(filename) else: - filename = "Untitled" + filename = "untitled" # return unicode string to display non-ASCII chars correctly return self._filename_to_unicode(filename) diff --git a/Misc/NEWS.d/next/IDLE/2019-01-18-01-24-23.bpo-35769.GqsB34.rst b/Misc/NEWS.d/next/IDLE/2019-01-18-01-24-23.bpo-35769.GqsB34.rst new file mode 100644 index 000000000000..79003a984a9f --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2019-01-18-01-24-23.bpo-35769.GqsB34.rst @@ -0,0 +1 @@ +Change IDLE's new file name from 'Untitled' to 'untitled' From solipsis at pitrou.net Fri Jan 18 04:08:48 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 18 Jan 2019 09:08:48 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=4 Message-ID: <20190118090848.1.AEB771960435DF80@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [0, 7, -7] memory blocks, sum=0 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [1, -2, 2] memory blocks, sum=1 test_multiprocessing_forkserver leaked [1, 0, -2] memory blocks, sum=-1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/refloggKInIx', '--timeout', '7200'] From webhook-mailer at python.org Fri Jan 18 04:50:52 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Fri, 18 Jan 2019 09:50:52 -0000 Subject: [Python-checkins] bpo-35283: Update the docstring of threading.Thread.join method (GH-11596) Message-ID: https://github.com/python/cpython/commit/36d9e9a4d5238d5a2f09679b6c51be66fbfc12c4 commit: 36d9e9a4d5238d5a2f09679b6c51be66fbfc12c4 branch: master author: Dong-hee Na committer: Victor Stinner date: 2019-01-18T10:50:47+01:00 summary: bpo-35283: Update the docstring of threading.Thread.join method (GH-11596) files: M Lib/threading.py diff --git a/Lib/threading.py b/Lib/threading.py index 7bc8a8573c72..000981f6d963 100644 --- a/Lib/threading.py +++ b/Lib/threading.py @@ -1007,7 +1007,7 @@ def join(self, timeout=None): When the timeout argument is present and not None, it should be a floating point number specifying a timeout for the operation in seconds (or fractions thereof). As join() always returns None, you must call - isAlive() after join() to decide whether a timeout happened -- if the + is_alive() after join() to decide whether a timeout happened -- if the thread is still alive, the join() call timed out. 
When the timeout argument is not present or None, the operation will From webhook-mailer at python.org Fri Jan 18 09:09:49 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Fri, 18 Jan 2019 14:09:49 -0000 Subject: [Python-checkins] bpo-35283: Add pending deprecation warning for Thread.isAlive (GH-11604) Message-ID: https://github.com/python/cpython/commit/c2647f2e45d2741fc44fd621966e05d15f2cd26a commit: c2647f2e45d2741fc44fd621966e05d15f2cd26a branch: 3.7 author: Dong-hee Na committer: Victor Stinner date: 2019-01-18T15:09:43+01:00 summary: bpo-35283: Add pending deprecation warning for Thread.isAlive (GH-11604) Add a pending deprecated warning for the threading.Thread.isAlive() method. files: A Misc/NEWS.d/next/Library/2019-01-07-17-17-16.bpo-35283.WClosC.rst M Lib/test/support/__init__.py M Lib/test/test_threading.py M Lib/threading.py diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 25c05edad340..a9cfa2a39540 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -2224,14 +2224,14 @@ def start_threads(threads, unlock=None): endtime += 60 for t in started: t.join(max(endtime - time.monotonic(), 0.01)) - started = [t for t in started if t.isAlive()] + started = [t for t in started if t.is_alive()] if not started: break if verbose: print('Unable to join %d threads during a period of ' '%d minutes' % (len(started), timeout)) finally: - started = [t for t in started if t.isAlive()] + started = [t for t in started if t.is_alive()] if started: faulthandler.dump_traceback(sys.stdout) raise AssertionError('Unable to join %d threads' % len(started)) diff --git a/Lib/test/test_threading.py b/Lib/test/test_threading.py index 8160a5af0064..27f328dbe63c 100644 --- a/Lib/test/test_threading.py +++ b/Lib/test/test_threading.py @@ -415,7 +415,8 @@ def test_old_threading_api(self): t.setDaemon(True) t.getName() t.setName("name") - t.isAlive() + with self.assertWarnsRegex(PendingDeprecationWarning, 'use is_alive()'): + t.isAlive() e = threading.Event() e.isSet() threading.activeCount() diff --git a/Lib/threading.py b/Lib/threading.py index bb41456fb141..f260a7cceca4 100644 --- a/Lib/threading.py +++ b/Lib/threading.py @@ -1007,7 +1007,7 @@ def join(self, timeout=None): When the timeout argument is present and not None, it should be a floating point number specifying a timeout for the operation in seconds (or fractions thereof). As join() always returns None, you must call - isAlive() after join() to decide whether a timeout happened -- if the + is_alive() after join() to decide whether a timeout happened -- if the thread is still alive, the join() call timed out. When the timeout argument is not present or None, the operation will @@ -1091,7 +1091,15 @@ def is_alive(self): self._wait_for_tstate_lock(False) return not self._is_stopped - isAlive = is_alive + def isAlive(self): + """Return whether the thread is alive. + + This method is deprecated, use is_alive() instead. + """ + import warnings + warnings.warn('isAlive() is deprecated, use is_alive() instead', + PendingDeprecationWarning, stacklevel=2) + return self.is_alive() @property def daemon(self): diff --git a/Misc/NEWS.d/next/Library/2019-01-07-17-17-16.bpo-35283.WClosC.rst b/Misc/NEWS.d/next/Library/2019-01-07-17-17-16.bpo-35283.WClosC.rst new file mode 100644 index 000000000000..99544f4cf4ff --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-07-17-17-16.bpo-35283.WClosC.rst @@ -0,0 +1,2 @@ +Add a pending deprecated warning for the :meth:`threading.Thread.isAlive` method. 
+Patch by Dong-hee Na. From webhook-mailer at python.org Fri Jan 18 10:09:33 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Fri, 18 Jan 2019 15:09:33 -0000 Subject: [Python-checkins] bpo-35045: Accept TLSv1 default in min max test (GH-11510) Message-ID: https://github.com/python/cpython/commit/34de2d312b3687994ddbc29adb66e88f672034c7 commit: 34de2d312b3687994ddbc29adb66e88f672034c7 branch: master author: Christian Heimes committer: Victor Stinner date: 2019-01-18T16:09:30+01:00 summary: bpo-35045: Accept TLSv1 default in min max test (GH-11510) Make ssl tests less strict and also accept TLSv1 as system default. The changes unbreaks test_min_max_version on Fedora 29. Signed-off-by: Christian Heimes files: A Misc/NEWS.d/next/Tests/2019-01-10-18-35-42.bpo-35045.qdd6d9.rst M Lib/test/test_ssl.py diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index 1fc657f4d867..9e571cc78e4b 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -1088,8 +1088,11 @@ def test_hostname_checks_common_name(self): "required OpenSSL 1.1.0g") def test_min_max_version(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER) - self.assertEqual( - ctx.minimum_version, ssl.TLSVersion.MINIMUM_SUPPORTED + # OpenSSL default is MINIMUM_SUPPORTED, however some vendors like + # Fedora override the setting to TLS 1.0. + self.assertIn( + ctx.minimum_version, + {ssl.TLSVersion.MINIMUM_SUPPORTED, ssl.TLSVersion.TLSv1} ) self.assertEqual( ctx.maximum_version, ssl.TLSVersion.MAXIMUM_SUPPORTED diff --git a/Misc/NEWS.d/next/Tests/2019-01-10-18-35-42.bpo-35045.qdd6d9.rst b/Misc/NEWS.d/next/Tests/2019-01-10-18-35-42.bpo-35045.qdd6d9.rst new file mode 100644 index 000000000000..630a22d77868 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2019-01-10-18-35-42.bpo-35045.qdd6d9.rst @@ -0,0 +1,2 @@ +Make ssl tests less strict and also accept TLSv1 as system default. The +changes unbreaks test_min_max_version on Fedora 29. From webhook-mailer at python.org Fri Jan 18 10:29:13 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 18 Jan 2019 15:29:13 -0000 Subject: [Python-checkins] bpo-35045: Accept TLSv1 default in min max test (GH-11510) Message-ID: https://github.com/python/cpython/commit/6ca7183b3549d3eaa8a0c3b73255eeac24d7974d commit: 6ca7183b3549d3eaa8a0c3b73255eeac24d7974d branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-18T07:29:08-08:00 summary: bpo-35045: Accept TLSv1 default in min max test (GH-11510) Make ssl tests less strict and also accept TLSv1 as system default. The changes unbreaks test_min_max_version on Fedora 29. Signed-off-by: Christian Heimes (cherry picked from commit 34de2d312b3687994ddbc29adb66e88f672034c7) Co-authored-by: Christian Heimes files: A Misc/NEWS.d/next/Tests/2019-01-10-18-35-42.bpo-35045.qdd6d9.rst M Lib/test/test_ssl.py diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index b6794ce3a817..38ecddd07d98 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -1108,8 +1108,11 @@ def test_hostname_checks_common_name(self): "required OpenSSL 1.1.0g") def test_min_max_version(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER) - self.assertEqual( - ctx.minimum_version, ssl.TLSVersion.MINIMUM_SUPPORTED + # OpenSSL default is MINIMUM_SUPPORTED, however some vendors like + # Fedora override the setting to TLS 1.0. 
+ self.assertIn( + ctx.minimum_version, + {ssl.TLSVersion.MINIMUM_SUPPORTED, ssl.TLSVersion.TLSv1} ) self.assertEqual( ctx.maximum_version, ssl.TLSVersion.MAXIMUM_SUPPORTED diff --git a/Misc/NEWS.d/next/Tests/2019-01-10-18-35-42.bpo-35045.qdd6d9.rst b/Misc/NEWS.d/next/Tests/2019-01-10-18-35-42.bpo-35045.qdd6d9.rst new file mode 100644 index 000000000000..630a22d77868 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2019-01-10-18-35-42.bpo-35045.qdd6d9.rst @@ -0,0 +1,2 @@ +Make ssl tests less strict and also accept TLSv1 as system default. The +changes unbreaks test_min_max_version on Fedora 29. From webhook-mailer at python.org Fri Jan 18 10:49:21 2019 From: webhook-mailer at python.org (Brian Curtin) Date: Fri, 18 Jan 2019 15:49:21 -0000 Subject: [Python-checkins] bpo-21257: document http.client.parse_headers (GH-11443) Message-ID: https://github.com/python/cpython/commit/478f8291327a3e3ab17b5857699565df43a9e952 commit: 478f8291327a3e3ab17b5857699565df43a9e952 branch: master author: Ashwin Ramaswami committer: Brian Curtin date: 2019-01-18T08:49:16-07:00 summary: bpo-21257: document http.client.parse_headers (GH-11443) Document http.client.parse_headers files: A Misc/NEWS.d/next/Documentation/2019-01-15-21-45-27.bpo-21257.U9LKkx.rst M Doc/library/http.client.rst diff --git a/Doc/library/http.client.rst b/Doc/library/http.client.rst index 3408c103e2f3..beaa720d732b 100644 --- a/Doc/library/http.client.rst +++ b/Doc/library/http.client.rst @@ -115,6 +115,25 @@ The module provides the following classes: The *strict* parameter was removed. HTTP 0.9 style "Simple Responses" are no longer supported. +This module provides the following function: + +.. function:: parse_headers(fp) + + Parse the headers from a file pointer *fp* representing a HTTP + request/response. The file has to be a :class:`BufferedIOBase` reader + (i.e. not text) and must provide a valid :rfc:`2822` style header. + + This function returns an instance of :class:`http.client.HTTPMessage` + that holds the header fields, but no payload + (the same as :attr:`HTTPResponse.msg` + and :attr:`http.server.BaseHTTPRequestHandler.headers`). + After returning, the file pointer *fp* is ready to read the HTTP body. + + .. note:: + :meth:`parse_headers` does not parse the start-line of a HTTP message; + it only parses the ``Name: value`` lines. The file has to be ready to + read these field lines, so the first line should already be consumed + before calling the function. The following exceptions are raised as appropriate: diff --git a/Misc/NEWS.d/next/Documentation/2019-01-15-21-45-27.bpo-21257.U9LKkx.rst b/Misc/NEWS.d/next/Documentation/2019-01-15-21-45-27.bpo-21257.U9LKkx.rst new file mode 100644 index 000000000000..ad035e95b51c --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2019-01-15-21-45-27.bpo-21257.U9LKkx.rst @@ -0,0 +1 @@ +Document :func:`http.client.parse_headers`. From webhook-mailer at python.org Fri Jan 18 14:00:49 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Fri, 18 Jan 2019 19:00:49 -0000 Subject: [Python-checkins] bpo-35770: IDLE macosx deletes Options => Configure IDLE. (GH-11614) Message-ID: https://github.com/python/cpython/commit/39ed289a3511d2e9bf0950a9d5dc53c8194f61b9 commit: 39ed289a3511d2e9bf0950a9d5dc53c8194f61b9 branch: master author: Terry Jan Reedy committer: GitHub date: 2019-01-18T14:00:45-05:00 summary: bpo-35770: IDLE macosx deletes Options => Configure IDLE. (GH-11614) It previously deleted Window => Zoom Height by mistake. (Zoom Height is now on the Options menu). 
On Mac, the settings dialog is accessed via Preferences on the IDLE menu. files: A Misc/NEWS.d/next/IDLE/2019-01-18-13-04-30.bpo-35770.2LxJGu.rst M Lib/idlelib/NEWS.txt M Lib/idlelib/macosx.py diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index 222f18710a74..61457e93d238 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -3,6 +3,11 @@ Released on 2019-10-20? ====================================== +bpo-35770: IDLE macosx deletes Options => Configure IDLE. +It previously deleted Window => Zoom Height by mistake. +(Zoom Height is now on the Options menu). On Mac, the settings +dialog is accessed via Preferences on the IDLE menu. + bpo-35769: Change new file name from 'Untitled' to 'untitled'. bpo-35660: Fix imports in window module. diff --git a/Lib/idlelib/macosx.py b/Lib/idlelib/macosx.py index 9be4ed2ec411..d6a1b376a1c2 100644 --- a/Lib/idlelib/macosx.py +++ b/Lib/idlelib/macosx.py @@ -178,7 +178,7 @@ def overrideRootMenu(root, flist): del mainmenu.menudefs[-1][1][0:2] # Remove the 'Configure Idle' entry from the options menu, it is in the # application menu as 'Preferences' - del mainmenu.menudefs[-2][1][0] + del mainmenu.menudefs[-3][1][0:1] menubar = Menu(root) root.configure(menu=menubar) menudict = {} diff --git a/Misc/NEWS.d/next/IDLE/2019-01-18-13-04-30.bpo-35770.2LxJGu.rst b/Misc/NEWS.d/next/IDLE/2019-01-18-13-04-30.bpo-35770.2LxJGu.rst new file mode 100644 index 000000000000..89e4bdef83ef --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2019-01-18-13-04-30.bpo-35770.2LxJGu.rst @@ -0,0 +1,3 @@ +IDLE macosx deletes Options => Configure IDLE. It previously deleted Window +=> Zoom Height by mistake. (Zoom Height is now on the Options menu). On +Mac, the settings dialog is accessed via Preferences on the IDLE menu. From webhook-mailer at python.org Fri Jan 18 14:16:05 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 18 Jan 2019 19:16:05 -0000 Subject: [Python-checkins] bpo-35770: IDLE macosx deletes Options => Configure IDLE. (GH-11614) Message-ID: https://github.com/python/cpython/commit/a01e23559fd77083a2c6c59692b70d7896e5f59a commit: a01e23559fd77083a2c6c59692b70d7896e5f59a branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-18T11:16:01-08:00 summary: bpo-35770: IDLE macosx deletes Options => Configure IDLE. (GH-11614) It previously deleted Window => Zoom Height by mistake. (Zoom Height is now on the Options menu). On Mac, the settings dialog is accessed via Preferences on the IDLE menu. (cherry picked from commit 39ed289a3511d2e9bf0950a9d5dc53c8194f61b9) Co-authored-by: Terry Jan Reedy files: A Misc/NEWS.d/next/IDLE/2019-01-18-13-04-30.bpo-35770.2LxJGu.rst M Lib/idlelib/NEWS.txt M Lib/idlelib/macosx.py diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index a1964def362b..4d3da19ac372 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -3,6 +3,11 @@ Released on 2019-??-?? ====================================== +bpo-35770: IDLE macosx deletes Options => Configure IDLE. +It previously deleted Window => Zoom Height by mistake. +(Zoom Height is now on the Options menu). On Mac, the settings +dialog is accessed via Preferences on the IDLE menu. + bpo-35769: Change new file name from 'Untitled' to 'untitled'. bpo-35660: Fix imports in window module. 
diff --git a/Lib/idlelib/macosx.py b/Lib/idlelib/macosx.py index 9be4ed2ec411..d6a1b376a1c2 100644 --- a/Lib/idlelib/macosx.py +++ b/Lib/idlelib/macosx.py @@ -178,7 +178,7 @@ def overrideRootMenu(root, flist): del mainmenu.menudefs[-1][1][0:2] # Remove the 'Configure Idle' entry from the options menu, it is in the # application menu as 'Preferences' - del mainmenu.menudefs[-2][1][0] + del mainmenu.menudefs[-3][1][0:1] menubar = Menu(root) root.configure(menu=menubar) menudict = {} diff --git a/Misc/NEWS.d/next/IDLE/2019-01-18-13-04-30.bpo-35770.2LxJGu.rst b/Misc/NEWS.d/next/IDLE/2019-01-18-13-04-30.bpo-35770.2LxJGu.rst new file mode 100644 index 000000000000..89e4bdef83ef --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2019-01-18-13-04-30.bpo-35770.2LxJGu.rst @@ -0,0 +1,3 @@ +IDLE macosx deletes Options => Configure IDLE. It previously deleted Window +=> Zoom Height by mistake. (Zoom Height is now on the Options menu). On +Mac, the settings dialog is accessed via Preferences on the IDLE menu. From webhook-mailer at python.org Fri Jan 18 14:30:31 2019 From: webhook-mailer at python.org (Serhiy Storchaka) Date: Fri, 18 Jan 2019 19:30:31 -0000 Subject: [Python-checkins] bpo-35733: Make isinstance(ast.Constant(boolean), ast.Num) be false. (GH-11547) Message-ID: https://github.com/python/cpython/commit/74176226179ed56ad1c910bec5c4100e72ab4e84 commit: 74176226179ed56ad1c910bec5c4100e72ab4e84 branch: master author: Anthony Sottile committer: Serhiy Storchaka date: 2019-01-18T21:30:28+02:00 summary: bpo-35733: Make isinstance(ast.Constant(boolean), ast.Num) be false. (GH-11547) files: A Misc/NEWS.d/next/Library/2019-01-13-18-42-41.bpo-35733.eFfLiv.rst M Lib/ast.py M Lib/test/test_ast.py diff --git a/Lib/ast.py b/Lib/ast.py index 4c8c7795ff82..03b8a1b16b7a 100644 --- a/Lib/ast.py +++ b/Lib/ast.py @@ -346,7 +346,10 @@ def __instancecheck__(cls, inst): except AttributeError: return False else: - return isinstance(value, _const_types[cls]) + return ( + isinstance(value, _const_types[cls]) and + not isinstance(value, _const_types_not.get(cls, ())) + ) return type.__instancecheck__(cls, inst) def _new(cls, *args, **kwargs): @@ -384,3 +387,6 @@ def __new__(cls, *args, **kwargs): NameConstant: (type(None), bool), Ellipsis: (type(...),), } +_const_types_not = { + Num: (bool,), +} diff --git a/Lib/test/test_ast.py b/Lib/test/test_ast.py index 897e705a42ca..4bf77ff046e1 100644 --- a/Lib/test/test_ast.py +++ b/Lib/test/test_ast.py @@ -411,12 +411,16 @@ def test_isinstance(self): self.assertFalse(isinstance(ast.Str('42'), ast.Bytes)) self.assertFalse(isinstance(ast.Num(42), ast.NameConstant)) self.assertFalse(isinstance(ast.Num(42), ast.Ellipsis)) + self.assertFalse(isinstance(ast.NameConstant(True), ast.Num)) + self.assertFalse(isinstance(ast.NameConstant(False), ast.Num)) self.assertFalse(isinstance(ast.Constant('42'), ast.Num)) self.assertFalse(isinstance(ast.Constant(42), ast.Str)) self.assertFalse(isinstance(ast.Constant('42'), ast.Bytes)) self.assertFalse(isinstance(ast.Constant(42), ast.NameConstant)) self.assertFalse(isinstance(ast.Constant(42), ast.Ellipsis)) + self.assertFalse(isinstance(ast.Constant(True), ast.Num)) + self.assertFalse(isinstance(ast.Constant(False), ast.Num)) self.assertFalse(isinstance(ast.Constant(), ast.Num)) self.assertFalse(isinstance(ast.Constant(), ast.Str)) diff --git a/Misc/NEWS.d/next/Library/2019-01-13-18-42-41.bpo-35733.eFfLiv.rst b/Misc/NEWS.d/next/Library/2019-01-13-18-42-41.bpo-35733.eFfLiv.rst new file mode 100644 index 000000000000..8e5ef9b84178 --- /dev/null +++ 
b/Misc/NEWS.d/next/Library/2019-01-13-18-42-41.bpo-35733.eFfLiv.rst @@ -0,0 +1,2 @@ +``ast.Constant(boolean)`` no longer an instance of :class:`ast.Num`. Patch by Anthony +Sottile. From webhook-mailer at python.org Fri Jan 18 17:05:43 2019 From: webhook-mailer at python.org (Terry Jan Reedy) Date: Fri, 18 Jan 2019 22:05:43 -0000 Subject: [Python-checkins] bpo-35770: Fix off-by-1 error. (#11618) Message-ID: https://github.com/python/cpython/commit/2cf1ddaff4c869780d9e796b21ef3e506f8ad321 commit: 2cf1ddaff4c869780d9e796b21ef3e506f8ad321 branch: master author: Terry Jan Reedy committer: GitHub date: 2019-01-18T17:05:40-05:00 summary: bpo-35770: Fix off-by-1 error. (#11618) files: M Lib/idlelib/macosx.py diff --git a/Lib/idlelib/macosx.py b/Lib/idlelib/macosx.py index d6a1b376a1c2..eeaab59ae802 100644 --- a/Lib/idlelib/macosx.py +++ b/Lib/idlelib/macosx.py @@ -178,7 +178,7 @@ def overrideRootMenu(root, flist): del mainmenu.menudefs[-1][1][0:2] # Remove the 'Configure Idle' entry from the options menu, it is in the # application menu as 'Preferences' - del mainmenu.menudefs[-3][1][0:1] + del mainmenu.menudefs[-3][1][0:2] menubar = Menu(root) root.configure(menu=menubar) menudict = {} From webhook-mailer at python.org Fri Jan 18 17:23:56 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 18 Jan 2019 22:23:56 -0000 Subject: [Python-checkins] bpo-35770: Fix off-by-1 error. (GH-11618) Message-ID: https://github.com/python/cpython/commit/47290e7642dd41d94437dd0e2c0f6bfceb0281b5 commit: 47290e7642dd41d94437dd0e2c0f6bfceb0281b5 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-18T14:23:51-08:00 summary: bpo-35770: Fix off-by-1 error. (GH-11618) (cherry picked from commit 2cf1ddaff4c869780d9e796b21ef3e506f8ad321) Co-authored-by: Terry Jan Reedy files: M Lib/idlelib/macosx.py diff --git a/Lib/idlelib/macosx.py b/Lib/idlelib/macosx.py index d6a1b376a1c2..eeaab59ae802 100644 --- a/Lib/idlelib/macosx.py +++ b/Lib/idlelib/macosx.py @@ -178,7 +178,7 @@ def overrideRootMenu(root, flist): del mainmenu.menudefs[-1][1][0:2] # Remove the 'Configure Idle' entry from the options menu, it is in the # application menu as 'Preferences' - del mainmenu.menudefs[-3][1][0:1] + del mainmenu.menudefs[-3][1][0:2] menubar = Menu(root) root.configure(menu=menubar) menudict = {} From solipsis at pitrou.net Sat Jan 19 04:08:58 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 19 Jan 2019 09:08:58 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=3 Message-ID: <20190119090858.1.5D9D1609B7245034@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_forkserver leaked [-2, 2, -1] memory blocks, sum=-1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogOPkOdZ', '--timeout', '7200'] From solipsis at pitrou.net Sun Jan 20 04:10:49 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 20 Jan 2019 09:10:49 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=5 Message-ID: <20190120091049.1.30106BA02BB3DFCE@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [-7, 8, 0] memory blocks, sum=1 test_functools leaked [0, 3, 1] memory blocks, sum=4 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', 
'3:3:/home/psf-users/antoine/refleaks/reflog2FFGXL', '--timeout', '7200'] From webhook-mailer at python.org Sun Jan 20 13:47:45 2019 From: webhook-mailer at python.org (Ned Deily) Date: Sun, 20 Jan 2019 18:47:45 -0000 Subject: [Python-checkins] bpo-35699: fix distuils cannot detect Build Tools 2017 anymore (GH-11495) Message-ID: https://github.com/python/cpython/commit/b2dc4a3313c236fedbd6df664722cd47f3d91a72 commit: b2dc4a3313c236fedbd6df664722cd47f3d91a72 branch: master author: Marc Schlaich committer: Ned Deily date: 2019-01-20T13:47:42-05:00 summary: bpo-35699: fix distuils cannot detect Build Tools 2017 anymore (GH-11495) files: A Misc/NEWS.d/next/Library/2019-01-11-07-09-25.bpo-35699.VDiENF.rst M Lib/distutils/_msvccompiler.py diff --git a/Lib/distutils/_msvccompiler.py b/Lib/distutils/_msvccompiler.py index 84b4ef59599d..58b20a210247 100644 --- a/Lib/distutils/_msvccompiler.py +++ b/Lib/distutils/_msvccompiler.py @@ -78,6 +78,7 @@ def _find_vc2017(): "-prerelease", "-requires", "Microsoft.VisualStudio.Component.VC.Tools.x86.x64", "-property", "installationPath", + "-products", "*", ], encoding="mbcs", errors="strict").strip() except (subprocess.CalledProcessError, OSError, UnicodeDecodeError): return None, None diff --git a/Misc/NEWS.d/next/Library/2019-01-11-07-09-25.bpo-35699.VDiENF.rst b/Misc/NEWS.d/next/Library/2019-01-11-07-09-25.bpo-35699.VDiENF.rst new file mode 100644 index 000000000000..e632e7bdcffe --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-11-07-09-25.bpo-35699.VDiENF.rst @@ -0,0 +1 @@ +Fixed detection of Visual Studio Build Tools 2017 in distutils \ No newline at end of file From webhook-mailer at python.org Sun Jan 20 13:58:00 2019 From: webhook-mailer at python.org (Pablo Galindo) Date: Sun, 20 Jan 2019 18:58:00 -0000 Subject: [Python-checkins] Add information about DeprecationWarning for invalid escaped characters in the re module (GH-5255) Message-ID: https://github.com/python/cpython/commit/e8239b8e8199b76ef647ff3bf080ce2eb7733e04 commit: e8239b8e8199b76ef647ff3bf080ce2eb7733e04 branch: master author: Pablo Galindo committer: GitHub date: 2019-01-20T18:57:56Z summary: Add information about DeprecationWarning for invalid escaped characters in the re module (GH-5255) files: M Doc/library/re.rst diff --git a/Doc/library/re.rst b/Doc/library/re.rst index 2f829559ff17..ac6455a22074 100644 --- a/Doc/library/re.rst +++ b/Doc/library/re.rst @@ -28,7 +28,10 @@ character for the same purpose in string literals; for example, to match a literal backslash, one might have to write ``'\\\\'`` as the pattern string, because the regular expression must be ``\\``, and each backslash must be expressed as ``\\`` inside a regular Python string -literal. +literal. Also, please note that any invalid escape sequences in Python's +usage of the backslash in string literals now generate a :exc:`DeprecationWarning` +and in the future this will become a :exc:`SyntaxError`. This behaviour +will happen even if it is a valid escape sequence for a regular expression. 
The solution is to use Python's raw string notation for regular expression patterns; backslashes are not handled in any special way in a string literal From webhook-mailer at python.org Sun Jan 20 14:06:11 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sun, 20 Jan 2019 19:06:11 -0000 Subject: [Python-checkins] bpo-35699: fix distuils cannot detect Build Tools 2017 anymore (GH-11495) Message-ID: https://github.com/python/cpython/commit/2fa53cfa8960f4bcb36718da4424e980fc305ef5 commit: 2fa53cfa8960f4bcb36718da4424e980fc305ef5 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-20T11:06:08-08:00 summary: bpo-35699: fix distuils cannot detect Build Tools 2017 anymore (GH-11495) (cherry picked from commit b2dc4a3313c236fedbd6df664722cd47f3d91a72) Co-authored-by: Marc Schlaich files: A Misc/NEWS.d/next/Library/2019-01-11-07-09-25.bpo-35699.VDiENF.rst M Lib/distutils/_msvccompiler.py diff --git a/Lib/distutils/_msvccompiler.py b/Lib/distutils/_msvccompiler.py index 84b4ef59599d..58b20a210247 100644 --- a/Lib/distutils/_msvccompiler.py +++ b/Lib/distutils/_msvccompiler.py @@ -78,6 +78,7 @@ def _find_vc2017(): "-prerelease", "-requires", "Microsoft.VisualStudio.Component.VC.Tools.x86.x64", "-property", "installationPath", + "-products", "*", ], encoding="mbcs", errors="strict").strip() except (subprocess.CalledProcessError, OSError, UnicodeDecodeError): return None, None diff --git a/Misc/NEWS.d/next/Library/2019-01-11-07-09-25.bpo-35699.VDiENF.rst b/Misc/NEWS.d/next/Library/2019-01-11-07-09-25.bpo-35699.VDiENF.rst new file mode 100644 index 000000000000..e632e7bdcffe --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-11-07-09-25.bpo-35699.VDiENF.rst @@ -0,0 +1 @@ +Fixed detection of Visual Studio Build Tools 2017 in distutils \ No newline at end of file From webhook-mailer at python.org Mon Jan 21 03:57:59 2019 From: webhook-mailer at python.org (Chris Withers) Date: Mon, 21 Jan 2019 08:57:59 -0000 Subject: [Python-checkins] bpo-20239: Allow repeated deletion of unittest.mock.Mock attributes (#11057) Message-ID: https://github.com/python/cpython/commit/222d303ade8aadf0adcae5190fac603bdcafe3f0 commit: 222d303ade8aadf0adcae5190fac603bdcafe3f0 branch: master author: Pablo Galindo committer: Chris Withers date: 2019-01-21T08:57:46Z summary: bpo-20239: Allow repeated deletion of unittest.mock.Mock attributes (#11057) * Allow repeated deletion of unittest.mock.Mock attributes * fixup! Allow repeated deletion of unittest.mock.Mock attributes * fixup! fixup! 
Allow repeated deletion of unittest.mock.Mock attributes files: A Misc/NEWS.d/next/Library/2018-12-09-21-35-49.bpo-20239.V4mWBL.rst M Lib/unittest/mock.py M Lib/unittest/test/testmock/testmock.py diff --git a/Lib/unittest/mock.py b/Lib/unittest/mock.py index 3a22a48c997f..ef5c55d6a165 100644 --- a/Lib/unittest/mock.py +++ b/Lib/unittest/mock.py @@ -729,11 +729,10 @@ def __delattr__(self, name): # not set on the instance itself return - if name in self.__dict__: - object.__delattr__(self, name) - obj = self._mock_children.get(name, _missing) - if obj is _deleted: + if name in self.__dict__: + super().__delattr__(name) + elif obj is _deleted: raise AttributeError(name) if obj is not _missing: del self._mock_children[name] diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/unittest/test/testmock/testmock.py index 193ae9f9acbf..64e2fcf61c12 100644 --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/unittest/test/testmock/testmock.py @@ -1769,6 +1769,33 @@ def test_attribute_deletion(self): self.assertRaises(AttributeError, getattr, mock, 'f') + def test_mock_does_not_raise_on_repeated_attribute_deletion(self): + # bpo-20239: Assigning and deleting twice an attribute raises. + for mock in (Mock(), MagicMock(), NonCallableMagicMock(), + NonCallableMock()): + mock.foo = 3 + self.assertTrue(hasattr(mock, 'foo')) + self.assertEqual(mock.foo, 3) + + del mock.foo + self.assertFalse(hasattr(mock, 'foo')) + + mock.foo = 4 + self.assertTrue(hasattr(mock, 'foo')) + self.assertEqual(mock.foo, 4) + + del mock.foo + self.assertFalse(hasattr(mock, 'foo')) + + + def test_mock_raises_when_deleting_nonexistent_attribute(self): + for mock in (Mock(), MagicMock(), NonCallableMagicMock(), + NonCallableMock()): + del mock.foo + with self.assertRaises(AttributeError): + del mock.foo + + def test_reset_mock_does_not_raise_on_attr_deletion(self): # bpo-31177: reset_mock should not raise AttributeError when attributes # were deleted in a mock instance diff --git a/Misc/NEWS.d/next/Library/2018-12-09-21-35-49.bpo-20239.V4mWBL.rst b/Misc/NEWS.d/next/Library/2018-12-09-21-35-49.bpo-20239.V4mWBL.rst new file mode 100644 index 000000000000..fe9c69d234cf --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-12-09-21-35-49.bpo-20239.V4mWBL.rst @@ -0,0 +1,2 @@ +Allow repeated assignment deletion of :class:`unittest.mock.Mock` attributes. +Patch by Pablo Galindo. 
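An illustrative sketch of the behaviour fixed above (not part of the commit itself), using only unittest.mock from the standard library; the attribute name foo is arbitrary:

    from unittest import mock

    m = mock.Mock()
    m.foo = 3
    del m.foo                 # first assign/delete cycle
    assert not hasattr(m, 'foo')

    m.foo = 4                 # re-assigning after deletion...
    del m.foo                 # ...and deleting again no longer raises (bpo-20239)

    try:
        del m.foo             # deleting an attribute that is already gone
    except AttributeError:
        print('still raises AttributeError, as before')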
From solipsis at pitrou.net Mon Jan 21 04:10:38 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 21 Jan 2019 09:10:38 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=13 Message-ID: <20190121091038.1.1FA3694864561BC3@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [0, 0, 7] memory blocks, sum=7 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_spawn leaked [2, -1, 1] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogpY2O5v', '--timeout', '7200'] From webhook-mailer at python.org Mon Jan 21 04:24:26 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Mon, 21 Jan 2019 09:24:26 -0000 Subject: [Python-checkins] bpo-35772: Fix test_tarfile on ppc64 (GH-11606) Message-ID: https://github.com/python/cpython/commit/b2385458ceddaf3d0d91456923716259d3915023 commit: b2385458ceddaf3d0d91456923716259d3915023 branch: master author: Victor Stinner committer: GitHub date: 2019-01-21T10:24:12+01:00 summary: bpo-35772: Fix test_tarfile on ppc64 (GH-11606) Fix sparse file tests of test_tarfile on ppc64le with the tmpfs filesystem. Fix the function testing if the filesystem supports sparse files: create a file which contains data and "holes", instead of creating a file which contains no data. tmpfs effective block size is a page size (tmpfs lives in the page cache). RHEL uses 64 KiB pages on aarch64, ppc64 and ppc64le, only s390x and x86_64 use 4 KiB pages, whereas the test punch holes of 4 KiB. test.pythoninfo: Add resource.getpagesize(). files: A Misc/NEWS.d/next/Tests/2019-01-18-12-19-19.bpo-35772.sGBbsn.rst M Lib/test/pythoninfo.py M Lib/test/test_tarfile.py diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 7e94a31cecea..07f3cb3bea42 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -529,6 +529,8 @@ def collect_resource(info_add): value = resource.getrlimit(key) info_add('resource.%s' % name, value) + call_func(info_add, 'resource.pagesize', resource, 'getpagesize') + def collect_test_socket(info_add): try: diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py index 7d2eec8a7ccf..5e4d75ecfce1 100644 --- a/Lib/test/test_tarfile.py +++ b/Lib/test/test_tarfile.py @@ -973,16 +973,21 @@ def test_sparse_file_10(self): def _fs_supports_holes(): # Return True if the platform knows the st_blocks stat attribute and # uses st_blocks units of 512 bytes, and if the filesystem is able to - # store holes in files. + # store holes of 4 KiB in files. + # + # The function returns False if page size is larger than 4 KiB. + # For example, ppc64 uses pages of 64 KiB. if sys.platform.startswith("linux"): # Linux evidentially has 512 byte st_blocks units. name = os.path.join(TEMPDIR, "sparse-test") with open(name, "wb") as fobj: + # Seek to "punch a hole" of 4 KiB fobj.seek(4096) + fobj.write(b'x' * 4096) fobj.truncate() s = os.stat(name) support.unlink(name) - return s.st_blocks == 0 + return (s.st_blocks * 512 < s.st_size) else: return False diff --git a/Misc/NEWS.d/next/Tests/2019-01-18-12-19-19.bpo-35772.sGBbsn.rst b/Misc/NEWS.d/next/Tests/2019-01-18-12-19-19.bpo-35772.sGBbsn.rst new file mode 100644 index 000000000000..cfd282f1d090 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2019-01-18-12-19-19.bpo-35772.sGBbsn.rst @@ -0,0 +1,6 @@ +Fix sparse file tests of test_tarfile on ppc64 with the tmpfs filesystem. 
Fix +the function testing if the filesystem supports sparse files: create a file +which contains data and "holes", instead of creating a file which contains no +data. tmpfs effective block size is a page size (tmpfs lives in the page cache). +RHEL uses 64 KiB pages on aarch64, ppc64, ppc64le, only s390x and x86_64 use 4 +KiB pages, whereas the test punch holes of 4 KiB. From webhook-mailer at python.org Mon Jan 21 04:37:59 2019 From: webhook-mailer at python.org (Chris Withers) Date: Mon, 21 Jan 2019 09:37:59 -0000 Subject: [Python-checkins] bpo-20239: Allow repeated deletion of unittest.mock.Mock attributes (GH-11629) Message-ID: https://github.com/python/cpython/commit/d358a8cda75446a8e0b5d99149f709395d5eae19 commit: d358a8cda75446a8e0b5d99149f709395d5eae19 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Chris Withers date: 2019-01-21T09:37:54Z summary: bpo-20239: Allow repeated deletion of unittest.mock.Mock attributes (GH-11629) * Allow repeated deletion of unittest.mock.Mock attributes * fixup! Allow repeated deletion of unittest.mock.Mock attributes * fixup! fixup! Allow repeated deletion of unittest.mock.Mock attributes (cherry picked from commit 222d303ade8aadf0adcae5190fac603bdcafe3f0) Co-authored-by: Pablo Galindo files: A Misc/NEWS.d/next/Library/2018-12-09-21-35-49.bpo-20239.V4mWBL.rst M Lib/unittest/mock.py M Lib/unittest/test/testmock/testmock.py diff --git a/Lib/unittest/mock.py b/Lib/unittest/mock.py index 955af5d2b85d..42fbc22e748c 100644 --- a/Lib/unittest/mock.py +++ b/Lib/unittest/mock.py @@ -728,11 +728,10 @@ def __delattr__(self, name): # not set on the instance itself return - if name in self.__dict__: - object.__delattr__(self, name) - obj = self._mock_children.get(name, _missing) - if obj is _deleted: + if name in self.__dict__: + super().__delattr__(name) + elif obj is _deleted: raise AttributeError(name) if obj is not _missing: del self._mock_children[name] diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/unittest/test/testmock/testmock.py index 49ecbb446629..cf0ee8fbb234 100644 --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/unittest/test/testmock/testmock.py @@ -1739,6 +1739,33 @@ def test_attribute_deletion(self): self.assertRaises(AttributeError, getattr, mock, 'f') + def test_mock_does_not_raise_on_repeated_attribute_deletion(self): + # bpo-20239: Assigning and deleting twice an attribute raises. 
+ for mock in (Mock(), MagicMock(), NonCallableMagicMock(), + NonCallableMock()): + mock.foo = 3 + self.assertTrue(hasattr(mock, 'foo')) + self.assertEqual(mock.foo, 3) + + del mock.foo + self.assertFalse(hasattr(mock, 'foo')) + + mock.foo = 4 + self.assertTrue(hasattr(mock, 'foo')) + self.assertEqual(mock.foo, 4) + + del mock.foo + self.assertFalse(hasattr(mock, 'foo')) + + + def test_mock_raises_when_deleting_nonexistent_attribute(self): + for mock in (Mock(), MagicMock(), NonCallableMagicMock(), + NonCallableMock()): + del mock.foo + with self.assertRaises(AttributeError): + del mock.foo + + def test_reset_mock_does_not_raise_on_attr_deletion(self): # bpo-31177: reset_mock should not raise AttributeError when attributes # were deleted in a mock instance diff --git a/Misc/NEWS.d/next/Library/2018-12-09-21-35-49.bpo-20239.V4mWBL.rst b/Misc/NEWS.d/next/Library/2018-12-09-21-35-49.bpo-20239.V4mWBL.rst new file mode 100644 index 000000000000..fe9c69d234cf --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-12-09-21-35-49.bpo-20239.V4mWBL.rst @@ -0,0 +1,2 @@ +Allow repeated assignment deletion of :class:`unittest.mock.Mock` attributes. +Patch by Pablo Galindo. From webhook-mailer at python.org Mon Jan 21 04:44:34 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Mon, 21 Jan 2019 09:44:34 -0000 Subject: [Python-checkins] bpo-35772: Fix test_tarfile on ppc64 (GH-11606) Message-ID: https://github.com/python/cpython/commit/d1dd6be613381b996b9071443ef081de8e5f3aff commit: d1dd6be613381b996b9071443ef081de8e5f3aff branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-21T01:44:30-08:00 summary: bpo-35772: Fix test_tarfile on ppc64 (GH-11606) Fix sparse file tests of test_tarfile on ppc64le with the tmpfs filesystem. Fix the function testing if the filesystem supports sparse files: create a file which contains data and "holes", instead of creating a file which contains no data. tmpfs effective block size is a page size (tmpfs lives in the page cache). RHEL uses 64 KiB pages on aarch64, ppc64 and ppc64le, only s390x and x86_64 use 4 KiB pages, whereas the test punch holes of 4 KiB. test.pythoninfo: Add resource.getpagesize(). (cherry picked from commit b2385458ceddaf3d0d91456923716259d3915023) Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Tests/2019-01-18-12-19-19.bpo-35772.sGBbsn.rst M Lib/test/pythoninfo.py M Lib/test/test_tarfile.py diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 085c45d9cc04..94abfdd86ca2 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -516,6 +516,8 @@ def collect_resource(info_add): value = resource.getrlimit(key) info_add('resource.%s' % name, value) + call_func(info_add, 'resource.pagesize', resource, 'getpagesize') + def collect_test_socket(info_add): try: diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py index 7d2eec8a7ccf..5e4d75ecfce1 100644 --- a/Lib/test/test_tarfile.py +++ b/Lib/test/test_tarfile.py @@ -973,16 +973,21 @@ def test_sparse_file_10(self): def _fs_supports_holes(): # Return True if the platform knows the st_blocks stat attribute and # uses st_blocks units of 512 bytes, and if the filesystem is able to - # store holes in files. + # store holes of 4 KiB in files. + # + # The function returns False if page size is larger than 4 KiB. + # For example, ppc64 uses pages of 64 KiB. if sys.platform.startswith("linux"): # Linux evidentially has 512 byte st_blocks units. 
name = os.path.join(TEMPDIR, "sparse-test") with open(name, "wb") as fobj: + # Seek to "punch a hole" of 4 KiB fobj.seek(4096) + fobj.write(b'x' * 4096) fobj.truncate() s = os.stat(name) support.unlink(name) - return s.st_blocks == 0 + return (s.st_blocks * 512 < s.st_size) else: return False diff --git a/Misc/NEWS.d/next/Tests/2019-01-18-12-19-19.bpo-35772.sGBbsn.rst b/Misc/NEWS.d/next/Tests/2019-01-18-12-19-19.bpo-35772.sGBbsn.rst new file mode 100644 index 000000000000..cfd282f1d090 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2019-01-18-12-19-19.bpo-35772.sGBbsn.rst @@ -0,0 +1,6 @@ +Fix sparse file tests of test_tarfile on ppc64 with the tmpfs filesystem. Fix +the function testing if the filesystem supports sparse files: create a file +which contains data and "holes", instead of creating a file which contains no +data. tmpfs effective block size is a page size (tmpfs lives in the page cache). +RHEL uses 64 KiB pages on aarch64, ppc64, ppc64le, only s390x and x86_64 use 4 +KiB pages, whereas the test punch holes of 4 KiB. From webhook-mailer at python.org Mon Jan 21 09:35:50 2019 From: webhook-mailer at python.org (Pablo Galindo) Date: Mon, 21 Jan 2019 14:35:50 -0000 Subject: [Python-checkins] bpo-35794: Catch PermissionError in test_no_such_executable (GH-11635) Message-ID: https://github.com/python/cpython/commit/e9b185f2a493cc54f0d49eac44bf21e8d7de2990 commit: e9b185f2a493cc54f0d49eac44bf21e8d7de2990 branch: master author: Pablo Galindo committer: GitHub date: 2019-01-21T14:35:43Z summary: bpo-35794: Catch PermissionError in test_no_such_executable (GH-11635) PermissionError can be raised if there are directories in the $PATH that are not accessible when using posix_spawnp. files: M Lib/test/test_posix.py diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index 79fc3c264418..165b83713408 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -1522,7 +1522,9 @@ def test_no_such_executable(self): pid = self.spawn_func(no_such_executable, [no_such_executable], os.environ) - except FileNotFoundError as exc: + # bpo-35794: PermissionError can be raised if there are + # directories in the $PATH that are not accessible. + except (FileNotFoundError, PermissionError) as exc: self.assertEqual(exc.filename, no_such_executable) else: pid2, status = os.waitpid(pid, 0) From webhook-mailer at python.org Mon Jan 21 14:20:16 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Mon, 21 Jan 2019 19:20:16 -0000 Subject: [Python-checkins] bpo-35782: Fix error message in randrange (GH-11620) Message-ID: https://github.com/python/cpython/commit/2433a2ab705e93f9a44f01c260d351b205a73e9d commit: 2433a2ab705e93f9a44f01c260d351b205a73e9d branch: master author: Kumar Akshay committer: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> date: 2019-01-21T11:19:59-08:00 summary: bpo-35782: Fix error message in randrange (GH-11620) https://bugs.python.org/issue35782 files: M Lib/random.py diff --git a/Lib/random.py b/Lib/random.py index e00a02623890..8925b52c4730 100644 --- a/Lib/random.py +++ b/Lib/random.py @@ -216,7 +216,7 @@ def randrange(self, start, stop=None, step=1, _int=int): if step == 1 and width > 0: return istart + self._randbelow(width) if step == 1: - raise ValueError("empty range for randrange() (%d,%d, %d)" % (istart, istop, width)) + raise ValueError("empty range for randrange() (%d, %d, %d)" % (istart, istop, width)) # Non-unit step argument supplied. 
istep = _int(step)

From webhook-mailer at python.org  Mon Jan 21 15:49:45 2019
From: webhook-mailer at python.org (Antoine Pitrou)
Date: Mon, 21 Jan 2019 20:49:45 -0000
Subject: [Python-checkins] bpo-35758: Fix building on ARM + MSVC (gh-11531)
Message-ID:

https://github.com/python/cpython/commit/7a2368063f25746d4008a74aca0dc0b82f86ff7b
commit: 7a2368063f25746d4008a74aca0dc0b82f86ff7b
branch: master
author: Minmin Gong
committer: Antoine Pitrou
date: 2019-01-21T21:49:40+01:00
summary:

bpo-35758: Fix building on ARM + MSVC (gh-11531)

* Disable the x87 control word code for non-x86 targets

On MSVC, the x87 control word is only available when targeting x86, so the
code that manipulates it has to be disabled for other targets to avoid
compilation problems.

* Include immintrin.h on x86 and x64 only

immintrin.h is only available on x86 and x64, so the include has to be
guarded out for other targets to avoid compilation problems.

files:
A Misc/NEWS.d/next/Windows/2019-01-21-05-18-14.bpo-35758.8LsY3l.rst
M Include/internal/pycore_atomic.h
M Include/pyport.h

diff --git a/Include/internal/pycore_atomic.h b/Include/internal/pycore_atomic.h
index f430a5c26ff8..5669f71b941f 100644
--- a/Include/internal/pycore_atomic.h
+++ b/Include/internal/pycore_atomic.h
@@ -19,7 +19,9 @@ extern "C" {
 
 #if defined(_MSC_VER)
 #include <intrin.h>
-#include <immintrin.h>
+#if defined(_M_IX86) || defined(_M_X64)
+# include <immintrin.h>
+#endif
 #endif
 
 /* This is modeled after the atomics interface from C1x, according to
diff --git a/Include/pyport.h b/Include/pyport.h
index 7f88c4f629a0..4971a493ccee 100644
--- a/Include/pyport.h
+++ b/Include/pyport.h
@@ -406,7 +406,7 @@ extern "C" {
 #endif
 
 /* get and set x87 control word for VisualStudio/x86 */
-#if defined(_MSC_VER) && !defined(_WIN64) /* x87 not supported in 64-bit */
+#if defined(_MSC_VER) && defined(_M_IX86) /* x87 only supported in x86 */
 #define HAVE_PY_SET_53BIT_PRECISION 1
 #define _Py_SET_53BIT_PRECISION_HEADER \
     unsigned int old_387controlword, new_387controlword, out_387controlword
diff --git a/Misc/NEWS.d/next/Windows/2019-01-21-05-18-14.bpo-35758.8LsY3l.rst b/Misc/NEWS.d/next/Windows/2019-01-21-05-18-14.bpo-35758.8LsY3l.rst
new file mode 100644
index 000000000000..c1e19d465b3c
--- /dev/null
+++ b/Misc/NEWS.d/next/Windows/2019-01-21-05-18-14.bpo-35758.8LsY3l.rst
@@ -0,0 +1 @@
+Allow building on ARM with MSVC.
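
The change above guards architecture-specific MSVC code with the compiler's
predefined target macros. As a minimal illustration (not part of the commit),
the same pattern can be reproduced in a standalone translation unit, assuming
only the documented MSVC macros `_MSC_VER`, `_M_IX86` and `_M_X64`; on other
compilers or targets the x86-only header is simply skipped:

```c
/* Sketch only: pull in the x86/x64-only intrinsics header just for those
 * targets, the way the pycore_atomic.h change above does. */
#include <stdio.h>

#if defined(_MSC_VER) && (defined(_M_IX86) || defined(_M_X64))
#  include <immintrin.h>          /* not available when MSVC targets ARM/ARM64 */
#  define HAVE_X86_INTRINSICS 1
#else
#  define HAVE_X86_INTRINSICS 0
#endif

int main(void)
{
    printf("x86 intrinsics header available: %d\n", HAVE_X86_INTRINSICS);
    return 0;
}
```

The pyport.h hunk applies the same idea: testing `_M_IX86` instead of
`!defined(_WIN64)` keeps the x87 control-word code out of ARM builds as well
as x64 ones.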
From solipsis at pitrou.net  Tue Jan 22 04:08:12 2019
From: solipsis at pitrou.net (solipsis at pitrou.net)
Date: Tue, 22 Jan 2019 09:08:12 +0000
Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=11
Message-ID: <20190122090812.1.965A837DD0FEF158@psf.io>

results for 4243df51fe43 on branch "default"
--------------------------------------------

test_asyncio leaked [0, 3, 0] memory blocks, sum=3
test_collections leaked [7, 0, 0] memory blocks, sum=7
test_functools leaked [0, 3, 1] memory blocks, sum=4
test_multiprocessing_fork leaked [-1, 1, -2] memory blocks, sum=-2
test_multiprocessing_spawn leaked [-2, 2, -1] memory blocks, sum=-1

Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogrGacqD', '--timeout', '7200']

From webhook-mailer at python.org  Tue Jan 22 06:18:28 2019
From: webhook-mailer at python.org (Ivan Levkivskyi)
Date: Tue, 22 Jan 2019 11:18:28 -0000
Subject: [Python-checkins] bpo-33416: Add end positions to Python AST (GH-11605)
Message-ID:

https://github.com/python/cpython/commit/9932a22897ef9905161dac7476e6976370e13515
commit: 9932a22897ef9905161dac7476e6976370e13515
branch: master
author: Ivan Levkivskyi
committer: GitHub
date: 2019-01-22T11:18:22Z
summary:

bpo-33416: Add end positions to Python AST (GH-11605)

The majority of this PR is tediously passing `end_lineno` and `end_col_offset` everywhere. Here are non-trivial points:

* It is not possible to reconstruct end positions in AST "on the fly", some information is lost after an AST node is constructed, so we need two more attributes for every AST node `end_lineno` and `end_col_offset`.

* I add end position information to both CST and AST. Although it may be technically possible to avoid adding end positions to CST, the code becomes more cumbersome and less efficient.

* Since the end position is not known for non-leaf CST nodes while the next token is added, this requires a bit of extra care (see `_PyNode_FinalizeEndPos`). Unless I made some mistake, the algorithm should be linear.

* For statements, I "trim" the end position of suites to not include the terminal newlines and dedent (this seems to be what people would expect), for example in

```python
class C:
    pass

pass
```

the end line and end column for the class definition is (2, 8).

* For `end_col_offset` I use the common Python convention for indexing, for example for `pass` the `end_col_offset` is 4 (not 3), so that `[0:4]` gives one the source code that corresponds to the node.

* I added a helper function `ast.get_source_segment()`, to get source text segment corresponding to a given AST node. It is also useful for testing.

An (inevitable) downside of this PR is that AST now takes almost 25% more memory. I think however it is probably justified by the benefits.

files:
A Misc/NEWS.d/next/Core and Builtins/2019-01-19-19-41-53.bpo-33416.VDeOU5.rst
M Doc/library/ast.rst
M Include/Python-ast.h
M Include/node.h
M Lib/ast.py
M Lib/test/test_asdl_parser.py
M Lib/test/test_ast.py
M Lib/test/test_parser.py
M Modules/parsermodule.c
M Parser/Python.asdl
M Parser/asdl_c.py
M Parser/node.c
M Parser/parser.c
M Parser/parser.h
M Parser/parsetok.c
M Python/Python-ast.c
M Python/ast.c
M Python/ast_opt.c
M Python/compile.c

diff --git a/Doc/library/ast.rst b/Doc/library/ast.rst
index 2883f3c6739f..7715a28ce1b8 100644
--- a/Doc/library/ast.rst
+++ b/Doc/library/ast.rst
@@ -61,13 +61,21 @@ Node classes
 
 ..
attribute:: lineno col_offset + end_lineno + end_col_offset Instances of :class:`ast.expr` and :class:`ast.stmt` subclasses have - :attr:`lineno` and :attr:`col_offset` attributes. The :attr:`lineno` is - the line number of source text (1-indexed so the first line is line 1) and - the :attr:`col_offset` is the UTF-8 byte offset of the first token that - generated the node. The UTF-8 offset is recorded because the parser uses - UTF-8 internally. + :attr:`lineno`, :attr:`col_offset`, :attr:`lineno`, and :attr:`col_offset` + attributes. The :attr:`lineno` and :attr:`end_lineno` are the first and + last line numbers of source text span (1-indexed so the first line is line 1) + and the :attr:`col_offset` and :attr:`end_col_offset` are the corresponding + UTF-8 byte offsets of the first and last tokens that generated the node. + The UTF-8 offset is recorded because the parser uses UTF-8 internally. + + Note that the end positions are not required by the compiler and are + therefore optional. The end offset is *after* the last symbol, for example + one can get the source segment of a one-line expression node using + ``source_line[node.col_offset : node.end_col_offset]``. The constructor of a class :class:`ast.T` parses its arguments as follows: @@ -162,6 +170,18 @@ and classes for traversing abstract syntax trees: :class:`AsyncFunctionDef` is now supported. +.. function:: get_source_segment(source, node, *, padded=False) + + Get source code segment of the *source* that generated *node*. + If some location information (:attr:`lineno`, :attr:`end_lineno`, + :attr:`col_offset`, or :attr:`end_col_offset`) is missing, return ``None``. + + If *padded* is ``True``, the first line of a multi-line statement will + be padded with spaces to match its original position. + + .. versionadded:: 3.8 + + .. function:: fix_missing_locations(node) When you compile a node tree with :func:`compile`, the compiler expects @@ -173,14 +193,16 @@ and classes for traversing abstract syntax trees: .. function:: increment_lineno(node, n=1) - Increment the line number of each node in the tree starting at *node* by *n*. - This is useful to "move code" to a different location in a file. + Increment the line number and end line number of each node in the tree + starting at *node* by *n*. This is useful to "move code" to a different + location in a file. .. function:: copy_location(new_node, old_node) - Copy source location (:attr:`lineno` and :attr:`col_offset`) from *old_node* - to *new_node* if possible, and return *new_node*. + Copy source location (:attr:`lineno`, :attr:`col_offset`, :attr:`end_lineno`, + and :attr:`end_col_offset`) from *old_node* to *new_node* if possible, + and return *new_node*. .. 
function:: iter_fields(node) diff --git a/Include/Python-ast.h b/Include/Python-ast.h index 1a2b8297810c..f8394e6c26ad 100644 --- a/Include/Python-ast.h +++ b/Include/Python-ast.h @@ -210,6 +210,8 @@ struct _stmt { } v; int lineno; int col_offset; + int end_lineno; + int end_col_offset; }; enum _expr_kind {BoolOp_kind=1, BinOp_kind=2, UnaryOp_kind=3, Lambda_kind=4, @@ -353,6 +355,8 @@ struct _expr { } v; int lineno; int col_offset; + int end_lineno; + int end_col_offset; }; enum _slice_kind {Slice_kind=1, ExtSlice_kind=2, Index_kind=3}; @@ -396,6 +400,8 @@ struct _excepthandler { } v; int lineno; int col_offset; + int end_lineno; + int end_col_offset; }; struct _arguments { @@ -412,6 +418,8 @@ struct _arg { expr_ty annotation; int lineno; int col_offset; + int end_lineno; + int end_col_offset; }; struct _keyword { @@ -430,6 +438,7 @@ struct _withitem { }; +// Note: these macros affect function definitions, not only call sites. #define Module(a0, a1) _Py_Module(a0, a1) mod_ty _Py_Module(asdl_seq * body, PyArena *arena); #define Interactive(a0, a1) _Py_Interactive(a0, a1) @@ -438,152 +447,188 @@ mod_ty _Py_Interactive(asdl_seq * body, PyArena *arena); mod_ty _Py_Expression(expr_ty body, PyArena *arena); #define Suite(a0, a1) _Py_Suite(a0, a1) mod_ty _Py_Suite(asdl_seq * body, PyArena *arena); -#define FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7) _Py_FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7) +#define FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) _Py_FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) stmt_ty _Py_FunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq * decorator_list, expr_ty returns, int lineno, - int col_offset, PyArena *arena); -#define AsyncFunctionDef(a0, a1, a2, a3, a4, a5, a6, a7) _Py_AsyncFunctionDef(a0, a1, a2, a3, a4, a5, a6, a7) + int col_offset, int end_lineno, int end_col_offset, + PyArena *arena); +#define AsyncFunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) _Py_AsyncFunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) stmt_ty _Py_AsyncFunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq * decorator_list, expr_ty returns, - int lineno, int col_offset, PyArena *arena); -#define ClassDef(a0, a1, a2, a3, a4, a5, a6, a7) _Py_ClassDef(a0, a1, a2, a3, a4, a5, a6, a7) + int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define ClassDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) _Py_ClassDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) stmt_ty _Py_ClassDef(identifier name, asdl_seq * bases, asdl_seq * keywords, asdl_seq * body, asdl_seq * decorator_list, int lineno, - int col_offset, PyArena *arena); -#define Return(a0, a1, a2, a3) _Py_Return(a0, a1, a2, a3) -stmt_ty _Py_Return(expr_ty value, int lineno, int col_offset, PyArena *arena); -#define Delete(a0, a1, a2, a3) _Py_Delete(a0, a1, a2, a3) -stmt_ty _Py_Delete(asdl_seq * targets, int lineno, int col_offset, PyArena - *arena); -#define Assign(a0, a1, a2, a3, a4) _Py_Assign(a0, a1, a2, a3, a4) + int col_offset, int end_lineno, int end_col_offset, + PyArena *arena); +#define Return(a0, a1, a2, a3, a4, a5) _Py_Return(a0, a1, a2, a3, a4, a5) +stmt_ty _Py_Return(expr_ty value, int lineno, int col_offset, int end_lineno, + int end_col_offset, PyArena *arena); +#define Delete(a0, a1, a2, a3, a4, a5) _Py_Delete(a0, a1, a2, a3, a4, a5) +stmt_ty _Py_Delete(asdl_seq * targets, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define Assign(a0, a1, a2, a3, a4, a5, a6) _Py_Assign(a0, a1, a2, a3, a4, a5, a6) stmt_ty 
_Py_Assign(asdl_seq * targets, expr_ty value, int lineno, int - col_offset, PyArena *arena); -#define AugAssign(a0, a1, a2, a3, a4, a5) _Py_AugAssign(a0, a1, a2, a3, a4, a5) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define AugAssign(a0, a1, a2, a3, a4, a5, a6, a7) _Py_AugAssign(a0, a1, a2, a3, a4, a5, a6, a7) stmt_ty _Py_AugAssign(expr_ty target, operator_ty op, expr_ty value, int - lineno, int col_offset, PyArena *arena); -#define AnnAssign(a0, a1, a2, a3, a4, a5, a6) _Py_AnnAssign(a0, a1, a2, a3, a4, a5, a6) + lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define AnnAssign(a0, a1, a2, a3, a4, a5, a6, a7, a8) _Py_AnnAssign(a0, a1, a2, a3, a4, a5, a6, a7, a8) stmt_ty _Py_AnnAssign(expr_ty target, expr_ty annotation, expr_ty value, int - simple, int lineno, int col_offset, PyArena *arena); -#define For(a0, a1, a2, a3, a4, a5, a6) _Py_For(a0, a1, a2, a3, a4, a5, a6) + simple, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define For(a0, a1, a2, a3, a4, a5, a6, a7, a8) _Py_For(a0, a1, a2, a3, a4, a5, a6, a7, a8) stmt_ty _Py_For(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * - orelse, int lineno, int col_offset, PyArena *arena); -#define AsyncFor(a0, a1, a2, a3, a4, a5, a6) _Py_AsyncFor(a0, a1, a2, a3, a4, a5, a6) + orelse, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define AsyncFor(a0, a1, a2, a3, a4, a5, a6, a7, a8) _Py_AsyncFor(a0, a1, a2, a3, a4, a5, a6, a7, a8) stmt_ty _Py_AsyncFor(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * - orelse, int lineno, int col_offset, PyArena *arena); -#define While(a0, a1, a2, a3, a4, a5) _Py_While(a0, a1, a2, a3, a4, a5) + orelse, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define While(a0, a1, a2, a3, a4, a5, a6, a7) _Py_While(a0, a1, a2, a3, a4, a5, a6, a7) stmt_ty _Py_While(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, - int col_offset, PyArena *arena); -#define If(a0, a1, a2, a3, a4, a5) _Py_If(a0, a1, a2, a3, a4, a5) + int col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define If(a0, a1, a2, a3, a4, a5, a6, a7) _Py_If(a0, a1, a2, a3, a4, a5, a6, a7) stmt_ty _Py_If(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, - int col_offset, PyArena *arena); -#define With(a0, a1, a2, a3, a4) _Py_With(a0, a1, a2, a3, a4) + int col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define With(a0, a1, a2, a3, a4, a5, a6) _Py_With(a0, a1, a2, a3, a4, a5, a6) stmt_ty _Py_With(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, - PyArena *arena); -#define AsyncWith(a0, a1, a2, a3, a4) _Py_AsyncWith(a0, a1, a2, a3, a4) + int end_lineno, int end_col_offset, PyArena *arena); +#define AsyncWith(a0, a1, a2, a3, a4, a5, a6) _Py_AsyncWith(a0, a1, a2, a3, a4, a5, a6) stmt_ty _Py_AsyncWith(asdl_seq * items, asdl_seq * body, int lineno, int - col_offset, PyArena *arena); -#define Raise(a0, a1, a2, a3, a4) _Py_Raise(a0, a1, a2, a3, a4) -stmt_ty _Py_Raise(expr_ty exc, expr_ty cause, int lineno, int col_offset, - PyArena *arena); -#define Try(a0, a1, a2, a3, a4, a5, a6) _Py_Try(a0, a1, a2, a3, a4, a5, a6) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define Raise(a0, a1, a2, a3, a4, a5, a6) _Py_Raise(a0, a1, a2, a3, a4, a5, a6) +stmt_ty _Py_Raise(expr_ty exc, expr_ty cause, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define Try(a0, a1, a2, 
a3, a4, a5, a6, a7, a8) _Py_Try(a0, a1, a2, a3, a4, a5, a6, a7, a8) stmt_ty _Py_Try(asdl_seq * body, asdl_seq * handlers, asdl_seq * orelse, - asdl_seq * finalbody, int lineno, int col_offset, PyArena - *arena); -#define Assert(a0, a1, a2, a3, a4) _Py_Assert(a0, a1, a2, a3, a4) -stmt_ty _Py_Assert(expr_ty test, expr_ty msg, int lineno, int col_offset, - PyArena *arena); -#define Import(a0, a1, a2, a3) _Py_Import(a0, a1, a2, a3) -stmt_ty _Py_Import(asdl_seq * names, int lineno, int col_offset, PyArena - *arena); -#define ImportFrom(a0, a1, a2, a3, a4, a5) _Py_ImportFrom(a0, a1, a2, a3, a4, a5) + asdl_seq * finalbody, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define Assert(a0, a1, a2, a3, a4, a5, a6) _Py_Assert(a0, a1, a2, a3, a4, a5, a6) +stmt_ty _Py_Assert(expr_ty test, expr_ty msg, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define Import(a0, a1, a2, a3, a4, a5) _Py_Import(a0, a1, a2, a3, a4, a5) +stmt_ty _Py_Import(asdl_seq * names, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define ImportFrom(a0, a1, a2, a3, a4, a5, a6, a7) _Py_ImportFrom(a0, a1, a2, a3, a4, a5, a6, a7) stmt_ty _Py_ImportFrom(identifier module, asdl_seq * names, int level, int - lineno, int col_offset, PyArena *arena); -#define Global(a0, a1, a2, a3) _Py_Global(a0, a1, a2, a3) -stmt_ty _Py_Global(asdl_seq * names, int lineno, int col_offset, PyArena - *arena); -#define Nonlocal(a0, a1, a2, a3) _Py_Nonlocal(a0, a1, a2, a3) -stmt_ty _Py_Nonlocal(asdl_seq * names, int lineno, int col_offset, PyArena - *arena); -#define Expr(a0, a1, a2, a3) _Py_Expr(a0, a1, a2, a3) -stmt_ty _Py_Expr(expr_ty value, int lineno, int col_offset, PyArena *arena); -#define Pass(a0, a1, a2) _Py_Pass(a0, a1, a2) -stmt_ty _Py_Pass(int lineno, int col_offset, PyArena *arena); -#define Break(a0, a1, a2) _Py_Break(a0, a1, a2) -stmt_ty _Py_Break(int lineno, int col_offset, PyArena *arena); -#define Continue(a0, a1, a2) _Py_Continue(a0, a1, a2) -stmt_ty _Py_Continue(int lineno, int col_offset, PyArena *arena); -#define BoolOp(a0, a1, a2, a3, a4) _Py_BoolOp(a0, a1, a2, a3, a4) + lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define Global(a0, a1, a2, a3, a4, a5) _Py_Global(a0, a1, a2, a3, a4, a5) +stmt_ty _Py_Global(asdl_seq * names, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define Nonlocal(a0, a1, a2, a3, a4, a5) _Py_Nonlocal(a0, a1, a2, a3, a4, a5) +stmt_ty _Py_Nonlocal(asdl_seq * names, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define Expr(a0, a1, a2, a3, a4, a5) _Py_Expr(a0, a1, a2, a3, a4, a5) +stmt_ty _Py_Expr(expr_ty value, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define Pass(a0, a1, a2, a3, a4) _Py_Pass(a0, a1, a2, a3, a4) +stmt_ty _Py_Pass(int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define Break(a0, a1, a2, a3, a4) _Py_Break(a0, a1, a2, a3, a4) +stmt_ty _Py_Break(int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define Continue(a0, a1, a2, a3, a4) _Py_Continue(a0, a1, a2, a3, a4) +stmt_ty _Py_Continue(int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define BoolOp(a0, a1, a2, a3, a4, a5, a6) _Py_BoolOp(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_BoolOp(boolop_ty op, asdl_seq * values, int lineno, int col_offset, - PyArena *arena); -#define BinOp(a0, a1, a2, 
a3, a4, a5) _Py_BinOp(a0, a1, a2, a3, a4, a5) + int end_lineno, int end_col_offset, PyArena *arena); +#define BinOp(a0, a1, a2, a3, a4, a5, a6, a7) _Py_BinOp(a0, a1, a2, a3, a4, a5, a6, a7) expr_ty _Py_BinOp(expr_ty left, operator_ty op, expr_ty right, int lineno, int - col_offset, PyArena *arena); -#define UnaryOp(a0, a1, a2, a3, a4) _Py_UnaryOp(a0, a1, a2, a3, a4) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define UnaryOp(a0, a1, a2, a3, a4, a5, a6) _Py_UnaryOp(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_UnaryOp(unaryop_ty op, expr_ty operand, int lineno, int col_offset, - PyArena *arena); -#define Lambda(a0, a1, a2, a3, a4) _Py_Lambda(a0, a1, a2, a3, a4) + int end_lineno, int end_col_offset, PyArena *arena); +#define Lambda(a0, a1, a2, a3, a4, a5, a6) _Py_Lambda(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_Lambda(arguments_ty args, expr_ty body, int lineno, int col_offset, - PyArena *arena); -#define IfExp(a0, a1, a2, a3, a4, a5) _Py_IfExp(a0, a1, a2, a3, a4, a5) + int end_lineno, int end_col_offset, PyArena *arena); +#define IfExp(a0, a1, a2, a3, a4, a5, a6, a7) _Py_IfExp(a0, a1, a2, a3, a4, a5, a6, a7) expr_ty _Py_IfExp(expr_ty test, expr_ty body, expr_ty orelse, int lineno, int - col_offset, PyArena *arena); -#define Dict(a0, a1, a2, a3, a4) _Py_Dict(a0, a1, a2, a3, a4) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define Dict(a0, a1, a2, a3, a4, a5, a6) _Py_Dict(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_Dict(asdl_seq * keys, asdl_seq * values, int lineno, int - col_offset, PyArena *arena); -#define Set(a0, a1, a2, a3) _Py_Set(a0, a1, a2, a3) -expr_ty _Py_Set(asdl_seq * elts, int lineno, int col_offset, PyArena *arena); -#define ListComp(a0, a1, a2, a3, a4) _Py_ListComp(a0, a1, a2, a3, a4) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define Set(a0, a1, a2, a3, a4, a5) _Py_Set(a0, a1, a2, a3, a4, a5) +expr_ty _Py_Set(asdl_seq * elts, int lineno, int col_offset, int end_lineno, + int end_col_offset, PyArena *arena); +#define ListComp(a0, a1, a2, a3, a4, a5, a6) _Py_ListComp(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_ListComp(expr_ty elt, asdl_seq * generators, int lineno, int - col_offset, PyArena *arena); -#define SetComp(a0, a1, a2, a3, a4) _Py_SetComp(a0, a1, a2, a3, a4) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define SetComp(a0, a1, a2, a3, a4, a5, a6) _Py_SetComp(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_SetComp(expr_ty elt, asdl_seq * generators, int lineno, int - col_offset, PyArena *arena); -#define DictComp(a0, a1, a2, a3, a4, a5) _Py_DictComp(a0, a1, a2, a3, a4, a5) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define DictComp(a0, a1, a2, a3, a4, a5, a6, a7) _Py_DictComp(a0, a1, a2, a3, a4, a5, a6, a7) expr_ty _Py_DictComp(expr_ty key, expr_ty value, asdl_seq * generators, int - lineno, int col_offset, PyArena *arena); -#define GeneratorExp(a0, a1, a2, a3, a4) _Py_GeneratorExp(a0, a1, a2, a3, a4) + lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define GeneratorExp(a0, a1, a2, a3, a4, a5, a6) _Py_GeneratorExp(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_GeneratorExp(expr_ty elt, asdl_seq * generators, int lineno, int - col_offset, PyArena *arena); -#define Await(a0, a1, a2, a3) _Py_Await(a0, a1, a2, a3) -expr_ty _Py_Await(expr_ty value, int lineno, int col_offset, PyArena *arena); -#define Yield(a0, a1, a2, a3) _Py_Yield(a0, a1, a2, a3) -expr_ty _Py_Yield(expr_ty value, int lineno, int col_offset, PyArena *arena); -#define 
YieldFrom(a0, a1, a2, a3) _Py_YieldFrom(a0, a1, a2, a3) -expr_ty _Py_YieldFrom(expr_ty value, int lineno, int col_offset, PyArena - *arena); -#define Compare(a0, a1, a2, a3, a4, a5) _Py_Compare(a0, a1, a2, a3, a4, a5) + col_offset, int end_lineno, int end_col_offset, + PyArena *arena); +#define Await(a0, a1, a2, a3, a4, a5) _Py_Await(a0, a1, a2, a3, a4, a5) +expr_ty _Py_Await(expr_ty value, int lineno, int col_offset, int end_lineno, + int end_col_offset, PyArena *arena); +#define Yield(a0, a1, a2, a3, a4, a5) _Py_Yield(a0, a1, a2, a3, a4, a5) +expr_ty _Py_Yield(expr_ty value, int lineno, int col_offset, int end_lineno, + int end_col_offset, PyArena *arena); +#define YieldFrom(a0, a1, a2, a3, a4, a5) _Py_YieldFrom(a0, a1, a2, a3, a4, a5) +expr_ty _Py_YieldFrom(expr_ty value, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define Compare(a0, a1, a2, a3, a4, a5, a6, a7) _Py_Compare(a0, a1, a2, a3, a4, a5, a6, a7) expr_ty _Py_Compare(expr_ty left, asdl_int_seq * ops, asdl_seq * comparators, - int lineno, int col_offset, PyArena *arena); -#define Call(a0, a1, a2, a3, a4, a5) _Py_Call(a0, a1, a2, a3, a4, a5) + int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define Call(a0, a1, a2, a3, a4, a5, a6, a7) _Py_Call(a0, a1, a2, a3, a4, a5, a6, a7) expr_ty _Py_Call(expr_ty func, asdl_seq * args, asdl_seq * keywords, int - lineno, int col_offset, PyArena *arena); -#define FormattedValue(a0, a1, a2, a3, a4, a5) _Py_FormattedValue(a0, a1, a2, a3, a4, a5) + lineno, int col_offset, int end_lineno, int end_col_offset, + PyArena *arena); +#define FormattedValue(a0, a1, a2, a3, a4, a5, a6, a7) _Py_FormattedValue(a0, a1, a2, a3, a4, a5, a6, a7) expr_ty _Py_FormattedValue(expr_ty value, int conversion, expr_ty format_spec, - int lineno, int col_offset, PyArena *arena); -#define JoinedStr(a0, a1, a2, a3) _Py_JoinedStr(a0, a1, a2, a3) -expr_ty _Py_JoinedStr(asdl_seq * values, int lineno, int col_offset, PyArena - *arena); -#define Constant(a0, a1, a2, a3) _Py_Constant(a0, a1, a2, a3) -expr_ty _Py_Constant(constant value, int lineno, int col_offset, PyArena - *arena); -#define Attribute(a0, a1, a2, a3, a4, a5) _Py_Attribute(a0, a1, a2, a3, a4, a5) + int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define JoinedStr(a0, a1, a2, a3, a4, a5) _Py_JoinedStr(a0, a1, a2, a3, a4, a5) +expr_ty _Py_JoinedStr(asdl_seq * values, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define Constant(a0, a1, a2, a3, a4, a5) _Py_Constant(a0, a1, a2, a3, a4, a5) +expr_ty _Py_Constant(constant value, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define Attribute(a0, a1, a2, a3, a4, a5, a6, a7) _Py_Attribute(a0, a1, a2, a3, a4, a5, a6, a7) expr_ty _Py_Attribute(expr_ty value, identifier attr, expr_context_ty ctx, int - lineno, int col_offset, PyArena *arena); -#define Subscript(a0, a1, a2, a3, a4, a5) _Py_Subscript(a0, a1, a2, a3, a4, a5) + lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define Subscript(a0, a1, a2, a3, a4, a5, a6, a7) _Py_Subscript(a0, a1, a2, a3, a4, a5, a6, a7) expr_ty _Py_Subscript(expr_ty value, slice_ty slice, expr_context_ty ctx, int - lineno, int col_offset, PyArena *arena); -#define Starred(a0, a1, a2, a3, a4) _Py_Starred(a0, a1, a2, a3, a4) + lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); +#define Starred(a0, a1, a2, a3, a4, a5, a6) _Py_Starred(a0, a1, a2, a3, a4, a5, 
a6) expr_ty _Py_Starred(expr_ty value, expr_context_ty ctx, int lineno, int - col_offset, PyArena *arena); -#define Name(a0, a1, a2, a3, a4) _Py_Name(a0, a1, a2, a3, a4) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define Name(a0, a1, a2, a3, a4, a5, a6) _Py_Name(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_Name(identifier id, expr_context_ty ctx, int lineno, int - col_offset, PyArena *arena); -#define List(a0, a1, a2, a3, a4) _Py_List(a0, a1, a2, a3, a4) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define List(a0, a1, a2, a3, a4, a5, a6) _Py_List(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_List(asdl_seq * elts, expr_context_ty ctx, int lineno, int - col_offset, PyArena *arena); -#define Tuple(a0, a1, a2, a3, a4) _Py_Tuple(a0, a1, a2, a3, a4) + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); +#define Tuple(a0, a1, a2, a3, a4, a5, a6) _Py_Tuple(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_Tuple(asdl_seq * elts, expr_context_ty ctx, int lineno, int - col_offset, PyArena *arena); + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); #define Slice(a0, a1, a2, a3) _Py_Slice(a0, a1, a2, a3) slice_ty _Py_Slice(expr_ty lower, expr_ty upper, expr_ty step, PyArena *arena); #define ExtSlice(a0, a1) _Py_ExtSlice(a0, a1) @@ -593,17 +638,18 @@ slice_ty _Py_Index(expr_ty value, PyArena *arena); #define comprehension(a0, a1, a2, a3, a4) _Py_comprehension(a0, a1, a2, a3, a4) comprehension_ty _Py_comprehension(expr_ty target, expr_ty iter, asdl_seq * ifs, int is_async, PyArena *arena); -#define ExceptHandler(a0, a1, a2, a3, a4, a5) _Py_ExceptHandler(a0, a1, a2, a3, a4, a5) +#define ExceptHandler(a0, a1, a2, a3, a4, a5, a6, a7) _Py_ExceptHandler(a0, a1, a2, a3, a4, a5, a6, a7) excepthandler_ty _Py_ExceptHandler(expr_ty type, identifier name, asdl_seq * - body, int lineno, int col_offset, PyArena + body, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); #define arguments(a0, a1, a2, a3, a4, a5, a6) _Py_arguments(a0, a1, a2, a3, a4, a5, a6) arguments_ty _Py_arguments(asdl_seq * args, arg_ty vararg, asdl_seq * kwonlyargs, asdl_seq * kw_defaults, arg_ty kwarg, asdl_seq * defaults, PyArena *arena); -#define arg(a0, a1, a2, a3, a4) _Py_arg(a0, a1, a2, a3, a4) +#define arg(a0, a1, a2, a3, a4, a5, a6) _Py_arg(a0, a1, a2, a3, a4, a5, a6) arg_ty _Py_arg(identifier arg, expr_ty annotation, int lineno, int col_offset, - PyArena *arena); + int end_lineno, int end_col_offset, PyArena *arena); #define keyword(a0, a1, a2) _Py_keyword(a0, a1, a2) keyword_ty _Py_keyword(identifier arg, expr_ty value, PyArena *arena); #define alias(a0, a1, a2) _Py_alias(a0, a1, a2) diff --git a/Include/node.h b/Include/node.h index 40596dfecf18..2b3907409738 100644 --- a/Include/node.h +++ b/Include/node.h @@ -14,11 +14,14 @@ typedef struct _node { int n_col_offset; int n_nchildren; struct _node *n_child; + int n_end_lineno; + int n_end_col_offset; } node; PyAPI_FUNC(node *) PyNode_New(int type); PyAPI_FUNC(int) PyNode_AddChild(node *n, int type, - char *str, int lineno, int col_offset); + char *str, int lineno, int col_offset, + int end_lineno, int end_col_offset); PyAPI_FUNC(void) PyNode_Free(node *n); #ifndef Py_LIMITED_API PyAPI_FUNC(Py_ssize_t) _PyNode_SizeOf(node *n); @@ -37,6 +40,7 @@ PyAPI_FUNC(Py_ssize_t) _PyNode_SizeOf(node *n); #define REQ(n, type) assert(TYPE(n) == (type)) PyAPI_FUNC(void) PyNode_ListTree(node *); +void _PyNode_FinalizeEndPos(node *n); // helper also used in parsetok.c #ifdef __cplusplus } diff --git 
a/Lib/ast.py b/Lib/ast.py index 03b8a1b16b7a..6c1e978b0586 100644 --- a/Lib/ast.py +++ b/Lib/ast.py @@ -115,10 +115,10 @@ def _format(node): def copy_location(new_node, old_node): """ - Copy source location (`lineno` and `col_offset` attributes) from - *old_node* to *new_node* if possible, and return *new_node*. + Copy source location (`lineno`, `col_offset`, `end_lineno`, and `end_col_offset` + attributes) from *old_node* to *new_node* if possible, and return *new_node*. """ - for attr in 'lineno', 'col_offset': + for attr in 'lineno', 'col_offset', 'end_lineno', 'end_col_offset': if attr in old_node._attributes and attr in new_node._attributes \ and hasattr(old_node, attr): setattr(new_node, attr, getattr(old_node, attr)) @@ -133,31 +133,44 @@ def fix_missing_locations(node): recursively where not already set, by setting them to the values of the parent node. It works recursively starting at *node*. """ - def _fix(node, lineno, col_offset): + def _fix(node, lineno, col_offset, end_lineno, end_col_offset): if 'lineno' in node._attributes: if not hasattr(node, 'lineno'): node.lineno = lineno else: lineno = node.lineno + if 'end_lineno' in node._attributes: + if not hasattr(node, 'end_lineno'): + node.end_lineno = end_lineno + else: + end_lineno = node.end_lineno if 'col_offset' in node._attributes: if not hasattr(node, 'col_offset'): node.col_offset = col_offset else: col_offset = node.col_offset + if 'end_col_offset' in node._attributes: + if not hasattr(node, 'end_col_offset'): + node.end_col_offset = end_col_offset + else: + end_col_offset = node.end_col_offset for child in iter_child_nodes(node): - _fix(child, lineno, col_offset) - _fix(node, 1, 0) + _fix(child, lineno, col_offset, end_lineno, end_col_offset) + _fix(node, 1, 0, 1, 0) return node def increment_lineno(node, n=1): """ - Increment the line number of each node in the tree starting at *node* by *n*. - This is useful to "move code" to a different location in a file. + Increment the line number and end line number of each node in the tree + starting at *node* by *n*. This is useful to "move code" to a different + location in a file. """ for child in walk(node): if 'lineno' in child._attributes: child.lineno = getattr(child, 'lineno', 0) + n + if 'end_lineno' in child._attributes: + child.end_lineno = getattr(child, 'end_lineno', 0) + n return node @@ -213,6 +226,77 @@ def get_docstring(node, clean=True): return text +def _splitlines_no_ff(source): + """Split a string into lines ignoring form feed and other chars. + + This mimics how the Python parser splits source code. + """ + idx = 0 + lines = [] + next_line = '' + while idx < len(source): + c = source[idx] + next_line += c + idx += 1 + # Keep \r\n together + if c == '\r' and idx < len(source) and source[idx] == '\n': + next_line += '\n' + idx += 1 + if c in '\r\n': + lines.append(next_line) + next_line = '' + + if next_line: + lines.append(next_line) + return lines + + +def _pad_whitespace(source): + """Replace all chars except '\f\t' in a line with spaces.""" + result = '' + for c in source: + if c in '\f\t': + result += c + else: + result += ' ' + return result + + +def get_source_segment(source, node, *, padded=False): + """Get source code segment of the *source* that generated *node*. + + If some location information (`lineno`, `end_lineno`, `col_offset`, + or `end_col_offset`) is missing, return None. + + If *padded* is `True`, the first line of a multi-line statement will + be padded with spaces to match its original position. 
+ """ + try: + lineno = node.lineno - 1 + end_lineno = node.end_lineno - 1 + col_offset = node.col_offset + end_col_offset = node.end_col_offset + except AttributeError: + return None + + lines = _splitlines_no_ff(source) + if end_lineno == lineno: + return lines[lineno].encode()[col_offset:end_col_offset].decode() + + if padded: + padding = _pad_whitespace(lines[lineno].encode()[:col_offset].decode()) + else: + padding = '' + + first = padding + lines[lineno].encode()[col_offset:].decode() + last = lines[end_lineno].encode()[:end_col_offset].decode() + lines = lines[lineno+1:end_lineno] + + lines.insert(0, first) + lines.append(last) + return ''.join(lines) + + def walk(node): """ Recursively yield all descendant nodes in the tree starting at *node* diff --git a/Lib/test/test_asdl_parser.py b/Lib/test/test_asdl_parser.py index 15bc684904c9..30e6466dbb32 100644 --- a/Lib/test/test_asdl_parser.py +++ b/Lib/test/test_asdl_parser.py @@ -62,14 +62,16 @@ def test_product(self): def test_attributes(self): stmt = self.types['stmt'] - self.assertEqual(len(stmt.attributes), 2) + self.assertEqual(len(stmt.attributes), 4) self.assertEqual(str(stmt.attributes[0]), 'Field(int, lineno)') self.assertEqual(str(stmt.attributes[1]), 'Field(int, col_offset)') + self.assertEqual(str(stmt.attributes[2]), 'Field(int, end_lineno, opt=True)') + self.assertEqual(str(stmt.attributes[3]), 'Field(int, end_col_offset, opt=True)') def test_constructor_fields(self): ehandler = self.types['excepthandler'] self.assertEqual(len(ehandler.types), 1) - self.assertEqual(len(ehandler.attributes), 2) + self.assertEqual(len(ehandler.attributes), 4) cons = ehandler.types[0] self.assertIsInstance(cons, self.asdl.Constructor) diff --git a/Lib/test/test_ast.py b/Lib/test/test_ast.py index 4bf77ff046e1..09e425de2c30 100644 --- a/Lib/test/test_ast.py +++ b/Lib/test/test_ast.py @@ -4,6 +4,7 @@ import sys import unittest import weakref +from textwrap import dedent from test import support @@ -582,19 +583,22 @@ def test_dump(self): ) self.assertEqual(ast.dump(node, include_attributes=True), "Module(body=[Expr(value=Call(func=Name(id='spam', ctx=Load(), " - "lineno=1, col_offset=0), args=[Name(id='eggs', ctx=Load(), " - "lineno=1, col_offset=5), Constant(value='and cheese', lineno=1, " - "col_offset=11)], keywords=[], " - "lineno=1, col_offset=0), lineno=1, col_offset=0)])" + "lineno=1, col_offset=0, end_lineno=1, end_col_offset=4), " + "args=[Name(id='eggs', ctx=Load(), lineno=1, col_offset=5, " + "end_lineno=1, end_col_offset=9), Constant(value='and cheese', " + "lineno=1, col_offset=11, end_lineno=1, end_col_offset=23)], keywords=[], " + "lineno=1, col_offset=0, end_lineno=1, end_col_offset=24), " + "lineno=1, col_offset=0, end_lineno=1, end_col_offset=24)])" ) def test_copy_location(self): src = ast.parse('1 + 1', mode='eval') src.body.right = ast.copy_location(ast.Num(2), src.body.right) self.assertEqual(ast.dump(src, include_attributes=True), - 'Expression(body=BinOp(left=Constant(value=1, lineno=1, col_offset=0), ' - 'op=Add(), right=Constant(value=2, lineno=1, col_offset=4), lineno=1, ' - 'col_offset=0))' + 'Expression(body=BinOp(left=Constant(value=1, lineno=1, col_offset=0, ' + 'end_lineno=1, end_col_offset=1), op=Add(), right=Constant(value=2, ' + 'lineno=1, col_offset=4, end_lineno=1, end_col_offset=5), lineno=1, ' + 'col_offset=0, end_lineno=1, end_col_offset=5))' ) def test_fix_missing_locations(self): @@ -602,32 +606,37 @@ def test_fix_missing_locations(self): src.body.append(ast.Expr(ast.Call(ast.Name('spam', ast.Load()), 
[ast.Str('eggs')], []))) self.assertEqual(src, ast.fix_missing_locations(src)) + self.maxDiff = None self.assertEqual(ast.dump(src, include_attributes=True), "Module(body=[Expr(value=Call(func=Name(id='write', ctx=Load(), " - "lineno=1, col_offset=0), args=[Constant(value='spam', lineno=1, " - "col_offset=6)], keywords=[], " - "lineno=1, col_offset=0), lineno=1, col_offset=0), " - "Expr(value=Call(func=Name(id='spam', ctx=Load(), lineno=1, " - "col_offset=0), args=[Constant(value='eggs', lineno=1, col_offset=0)], " - "keywords=[], lineno=1, " - "col_offset=0), lineno=1, col_offset=0)])" + "lineno=1, col_offset=0, end_lineno=1, end_col_offset=5), " + "args=[Constant(value='spam', lineno=1, col_offset=6, end_lineno=1, " + "end_col_offset=12)], keywords=[], lineno=1, col_offset=0, end_lineno=1, " + "end_col_offset=13), lineno=1, col_offset=0, end_lineno=1, " + "end_col_offset=13), Expr(value=Call(func=Name(id='spam', ctx=Load(), " + "lineno=1, col_offset=0, end_lineno=1, end_col_offset=0), " + "args=[Constant(value='eggs', lineno=1, col_offset=0, end_lineno=1, " + "end_col_offset=0)], keywords=[], lineno=1, col_offset=0, end_lineno=1, " + "end_col_offset=0), lineno=1, col_offset=0, end_lineno=1, end_col_offset=0)])" ) def test_increment_lineno(self): src = ast.parse('1 + 1', mode='eval') self.assertEqual(ast.increment_lineno(src, n=3), src) self.assertEqual(ast.dump(src, include_attributes=True), - 'Expression(body=BinOp(left=Constant(value=1, lineno=4, col_offset=0), ' - 'op=Add(), right=Constant(value=1, lineno=4, col_offset=4), lineno=4, ' - 'col_offset=0))' + 'Expression(body=BinOp(left=Constant(value=1, lineno=4, col_offset=0, ' + 'end_lineno=4, end_col_offset=1), op=Add(), right=Constant(value=1, ' + 'lineno=4, col_offset=4, end_lineno=4, end_col_offset=5), lineno=4, ' + 'col_offset=0, end_lineno=4, end_col_offset=5))' ) # issue10869: do not increment lineno of root twice src = ast.parse('1 + 1', mode='eval') self.assertEqual(ast.increment_lineno(src.body, n=3), src.body) self.assertEqual(ast.dump(src, include_attributes=True), - 'Expression(body=BinOp(left=Constant(value=1, lineno=4, col_offset=0), ' - 'op=Add(), right=Constant(value=1, lineno=4, col_offset=4), lineno=4, ' - 'col_offset=0))' + 'Expression(body=BinOp(left=Constant(value=1, lineno=4, col_offset=0, ' + 'end_lineno=4, end_col_offset=1), op=Add(), right=Constant(value=1, ' + 'lineno=4, col_offset=4, end_lineno=4, end_col_offset=5), lineno=4, ' + 'col_offset=0, end_lineno=4, end_col_offset=5))' ) def test_iter_fields(self): @@ -1274,6 +1283,311 @@ def test_literal_eval(self): self.assertEqual(ast.literal_eval(binop), 10+20j) +class EndPositionTests(unittest.TestCase): + """Tests for end position of AST nodes. + + Testing end positions of nodes requires a bit of extra care + because of how LL parsers work. + """ + def _check_end_pos(self, ast_node, end_lineno, end_col_offset): + self.assertEqual(ast_node.end_lineno, end_lineno) + self.assertEqual(ast_node.end_col_offset, end_col_offset) + + def _check_content(self, source, ast_node, content): + self.assertEqual(ast.get_source_segment(source, ast_node), content) + + def _parse_value(self, s): + # Use duck-typing to support both single expression + # and a right hand side of an assignment statement. 
+ return ast.parse(s).body[0].value + + def test_lambda(self): + s = 'lambda x, *y: None' + lam = self._parse_value(s) + self._check_content(s, lam.body, 'None') + self._check_content(s, lam.args.args[0], 'x') + self._check_content(s, lam.args.vararg, 'y') + + def test_func_def(self): + s = dedent(''' + def func(x: int, + *args: str, + z: float = 0, + **kwargs: Any) -> bool: + return True + ''').strip() + fdef = ast.parse(s).body[0] + self._check_end_pos(fdef, 5, 15) + self._check_content(s, fdef.body[0], 'return True') + self._check_content(s, fdef.args.args[0], 'x: int') + self._check_content(s, fdef.args.args[0].annotation, 'int') + self._check_content(s, fdef.args.kwarg, 'kwargs: Any') + self._check_content(s, fdef.args.kwarg.annotation, 'Any') + + def test_call(self): + s = 'func(x, y=2, **kw)' + call = self._parse_value(s) + self._check_content(s, call.func, 'func') + self._check_content(s, call.keywords[0].value, '2') + self._check_content(s, call.keywords[1].value, 'kw') + + def test_call_noargs(self): + s = 'x[0]()' + call = self._parse_value(s) + self._check_content(s, call.func, 'x[0]') + self._check_end_pos(call, 1, 6) + + def test_class_def(self): + s = dedent(''' + class C(A, B): + x: int = 0 + ''').strip() + cdef = ast.parse(s).body[0] + self._check_end_pos(cdef, 2, 14) + self._check_content(s, cdef.bases[1], 'B') + self._check_content(s, cdef.body[0], 'x: int = 0') + + def test_class_kw(self): + s = 'class S(metaclass=abc.ABCMeta): pass' + cdef = ast.parse(s).body[0] + self._check_content(s, cdef.keywords[0].value, 'abc.ABCMeta') + + def test_multi_line_str(self): + s = dedent(''' + x = """Some multi-line text. + + It goes on starting from same indent.""" + ''').strip() + assign = ast.parse(s).body[0] + self._check_end_pos(assign, 3, 40) + self._check_end_pos(assign.value, 3, 40) + + def test_continued_str(self): + s = dedent(''' + x = "first part" \\ + "second part" + ''').strip() + assign = ast.parse(s).body[0] + self._check_end_pos(assign, 2, 13) + self._check_end_pos(assign.value, 2, 13) + + def test_suites(self): + # We intentionally put these into the same string to check + # that empty lines are not part of the suite. + s = dedent(''' + while True: + pass + + if one(): + x = None + elif other(): + y = None + else: + z = None + + for x, y in stuff: + assert True + + try: + raise RuntimeError + except TypeError as e: + pass + + pass + ''').strip() + mod = ast.parse(s) + while_loop = mod.body[0] + if_stmt = mod.body[1] + for_loop = mod.body[2] + try_stmt = mod.body[3] + pass_stmt = mod.body[4] + + self._check_end_pos(while_loop, 2, 8) + self._check_end_pos(if_stmt, 9, 12) + self._check_end_pos(for_loop, 12, 15) + self._check_end_pos(try_stmt, 17, 8) + self._check_end_pos(pass_stmt, 19, 4) + + self._check_content(s, while_loop.test, 'True') + self._check_content(s, if_stmt.body[0], 'x = None') + self._check_content(s, if_stmt.orelse[0].test, 'other()') + self._check_content(s, for_loop.target, 'x, y') + self._check_content(s, try_stmt.body[0], 'raise RuntimeError') + self._check_content(s, try_stmt.handlers[0].type, 'TypeError') + + def test_fstring(self): + s = 'x = f"abc {x + y} abc"' + fstr = self._parse_value(s) + binop = fstr.values[1].value + self._check_content(s, binop, 'x + y') + + def test_fstring_multi_line(self): + s = dedent(''' + f"""Some multi-line text. 
+ { + arg_one + + + arg_two + } + It goes on...""" + ''').strip() + fstr = self._parse_value(s) + binop = fstr.values[1].value + self._check_end_pos(binop, 5, 7) + self._check_content(s, binop.left, 'arg_one') + self._check_content(s, binop.right, 'arg_two') + + def test_import_from_multi_line(self): + s = dedent(''' + from x.y.z import ( + a, b, c as c + ) + ''').strip() + imp = ast.parse(s).body[0] + self._check_end_pos(imp, 3, 1) + + def test_slices(self): + s1 = 'f()[1, 2] [0]' + s2 = 'x[ a.b: c.d]' + sm = dedent(''' + x[ a.b: f () , + g () : c.d + ] + ''').strip() + i1, i2, im = map(self._parse_value, (s1, s2, sm)) + self._check_content(s1, i1.value, 'f()[1, 2]') + self._check_content(s1, i1.value.slice.value, '1, 2') + self._check_content(s2, i2.slice.lower, 'a.b') + self._check_content(s2, i2.slice.upper, 'c.d') + self._check_content(sm, im.slice.dims[0].upper, 'f ()') + self._check_content(sm, im.slice.dims[1].lower, 'g ()') + self._check_end_pos(im, 3, 3) + + def test_binop(self): + s = dedent(''' + (1 * 2 + (3 ) + + 4 + ) + ''').strip() + binop = self._parse_value(s) + self._check_end_pos(binop, 2, 6) + self._check_content(s, binop.right, '4') + self._check_content(s, binop.left, '1 * 2 + (3 )') + self._check_content(s, binop.left.right, '3') + + def test_boolop(self): + s = dedent(''' + if (one_condition and + (other_condition or yet_another_one)): + pass + ''').strip() + bop = ast.parse(s).body[0].test + self._check_end_pos(bop, 2, 44) + self._check_content(s, bop.values[1], + 'other_condition or yet_another_one') + + def test_tuples(self): + s1 = 'x = () ;' + s2 = 'x = 1 , ;' + s3 = 'x = (1 , 2 ) ;' + sm = dedent(''' + x = ( + a, b, + ) + ''').strip() + t1, t2, t3, tm = map(self._parse_value, (s1, s2, s3, sm)) + self._check_content(s1, t1, '()') + self._check_content(s2, t2, '1 ,') + self._check_content(s3, t3, '(1 , 2 )') + self._check_end_pos(tm, 3, 1) + + def test_attribute_spaces(self): + s = 'func(x. y .z)' + call = self._parse_value(s) + self._check_content(s, call, s) + self._check_content(s, call.args[0], 'x. y .z') + + def test_displays(self): + s1 = '[{}, {1, }, {1, 2,} ]' + s2 = '{a: b, f (): g () ,}' + c1 = self._parse_value(s1) + c2 = self._parse_value(s2) + self._check_content(s1, c1.elts[0], '{}') + self._check_content(s1, c1.elts[1], '{1, }') + self._check_content(s1, c1.elts[2], '{1, 2,}') + self._check_content(s2, c2.keys[1], 'f ()') + self._check_content(s2, c2.values[1], 'g ()') + + def test_comprehensions(self): + s = dedent(''' + x = [{x for x, y in stuff + if cond.x} for stuff in things] + ''').strip() + cmp = self._parse_value(s) + self._check_end_pos(cmp, 2, 37) + self._check_content(s, cmp.generators[0].iter, 'things') + self._check_content(s, cmp.elt.generators[0].iter, 'stuff') + self._check_content(s, cmp.elt.generators[0].ifs[0], 'cond.x') + self._check_content(s, cmp.elt.generators[0].target, 'x, y') + + def test_yield_await(self): + s = dedent(''' + async def f(): + yield x + await y + ''').strip() + fdef = ast.parse(s).body[0] + self._check_content(s, fdef.body[0].value, 'yield x') + self._check_content(s, fdef.body[1].value, 'await y') + + def test_source_segment_multi(self): + s_orig = dedent(''' + x = ( + a, b, + ) + () + ''').strip() + s_tuple = dedent(''' + ( + a, b, + ) + ''').strip() + binop = self._parse_value(s_orig) + self.assertEqual(ast.get_source_segment(s_orig, binop.left), s_tuple) + + def test_source_segment_padded(self): + s_orig = dedent(''' + class C: + def fun(self) -> None: + "?????" 
+ ''').strip() + s_method = ' def fun(self) -> None:\n' \ + ' "?????"' + cdef = ast.parse(s_orig).body[0] + self.assertEqual(ast.get_source_segment(s_orig, cdef.body[0], padded=True), + s_method) + + def test_source_segment_endings(self): + s = 'v = 1\r\nw = 1\nx = 1\n\ry = 1\rz = 1\r\n' + v, w, x, y, z = ast.parse(s).body + self._check_content(s, v, 'v = 1') + self._check_content(s, w, 'w = 1') + self._check_content(s, x, 'x = 1') + self._check_content(s, y, 'y = 1') + self._check_content(s, z, 'z = 1') + + def test_source_segment_tabs(self): + s = dedent(''' + class C: + \t\f def fun(self) -> None: + \t\f pass + ''').strip() + s_method = ' \t\f def fun(self) -> None:\n' \ + ' \t\f pass' + + cdef = ast.parse(s).body[0] + self.assertEqual(ast.get_source_segment(s, cdef.body[0], padded=True), s_method) + + def main(): if __name__ != '__main__': return diff --git a/Lib/test/test_parser.py b/Lib/test/test_parser.py index 274e26061a19..9b58bb93c81c 100644 --- a/Lib/test/test_parser.py +++ b/Lib/test/test_parser.py @@ -880,7 +880,7 @@ def XXXROUNDUP(n): return 1 << (n - 1).bit_length() basesize = support.calcobjsize('Pii') - nodesize = struct.calcsize('hP3iP0h') + nodesize = struct.calcsize('hP3iP0h2i') def sizeofchildren(node): if node is None: return 0 diff --git a/Misc/NEWS.d/next/Core and Builtins/2019-01-19-19-41-53.bpo-33416.VDeOU5.rst b/Misc/NEWS.d/next/Core and Builtins/2019-01-19-19-41-53.bpo-33416.VDeOU5.rst new file mode 100644 index 000000000000..2e618e4f0e1f --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2019-01-19-19-41-53.bpo-33416.VDeOU5.rst @@ -0,0 +1,2 @@ +Add end line and end column position information to the Python AST nodes. +This is a C-level backwards incompatible change. \ No newline at end of file diff --git a/Modules/parsermodule.c b/Modules/parsermodule.c index 8f88657c00b6..eabc5c810f6c 100644 --- a/Modules/parsermodule.c +++ b/Modules/parsermodule.c @@ -920,7 +920,7 @@ build_node_children(PyObject *tuple, node *root, int *line_num) Py_DECREF(elem); return NULL; } - err = PyNode_AddChild(root, type, strn, *line_num, 0); + err = PyNode_AddChild(root, type, strn, *line_num, 0, *line_num, 0); if (err == E_NOMEM) { Py_DECREF(elem); PyObject_FREE(strn); diff --git a/Parser/Python.asdl b/Parser/Python.asdl index eee982be1c95..cedf37a2d9f9 100644 --- a/Parser/Python.asdl +++ b/Parser/Python.asdl @@ -50,7 +50,7 @@ module Python -- XXX Jython will be different -- col_offset is the byte offset in the utf8 string the parser uses - attributes (int lineno, int col_offset) + attributes (int lineno, int col_offset, int? end_lineno, int? end_col_offset) -- BoolOp() can use left & right? expr = BoolOp(boolop op, expr* values) @@ -85,7 +85,7 @@ module Python | Tuple(expr* elts, expr_context ctx) -- col_offset is the byte offset in the utf8 string the parser uses - attributes (int lineno, int col_offset) + attributes (int lineno, int col_offset, int? end_lineno, int? end_col_offset) expr_context = Load | Store | Del | AugLoad | AugStore | Param @@ -105,13 +105,13 @@ module Python comprehension = (expr target, expr iter, expr* ifs, int is_async) excepthandler = ExceptHandler(expr? type, identifier? name, stmt* body) - attributes (int lineno, int col_offset) + attributes (int lineno, int col_offset, int? end_lineno, int? end_col_offset) arguments = (arg* args, arg? vararg, arg* kwonlyargs, expr* kw_defaults, arg? kwarg, expr* defaults) arg = (identifier arg, expr? annotation) - attributes (int lineno, int col_offset) + attributes (int lineno, int col_offset, int? 
end_lineno, int? end_col_offset) -- keyword arguments supplied to call (NULL identifier for **kwargs) keyword = (identifier? arg, expr value) diff --git a/Parser/asdl_c.py b/Parser/asdl_c.py index 75fb78b9c94e..8640b29b8f10 100644 --- a/Parser/asdl_c.py +++ b/Parser/asdl_c.py @@ -1250,10 +1250,12 @@ def main(srcfile, dump_module=False): f.write('#undef Yield /* undefine macro conflicting with */\n') f.write('\n') c = ChainOfVisitors(TypeDefVisitor(f), - StructVisitor(f), - PrototypeVisitor(f), - ) + StructVisitor(f)) + c.visit(mod) + f.write("// Note: these macros affect function definitions, not only call sites.\n") + PrototypeVisitor(f).visit(mod) + f.write("\n") f.write("PyObject* PyAST_mod2obj(mod_ty t);\n") f.write("mod_ty PyAST_obj2mod(PyObject* ast, PyArena* arena, int mode);\n") f.write("int PyAST_Check(PyObject* obj);\n") diff --git a/Parser/node.c b/Parser/node.c index 240d29057c4e..f1b70e0f6815 100644 --- a/Parser/node.c +++ b/Parser/node.c @@ -13,6 +13,8 @@ PyNode_New(int type) n->n_type = type; n->n_str = NULL; n->n_lineno = 0; + n->n_end_lineno = 0; + n->n_end_col_offset = -1; n->n_nchildren = 0; n->n_child = NULL; return n; @@ -75,14 +77,34 @@ fancy_roundup(int n) fancy_roundup(n)) +void +_PyNode_FinalizeEndPos(node *n) +{ + int nch = NCH(n); + node *last; + if (nch == 0) { + return; + } + last = CHILD(n, nch - 1); + _PyNode_FinalizeEndPos(last); + n->n_end_lineno = last->n_end_lineno; + n->n_end_col_offset = last->n_end_col_offset; +} + int -PyNode_AddChild(node *n1, int type, char *str, int lineno, int col_offset) +PyNode_AddChild(node *n1, int type, char *str, int lineno, int col_offset, + int end_lineno, int end_col_offset) { const int nch = n1->n_nchildren; int current_capacity; int required_capacity; node *n; + // finalize end position of previous node (if any) + if (nch > 0) { + _PyNode_FinalizeEndPos(CHILD(n1, nch - 1)); + } + if (nch == INT_MAX || nch < 0) return E_OVERFLOW; @@ -107,6 +129,8 @@ PyNode_AddChild(node *n1, int type, char *str, int lineno, int col_offset) n->n_str = str; n->n_lineno = lineno; n->n_col_offset = col_offset; + n->n_end_lineno = end_lineno; // this and below will be updates after all children are added. 
+ n->n_end_col_offset = end_col_offset; n->n_nchildren = 0; n->n_child = NULL; return 0; diff --git a/Parser/parser.c b/Parser/parser.c index 41072c478c26..a9916d392aab 100644 --- a/Parser/parser.c +++ b/Parser/parser.c @@ -105,11 +105,13 @@ PyParser_Delete(parser_state *ps) /* PARSER STACK OPERATIONS */ static int -shift(stack *s, int type, char *str, int newstate, int lineno, int col_offset) +shift(stack *s, int type, char *str, int newstate, int lineno, int col_offset, + int end_lineno, int end_col_offset) { int err; assert(!s_empty(s)); - err = PyNode_AddChild(s->s_top->s_parent, type, str, lineno, col_offset); + err = PyNode_AddChild(s->s_top->s_parent, type, str, lineno, col_offset, + end_lineno, end_col_offset); if (err) return err; s->s_top->s_state = newstate; @@ -117,13 +119,15 @@ shift(stack *s, int type, char *str, int newstate, int lineno, int col_offset) } static int -push(stack *s, int type, dfa *d, int newstate, int lineno, int col_offset) +push(stack *s, int type, dfa *d, int newstate, int lineno, int col_offset, + int end_lineno, int end_col_offset) { int err; node *n; n = s->s_top->s_parent; assert(!s_empty(s)); - err = PyNode_AddChild(n, type, (char *)NULL, lineno, col_offset); + err = PyNode_AddChild(n, type, (char *)NULL, lineno, col_offset, + end_lineno, end_col_offset); if (err) return err; s->s_top->s_state = newstate; @@ -225,7 +229,9 @@ future_hack(parser_state *ps) int PyParser_AddToken(parser_state *ps, int type, char *str, - int lineno, int col_offset, int *expected_ret) + int lineno, int col_offset, + int end_lineno, int end_col_offset, + int *expected_ret) { int ilabel; int err; @@ -257,7 +263,8 @@ PyParser_AddToken(parser_state *ps, int type, char *str, dfa *d1 = PyGrammar_FindDFA( ps->p_grammar, nt); if ((err = push(&ps->p_stack, nt, d1, - arrow, lineno, col_offset)) > 0) { + arrow, lineno, col_offset, + end_lineno, end_col_offset)) > 0) { D(printf(" MemError: push\n")); return err; } @@ -267,7 +274,8 @@ PyParser_AddToken(parser_state *ps, int type, char *str, /* Shift the token */ if ((err = shift(&ps->p_stack, type, str, - x, lineno, col_offset)) > 0) { + x, lineno, col_offset, + end_lineno, end_col_offset)) > 0) { D(printf(" MemError: shift.\n")); return err; } diff --git a/Parser/parser.h b/Parser/parser.h index 39df9487285c..95cd39d209dd 100644 --- a/Parser/parser.h +++ b/Parser/parser.h @@ -32,7 +32,9 @@ typedef struct { parser_state *PyParser_New(grammar *g, int start); void PyParser_Delete(parser_state *ps); -int PyParser_AddToken(parser_state *ps, int type, char *str, int lineno, int col_offset, +int PyParser_AddToken(parser_state *ps, int type, char *str, + int lineno, int col_offset, + int end_lineno, int end_col_offset, int *expected_ret); void PyGrammar_AddAccelerators(grammar *g); diff --git a/Parser/parsetok.c b/Parser/parsetok.c index d37e28a0a36c..2b5254a8be67 100644 --- a/Parser/parsetok.c +++ b/Parser/parsetok.c @@ -187,7 +187,7 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, parser_state *ps; node *n; int started = 0; - int col_offset; + int col_offset, end_col_offset; if ((ps = PyParser_New(g, start)) == NULL) { err_ret->error = E_NOMEM; @@ -270,9 +270,16 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, col_offset = -1; } + if (b != NULL && b >= tok->line_start) { + end_col_offset = Py_SAFE_DOWNCAST(b - tok->line_start, + intptr_t, int); + } + else { + end_col_offset = -1; + } if ((err_ret->error = PyParser_AddToken(ps, (int)type, str, - lineno, col_offset, + lineno, 
col_offset, tok->lineno, end_col_offset, &(err_ret->expected))) != E_OK) { if (err_ret->error != E_DONE) { PyObject_FREE(str); @@ -368,6 +375,9 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, done: PyTokenizer_Free(tok); + if (n != NULL) { + _PyNode_FinalizeEndPos(n); + } return n; } diff --git a/Python/Python-ast.c b/Python/Python-ast.c index bbe8e69fdd50..e6c5bfe9b29e 100644 --- a/Python/Python-ast.c +++ b/Python/Python-ast.c @@ -28,9 +28,13 @@ static char *Suite_fields[]={ static PyTypeObject *stmt_type; _Py_IDENTIFIER(lineno); _Py_IDENTIFIER(col_offset); +_Py_IDENTIFIER(end_lineno); +_Py_IDENTIFIER(end_col_offset); static char *stmt_attributes[] = { "lineno", "col_offset", + "end_lineno", + "end_col_offset", }; static PyObject* ast2obj_stmt(void*); static PyTypeObject *FunctionDef_type; @@ -189,6 +193,8 @@ static PyTypeObject *expr_type; static char *expr_attributes[] = { "lineno", "col_offset", + "end_lineno", + "end_col_offset", }; static PyObject* ast2obj_expr(void*); static PyTypeObject *BoolOp_type; @@ -427,6 +433,8 @@ static PyTypeObject *excepthandler_type; static char *excepthandler_attributes[] = { "lineno", "col_offset", + "end_lineno", + "end_col_offset", }; static PyObject* ast2obj_excepthandler(void*); static PyTypeObject *ExceptHandler_type; @@ -456,6 +464,8 @@ static PyObject* ast2obj_arg(void*); static char *arg_attributes[] = { "lineno", "col_offset", + "end_lineno", + "end_col_offset", }; _Py_IDENTIFIER(arg); static char *arg_fields[]={ @@ -804,7 +814,7 @@ static int init_types(void) if (!Suite_type) return 0; stmt_type = make_type("stmt", &AST_type, NULL, 0); if (!stmt_type) return 0; - if (!add_attributes(stmt_type, stmt_attributes, 2)) return 0; + if (!add_attributes(stmt_type, stmt_attributes, 4)) return 0; FunctionDef_type = make_type("FunctionDef", stmt_type, FunctionDef_fields, 5); if (!FunctionDef_type) return 0; @@ -859,7 +869,7 @@ static int init_types(void) if (!Continue_type) return 0; expr_type = make_type("expr", &AST_type, NULL, 0); if (!expr_type) return 0; - if (!add_attributes(expr_type, expr_attributes, 2)) return 0; + if (!add_attributes(expr_type, expr_attributes, 4)) return 0; BoolOp_type = make_type("BoolOp", expr_type, BoolOp_fields, 2); if (!BoolOp_type) return 0; BinOp_type = make_type("BinOp", expr_type, BinOp_fields, 3); @@ -1082,7 +1092,7 @@ static int init_types(void) if (!add_attributes(comprehension_type, NULL, 0)) return 0; excepthandler_type = make_type("excepthandler", &AST_type, NULL, 0); if (!excepthandler_type) return 0; - if (!add_attributes(excepthandler_type, excepthandler_attributes, 2)) + if (!add_attributes(excepthandler_type, excepthandler_attributes, 4)) return 0; ExceptHandler_type = make_type("ExceptHandler", excepthandler_type, ExceptHandler_fields, 3); @@ -1092,7 +1102,7 @@ static int init_types(void) if (!add_attributes(arguments_type, NULL, 0)) return 0; arg_type = make_type("arg", &AST_type, arg_fields, 2); if (!arg_type) return 0; - if (!add_attributes(arg_type, arg_attributes, 2)) return 0; + if (!add_attributes(arg_type, arg_attributes, 4)) return 0; keyword_type = make_type("keyword", &AST_type, keyword_fields, 2); if (!keyword_type) return 0; if (!add_attributes(keyword_type, NULL, 0)) return 0; @@ -1181,8 +1191,8 @@ Suite(asdl_seq * body, PyArena *arena) stmt_ty FunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq * - decorator_list, expr_ty returns, int lineno, int col_offset, - PyArena *arena) + decorator_list, expr_ty returns, int lineno, int col_offset, 
int + end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!name) { @@ -1206,13 +1216,15 @@ FunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq * p->v.FunctionDef.returns = returns; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty AsyncFunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq * decorator_list, expr_ty returns, int lineno, int col_offset, - PyArena *arena) + int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!name) { @@ -1236,13 +1248,15 @@ AsyncFunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq p->v.AsyncFunctionDef.returns = returns; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty ClassDef(identifier name, asdl_seq * bases, asdl_seq * keywords, asdl_seq * - body, asdl_seq * decorator_list, int lineno, int col_offset, PyArena - *arena) + body, asdl_seq * decorator_list, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!name) { @@ -1261,11 +1275,14 @@ ClassDef(identifier name, asdl_seq * bases, asdl_seq * keywords, asdl_seq * p->v.ClassDef.decorator_list = decorator_list; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Return(expr_ty value, int lineno, int col_offset, PyArena *arena) +Return(expr_ty value, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1275,11 +1292,14 @@ Return(expr_ty value, int lineno, int col_offset, PyArena *arena) p->v.Return.value = value; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Delete(asdl_seq * targets, int lineno, int col_offset, PyArena *arena) +Delete(asdl_seq * targets, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1289,12 +1309,14 @@ Delete(asdl_seq * targets, int lineno, int col_offset, PyArena *arena) p->v.Delete.targets = targets; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Assign(asdl_seq * targets, expr_ty value, int lineno, int col_offset, PyArena - *arena) +Assign(asdl_seq * targets, expr_ty value, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!value) { @@ -1310,12 +1332,14 @@ Assign(asdl_seq * targets, expr_ty value, int lineno, int col_offset, PyArena p->v.Assign.value = value; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty AugAssign(expr_ty target, operator_ty op, expr_ty value, int lineno, int - col_offset, PyArena *arena) + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!target) { @@ -1342,12 +1366,15 @@ AugAssign(expr_ty target, operator_ty op, expr_ty value, int lineno, int p->v.AugAssign.value = value; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty AnnAssign(expr_ty target, expr_ty annotation, expr_ty value, int simple, int - lineno, int col_offset, PyArena *arena) + lineno, int 
col_offset, int end_lineno, int end_col_offset, PyArena + *arena) { stmt_ty p; if (!target) { @@ -1370,12 +1397,14 @@ AnnAssign(expr_ty target, expr_ty annotation, expr_ty value, int simple, int p->v.AnnAssign.simple = simple; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty For(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, int - lineno, int col_offset, PyArena *arena) + lineno, int col_offset, int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!target) { @@ -1398,12 +1427,15 @@ For(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, int p->v.For.orelse = orelse; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty AsyncFor(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, int - lineno, int col_offset, PyArena *arena) + lineno, int col_offset, int end_lineno, int end_col_offset, PyArena + *arena) { stmt_ty p; if (!target) { @@ -1426,12 +1458,14 @@ AsyncFor(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, int p->v.AsyncFor.orelse = orelse; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty While(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, int - col_offset, PyArena *arena) + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!test) { @@ -1448,12 +1482,14 @@ While(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, int p->v.While.orelse = orelse; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty If(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, int - col_offset, PyArena *arena) + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!test) { @@ -1470,12 +1506,14 @@ If(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, int p->v.If.orelse = orelse; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -With(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, PyArena - *arena) +With(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1486,12 +1524,14 @@ With(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, PyArena p->v.With.body = body; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -AsyncWith(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, - PyArena *arena) +AsyncWith(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1502,11 +1542,14 @@ AsyncWith(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, p->v.AsyncWith.body = body; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Raise(expr_ty exc, expr_ty cause, int lineno, int col_offset, PyArena *arena) +Raise(expr_ty exc, expr_ty cause, int lineno, int col_offset, int end_lineno, + int end_col_offset, PyArena *arena) { stmt_ty p; p = 
(stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1517,12 +1560,15 @@ Raise(expr_ty exc, expr_ty cause, int lineno, int col_offset, PyArena *arena) p->v.Raise.cause = cause; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty Try(asdl_seq * body, asdl_seq * handlers, asdl_seq * orelse, asdl_seq * - finalbody, int lineno, int col_offset, PyArena *arena) + finalbody, int lineno, int col_offset, int end_lineno, int end_col_offset, + PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1535,11 +1581,14 @@ Try(asdl_seq * body, asdl_seq * handlers, asdl_seq * orelse, asdl_seq * p->v.Try.finalbody = finalbody; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Assert(expr_ty test, expr_ty msg, int lineno, int col_offset, PyArena *arena) +Assert(expr_ty test, expr_ty msg, int lineno, int col_offset, int end_lineno, + int end_col_offset, PyArena *arena) { stmt_ty p; if (!test) { @@ -1555,11 +1604,14 @@ Assert(expr_ty test, expr_ty msg, int lineno, int col_offset, PyArena *arena) p->v.Assert.msg = msg; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Import(asdl_seq * names, int lineno, int col_offset, PyArena *arena) +Import(asdl_seq * names, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1569,12 +1621,14 @@ Import(asdl_seq * names, int lineno, int col_offset, PyArena *arena) p->v.Import.names = names; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty ImportFrom(identifier module, asdl_seq * names, int level, int lineno, int - col_offset, PyArena *arena) + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1586,11 +1640,14 @@ ImportFrom(identifier module, asdl_seq * names, int level, int lineno, int p->v.ImportFrom.level = level; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Global(asdl_seq * names, int lineno, int col_offset, PyArena *arena) +Global(asdl_seq * names, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1600,11 +1657,14 @@ Global(asdl_seq * names, int lineno, int col_offset, PyArena *arena) p->v.Global.names = names; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Nonlocal(asdl_seq * names, int lineno, int col_offset, PyArena *arena) +Nonlocal(asdl_seq * names, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1614,11 +1674,14 @@ Nonlocal(asdl_seq * names, int lineno, int col_offset, PyArena *arena) p->v.Nonlocal.names = names; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Expr(expr_ty value, int lineno, int col_offset, PyArena *arena) +Expr(expr_ty value, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { stmt_ty p; if (!value) { @@ -1633,11 +1696,14 @@ 
Expr(expr_ty value, int lineno, int col_offset, PyArena *arena) p->v.Expr.value = value; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Pass(int lineno, int col_offset, PyArena *arena) +Pass(int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena + *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1646,11 +1712,14 @@ Pass(int lineno, int col_offset, PyArena *arena) p->kind = Pass_kind; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Break(int lineno, int col_offset, PyArena *arena) +Break(int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena + *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1659,11 +1728,14 @@ Break(int lineno, int col_offset, PyArena *arena) p->kind = Break_kind; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } stmt_ty -Continue(int lineno, int col_offset, PyArena *arena) +Continue(int lineno, int col_offset, int end_lineno, int end_col_offset, + PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1672,12 +1744,14 @@ Continue(int lineno, int col_offset, PyArena *arena) p->kind = Continue_kind; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -BoolOp(boolop_ty op, asdl_seq * values, int lineno, int col_offset, PyArena - *arena) +BoolOp(boolop_ty op, asdl_seq * values, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!op) { @@ -1693,12 +1767,14 @@ BoolOp(boolop_ty op, asdl_seq * values, int lineno, int col_offset, PyArena p->v.BoolOp.values = values; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty BinOp(expr_ty left, operator_ty op, expr_ty right, int lineno, int col_offset, - PyArena *arena) + int end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!left) { @@ -1725,12 +1801,14 @@ BinOp(expr_ty left, operator_ty op, expr_ty right, int lineno, int col_offset, p->v.BinOp.right = right; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -UnaryOp(unaryop_ty op, expr_ty operand, int lineno, int col_offset, PyArena - *arena) +UnaryOp(unaryop_ty op, expr_ty operand, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!op) { @@ -1751,12 +1829,14 @@ UnaryOp(unaryop_ty op, expr_ty operand, int lineno, int col_offset, PyArena p->v.UnaryOp.operand = operand; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -Lambda(arguments_ty args, expr_ty body, int lineno, int col_offset, PyArena - *arena) +Lambda(arguments_ty args, expr_ty body, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!args) { @@ -1777,12 +1857,14 @@ Lambda(arguments_ty args, expr_ty body, int lineno, int col_offset, PyArena p->v.Lambda.body = body; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty IfExp(expr_ty test, expr_ty body, expr_ty orelse, int lineno, int col_offset, - 
PyArena *arena) + int end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!test) { @@ -1809,12 +1891,14 @@ IfExp(expr_ty test, expr_ty body, expr_ty orelse, int lineno, int col_offset, p->v.IfExp.orelse = orelse; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -Dict(asdl_seq * keys, asdl_seq * values, int lineno, int col_offset, PyArena - *arena) +Dict(asdl_seq * keys, asdl_seq * values, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; p = (expr_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1825,11 +1909,14 @@ Dict(asdl_seq * keys, asdl_seq * values, int lineno, int col_offset, PyArena p->v.Dict.values = values; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -Set(asdl_seq * elts, int lineno, int col_offset, PyArena *arena) +Set(asdl_seq * elts, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { expr_ty p; p = (expr_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1839,12 +1926,14 @@ Set(asdl_seq * elts, int lineno, int col_offset, PyArena *arena) p->v.Set.elts = elts; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -ListComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, - PyArena *arena) +ListComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!elt) { @@ -1860,12 +1949,14 @@ ListComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, p->v.ListComp.generators = generators; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -SetComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, PyArena - *arena) +SetComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!elt) { @@ -1881,12 +1972,14 @@ SetComp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, PyArena p->v.SetComp.generators = generators; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty DictComp(expr_ty key, expr_ty value, asdl_seq * generators, int lineno, int - col_offset, PyArena *arena) + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!key) { @@ -1908,12 +2001,14 @@ DictComp(expr_ty key, expr_ty value, asdl_seq * generators, int lineno, int p->v.DictComp.generators = generators; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty GeneratorExp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, - PyArena *arena) + int end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!elt) { @@ -1929,11 +2024,14 @@ GeneratorExp(expr_ty elt, asdl_seq * generators, int lineno, int col_offset, p->v.GeneratorExp.generators = generators; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -Await(expr_ty value, int lineno, int col_offset, PyArena *arena) +Await(expr_ty value, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { expr_ty p; if (!value) { @@ -1948,11 
+2046,14 @@ Await(expr_ty value, int lineno, int col_offset, PyArena *arena) p->v.Await.value = value; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -Yield(expr_ty value, int lineno, int col_offset, PyArena *arena) +Yield(expr_ty value, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { expr_ty p; p = (expr_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1962,11 +2063,14 @@ Yield(expr_ty value, int lineno, int col_offset, PyArena *arena) p->v.Yield.value = value; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -YieldFrom(expr_ty value, int lineno, int col_offset, PyArena *arena) +YieldFrom(expr_ty value, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { expr_ty p; if (!value) { @@ -1981,12 +2085,14 @@ YieldFrom(expr_ty value, int lineno, int col_offset, PyArena *arena) p->v.YieldFrom.value = value; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty Compare(expr_ty left, asdl_int_seq * ops, asdl_seq * comparators, int lineno, - int col_offset, PyArena *arena) + int col_offset, int end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!left) { @@ -2003,12 +2109,14 @@ Compare(expr_ty left, asdl_int_seq * ops, asdl_seq * comparators, int lineno, p->v.Compare.comparators = comparators; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty Call(expr_ty func, asdl_seq * args, asdl_seq * keywords, int lineno, int - col_offset, PyArena *arena) + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!func) { @@ -2025,12 +2133,15 @@ Call(expr_ty func, asdl_seq * args, asdl_seq * keywords, int lineno, int p->v.Call.keywords = keywords; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty FormattedValue(expr_ty value, int conversion, expr_ty format_spec, int lineno, - int col_offset, PyArena *arena) + int col_offset, int end_lineno, int end_col_offset, PyArena + *arena) { expr_ty p; if (!value) { @@ -2047,11 +2158,14 @@ FormattedValue(expr_ty value, int conversion, expr_ty format_spec, int lineno, p->v.FormattedValue.format_spec = format_spec; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -JoinedStr(asdl_seq * values, int lineno, int col_offset, PyArena *arena) +JoinedStr(asdl_seq * values, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { expr_ty p; p = (expr_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -2061,11 +2175,14 @@ JoinedStr(asdl_seq * values, int lineno, int col_offset, PyArena *arena) p->v.JoinedStr.values = values; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -Constant(constant value, int lineno, int col_offset, PyArena *arena) +Constant(constant value, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { expr_ty p; if (!value) { @@ -2080,12 +2197,14 @@ Constant(constant value, int lineno, int col_offset, PyArena *arena) p->v.Constant.value = value; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + 
p->end_col_offset = end_col_offset; return p; } expr_ty Attribute(expr_ty value, identifier attr, expr_context_ty ctx, int lineno, int - col_offset, PyArena *arena) + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!value) { @@ -2112,12 +2231,14 @@ Attribute(expr_ty value, identifier attr, expr_context_ty ctx, int lineno, int p->v.Attribute.ctx = ctx; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty Subscript(expr_ty value, slice_ty slice, expr_context_ty ctx, int lineno, int - col_offset, PyArena *arena) + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!value) { @@ -2144,12 +2265,14 @@ Subscript(expr_ty value, slice_ty slice, expr_context_ty ctx, int lineno, int p->v.Subscript.ctx = ctx; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -Starred(expr_ty value, expr_context_ty ctx, int lineno, int col_offset, PyArena - *arena) +Starred(expr_ty value, expr_context_ty ctx, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!value) { @@ -2170,12 +2293,14 @@ Starred(expr_ty value, expr_context_ty ctx, int lineno, int col_offset, PyArena p->v.Starred.ctx = ctx; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -Name(identifier id, expr_context_ty ctx, int lineno, int col_offset, PyArena - *arena) +Name(identifier id, expr_context_ty ctx, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!id) { @@ -2196,12 +2321,14 @@ Name(identifier id, expr_context_ty ctx, int lineno, int col_offset, PyArena p->v.Name.ctx = ctx; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -List(asdl_seq * elts, expr_context_ty ctx, int lineno, int col_offset, PyArena - *arena) +List(asdl_seq * elts, expr_context_ty ctx, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!ctx) { @@ -2217,12 +2344,14 @@ List(asdl_seq * elts, expr_context_ty ctx, int lineno, int col_offset, PyArena p->v.List.ctx = ctx; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } expr_ty -Tuple(asdl_seq * elts, expr_context_ty ctx, int lineno, int col_offset, PyArena - *arena) +Tuple(asdl_seq * elts, expr_context_ty ctx, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { expr_ty p; if (!ctx) { @@ -2238,6 +2367,8 @@ Tuple(asdl_seq * elts, expr_context_ty ctx, int lineno, int col_offset, PyArena p->v.Tuple.ctx = ctx; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } @@ -2311,7 +2442,7 @@ comprehension(expr_ty target, expr_ty iter, asdl_seq * ifs, int is_async, excepthandler_ty ExceptHandler(expr_ty type, identifier name, asdl_seq * body, int lineno, int - col_offset, PyArena *arena) + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { excepthandler_ty p; p = (excepthandler_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -2323,6 +2454,8 @@ ExceptHandler(expr_ty type, identifier name, asdl_seq * body, int lineno, int p->v.ExceptHandler.body = body; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = 
end_lineno; + p->end_col_offset = end_col_offset; return p; } @@ -2344,8 +2477,8 @@ arguments(asdl_seq * args, arg_ty vararg, asdl_seq * kwonlyargs, asdl_seq * } arg_ty -arg(identifier arg, expr_ty annotation, int lineno, int col_offset, PyArena - *arena) +arg(identifier arg, expr_ty annotation, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) { arg_ty p; if (!arg) { @@ -2360,6 +2493,8 @@ arg(identifier arg, expr_ty annotation, int lineno, int col_offset, PyArena p->annotation = annotation; p->lineno = lineno; p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; return p; } @@ -2886,6 +3021,16 @@ ast2obj_stmt(void* _o) if (_PyObject_SetAttrId(result, &PyId_col_offset, value) < 0) goto failed; Py_DECREF(value); + value = ast2obj_int(o->end_lineno); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_end_lineno, value) < 0) + goto failed; + Py_DECREF(value); + value = ast2obj_int(o->end_col_offset); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_end_col_offset, value) < 0) + goto failed; + Py_DECREF(value); return result; failed: Py_XDECREF(value); @@ -3281,6 +3426,16 @@ ast2obj_expr(void* _o) if (_PyObject_SetAttrId(result, &PyId_col_offset, value) < 0) goto failed; Py_DECREF(value); + value = ast2obj_int(o->end_lineno); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_end_lineno, value) < 0) + goto failed; + Py_DECREF(value); + value = ast2obj_int(o->end_col_offset); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_end_col_offset, value) < 0) + goto failed; + Py_DECREF(value); return result; failed: Py_XDECREF(value); @@ -3571,6 +3726,16 @@ ast2obj_excepthandler(void* _o) if (_PyObject_SetAttrId(result, &PyId_col_offset, value) < 0) goto failed; Py_DECREF(value); + value = ast2obj_int(o->end_lineno); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_end_lineno, value) < 0) + goto failed; + Py_DECREF(value); + value = ast2obj_int(o->end_col_offset); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_end_col_offset, value) < 0) + goto failed; + Py_DECREF(value); return result; failed: Py_XDECREF(value); @@ -3657,6 +3822,16 @@ ast2obj_arg(void* _o) if (_PyObject_SetAttrId(result, &PyId_col_offset, value) < 0) goto failed; Py_DECREF(value); + value = ast2obj_int(o->end_lineno); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_end_lineno, value) < 0) + goto failed; + Py_DECREF(value); + value = ast2obj_int(o->end_col_offset); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_end_col_offset, value) < 0) + goto failed; + Py_DECREF(value); return result; failed: Py_XDECREF(value); @@ -3922,6 +4097,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) PyObject *tmp = NULL; int lineno; int col_offset; + int end_lineno; + int end_col_offset; if (obj == Py_None) { *out = NULL; @@ -3953,6 +4130,32 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } + if (_PyObject_LookupAttrId(obj, &PyId_end_lineno, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + end_lineno = 0; + } + else { + int res; + res = obj2ast_int(tmp, &end_lineno, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + if (_PyObject_LookupAttrId(obj, &PyId_end_col_offset, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + end_col_offset = 0; + } + else { + int res; + res = obj2ast_int(tmp, 
&end_col_offset, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } isinstance = PyObject_IsInstance(obj, (PyObject*)FunctionDef_type); if (isinstance == -1) { return 1; @@ -4064,7 +4267,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) Py_CLEAR(tmp); } *out = FunctionDef(name, args, body, decorator_list, returns, lineno, - col_offset, arena); + col_offset, end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4179,7 +4382,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) Py_CLEAR(tmp); } *out = AsyncFunctionDef(name, args, body, decorator_list, returns, - lineno, col_offset, arena); + lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -4328,7 +4532,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) Py_CLEAR(tmp); } *out = ClassDef(name, bases, keywords, body, decorator_list, lineno, - col_offset, arena); + col_offset, end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4352,7 +4556,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Return(value, lineno, col_offset, arena); + *out = Return(value, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -4393,7 +4598,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = Delete(targets, lineno, col_offset, arena); + *out = Delete(targets, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -4448,7 +4654,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Assign(targets, value, lineno, col_offset, arena); + *out = Assign(targets, value, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4500,7 +4707,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = AugAssign(target, op, value, lineno, col_offset, arena); + *out = AugAssign(target, op, value, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4567,7 +4775,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) Py_CLEAR(tmp); } *out = AnnAssign(target, annotation, value, simple, lineno, col_offset, - arena); + end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4667,7 +4875,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = For(target, iter, body, orelse, lineno, col_offset, arena); + *out = For(target, iter, body, orelse, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4767,7 +4976,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = AsyncFor(target, iter, body, orelse, lineno, col_offset, arena); + *out = AsyncFor(target, iter, body, orelse, lineno, col_offset, + end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4853,7 +5063,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = While(test, body, orelse, lineno, col_offset, arena); + *out = While(test, body, orelse, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4939,7 +5150,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out 
= If(test, body, orelse, lineno, col_offset, arena); + *out = If(test, body, orelse, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5011,7 +5223,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = With(items, body, lineno, col_offset, arena); + *out = With(items, body, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5083,7 +5296,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = AsyncWith(items, body, lineno, col_offset, arena); + *out = AsyncWith(items, body, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5121,7 +5335,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Raise(exc, cause, lineno, col_offset, arena); + *out = Raise(exc, cause, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5256,7 +5471,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) Py_CLEAR(tmp); } *out = Try(body, handlers, orelse, finalbody, lineno, col_offset, - arena); + end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5294,7 +5509,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Assert(test, msg, lineno, col_offset, arena); + *out = Assert(test, msg, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5335,7 +5551,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = Import(names, lineno, col_offset, arena); + *out = Import(names, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -5404,7 +5621,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = ImportFrom(module, names, level, lineno, col_offset, arena); + *out = ImportFrom(module, names, level, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5445,7 +5663,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = Global(names, lineno, col_offset, arena); + *out = Global(names, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -5486,7 +5705,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = Nonlocal(names, lineno, col_offset, arena); + *out = Nonlocal(names, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -5510,7 +5730,8 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Expr(value, lineno, col_offset, arena); + *out = Expr(value, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -5520,7 +5741,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } if (isinstance) { - *out = Pass(lineno, col_offset, arena); + *out = Pass(lineno, col_offset, end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5530,7 +5751,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } if (isinstance) { - *out = Break(lineno, col_offset, arena); + *out = Break(lineno, col_offset, end_lineno, end_col_offset, arena); 
if (*out == NULL) goto failed; return 0; } @@ -5540,7 +5761,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } if (isinstance) { - *out = Continue(lineno, col_offset, arena); + *out = Continue(lineno, col_offset, end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5559,6 +5780,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) PyObject *tmp = NULL; int lineno; int col_offset; + int end_lineno; + int end_col_offset; if (obj == Py_None) { *out = NULL; @@ -5590,6 +5813,32 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } + if (_PyObject_LookupAttrId(obj, &PyId_end_lineno, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + end_lineno = 0; + } + else { + int res; + res = obj2ast_int(tmp, &end_lineno, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + if (_PyObject_LookupAttrId(obj, &PyId_end_col_offset, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + end_col_offset = 0; + } + else { + int res; + res = obj2ast_int(tmp, &end_col_offset, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } isinstance = PyObject_IsInstance(obj, (PyObject*)BoolOp_type); if (isinstance == -1) { return 1; @@ -5641,7 +5890,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = BoolOp(op, values, lineno, col_offset, arena); + *out = BoolOp(op, values, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5693,7 +5943,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = BinOp(left, op, right, lineno, col_offset, arena); + *out = BinOp(left, op, right, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5731,7 +5982,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = UnaryOp(op, operand, lineno, col_offset, arena); + *out = UnaryOp(op, operand, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5769,7 +6021,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Lambda(args, body, lineno, col_offset, arena); + *out = Lambda(args, body, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5821,7 +6074,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = IfExp(test, body, orelse, lineno, col_offset, arena); + *out = IfExp(test, body, orelse, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5893,7 +6147,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = Dict(keys, values, lineno, col_offset, arena); + *out = Dict(keys, values, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5934,7 +6189,7 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = Set(elts, lineno, col_offset, arena); + *out = Set(elts, lineno, col_offset, end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5989,7 +6244,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = ListComp(elt, generators, lineno, col_offset, arena); + 
*out = ListComp(elt, generators, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6044,7 +6300,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = SetComp(elt, generators, lineno, col_offset, arena); + *out = SetComp(elt, generators, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6113,7 +6370,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = DictComp(key, value, generators, lineno, col_offset, arena); + *out = DictComp(key, value, generators, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6168,7 +6426,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = GeneratorExp(elt, generators, lineno, col_offset, arena); + *out = GeneratorExp(elt, generators, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6192,7 +6451,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Await(value, lineno, col_offset, arena); + *out = Await(value, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -6216,7 +6476,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Yield(value, lineno, col_offset, arena); + *out = Yield(value, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -6240,7 +6501,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = YieldFrom(value, lineno, col_offset, arena); + *out = YieldFrom(value, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -6326,7 +6588,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = Compare(left, ops, comparators, lineno, col_offset, arena); + *out = Compare(left, ops, comparators, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6412,7 +6675,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = Call(func, args, keywords, lineno, col_offset, arena); + *out = Call(func, args, keywords, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6465,7 +6729,7 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) Py_CLEAR(tmp); } *out = FormattedValue(value, conversion, format_spec, lineno, - col_offset, arena); + col_offset, end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6506,7 +6770,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = JoinedStr(values, lineno, col_offset, arena); + *out = JoinedStr(values, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6530,7 +6795,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Constant(value, lineno, col_offset, arena); + *out = Constant(value, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -6582,7 +6848,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = 
Attribute(value, attr, ctx, lineno, col_offset, arena); + *out = Attribute(value, attr, ctx, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6634,7 +6901,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Subscript(value, slice, ctx, lineno, col_offset, arena); + *out = Subscript(value, slice, ctx, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6672,7 +6940,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Starred(value, ctx, lineno, col_offset, arena); + *out = Starred(value, ctx, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -6710,7 +6979,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Name(id, ctx, lineno, col_offset, arena); + *out = Name(id, ctx, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -6765,7 +7035,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = List(elts, ctx, lineno, col_offset, arena); + *out = List(elts, ctx, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -6820,7 +7091,8 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = Tuple(elts, ctx, lineno, col_offset, arena); + *out = Tuple(elts, ctx, lineno, col_offset, end_lineno, end_col_offset, + arena); if (*out == NULL) goto failed; return 0; } @@ -7389,6 +7661,8 @@ obj2ast_excepthandler(PyObject* obj, excepthandler_ty* out, PyArena* arena) PyObject *tmp = NULL; int lineno; int col_offset; + int end_lineno; + int end_col_offset; if (obj == Py_None) { *out = NULL; @@ -7420,6 +7694,32 @@ obj2ast_excepthandler(PyObject* obj, excepthandler_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } + if (_PyObject_LookupAttrId(obj, &PyId_end_lineno, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + end_lineno = 0; + } + else { + int res; + res = obj2ast_int(tmp, &end_lineno, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + if (_PyObject_LookupAttrId(obj, &PyId_end_col_offset, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + end_col_offset = 0; + } + else { + int res; + res = obj2ast_int(tmp, &end_col_offset, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } isinstance = PyObject_IsInstance(obj, (PyObject*)ExceptHandler_type); if (isinstance == -1) { return 1; @@ -7485,7 +7785,8 @@ obj2ast_excepthandler(PyObject* obj, excepthandler_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = ExceptHandler(type, name, body, lineno, col_offset, arena); + *out = ExceptHandler(type, name, body, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -7669,6 +7970,8 @@ obj2ast_arg(PyObject* obj, arg_ty* out, PyArena* arena) expr_ty annotation; int lineno; int col_offset; + int end_lineno; + int end_col_offset; if (_PyObject_LookupAttrId(obj, &PyId_arg, &tmp) < 0) { return 1; @@ -7722,7 +8025,34 @@ obj2ast_arg(PyObject* obj, arg_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = arg(arg, annotation, lineno, col_offset, arena); + if (_PyObject_LookupAttrId(obj, 
&PyId_end_lineno, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + end_lineno = 0; + } + else { + int res; + res = obj2ast_int(tmp, &end_lineno, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + if (_PyObject_LookupAttrId(obj, &PyId_end_col_offset, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + end_col_offset = 0; + } + else { + int res; + res = obj2ast_int(tmp, &end_col_offset, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = arg(arg, annotation, lineno, col_offset, end_lineno, end_col_offset, + arena); return 0; failed: Py_XDECREF(tmp); diff --git a/Python/ast.c b/Python/ast.c index d71f44a6dbe2..855acca29e7c 100644 --- a/Python/ast.c +++ b/Python/ast.c @@ -580,10 +580,11 @@ static stmt_ty ast_for_for_stmt(struct compiling *, const node *, bool); /* Note different signature for ast_for_call */ static expr_ty ast_for_call(struct compiling *, const node *, expr_ty, - const node *); + const node *, const node *); static PyObject *parsenumber(struct compiling *, const char *); static expr_ty parsestrplus(struct compiling *, const node *n); +static void get_last_end_pos(asdl_seq *, int *, int *); #define COMP_GENEXP 0 #define COMP_LISTCOMP 1 @@ -810,6 +811,7 @@ PyAST_FromNodeObject(const node *n, PyCompilerFlags *flags, if (!stmts) goto out; asdl_seq_SET(stmts, 0, Pass(n->n_lineno, n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, arena)); if (!asdl_seq_GET(stmts, 0)) goto out; @@ -940,6 +942,8 @@ copy_location(expr_ty e, const node *n) if (e) { e->lineno = LINENO(n); e->col_offset = n->n_col_offset; + e->end_lineno = n->n_end_lineno; + e->end_col_offset = n->n_end_col_offset; } return e; } @@ -1228,7 +1232,8 @@ ast_for_arg(struct compiling *c, const node *n) return NULL; } - ret = arg(name, annotation, LINENO(n), n->n_col_offset, c->c_arena); + ret = arg(name, annotation, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); if (!ret) return NULL; return ret; @@ -1287,6 +1292,7 @@ handle_keywordonly_args(struct compiling *c, const node *n, int start, if (forbidden_name(c, argname, ch, 0)) goto error; arg = arg(argname, annotation, LINENO(ch), ch->n_col_offset, + ch->n_end_lineno, ch->n_end_col_offset, c->c_arena); if (!arg) goto error; @@ -1480,16 +1486,19 @@ ast_for_dotted_name(struct compiling *c, const node *n) identifier id; int lineno, col_offset; int i; + node *ch; REQ(n, dotted_name); lineno = LINENO(n); col_offset = n->n_col_offset; - id = NEW_IDENTIFIER(CHILD(n, 0)); + ch = CHILD(n, 0); + id = NEW_IDENTIFIER(ch); if (!id) return NULL; - e = Name(id, Load, lineno, col_offset, c->c_arena); + e = Name(id, Load, lineno, col_offset, + ch->n_end_lineno, ch->n_end_col_offset, c->c_arena); if (!e) return NULL; @@ -1497,7 +1506,8 @@ ast_for_dotted_name(struct compiling *c, const node *n) id = NEW_IDENTIFIER(CHILD(n, i)); if (!id) return NULL; - e = Attribute(e, id, Load, lineno, col_offset, c->c_arena); + e = Attribute(e, id, Load, lineno, col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); if (!e) return NULL; } @@ -1526,13 +1536,13 @@ ast_for_decorator(struct compiling *c, const node *n) } else if (NCH(n) == 5) { /* Call with no arguments */ d = Call(name_expr, NULL, NULL, LINENO(n), - n->n_col_offset, c->c_arena); + n->n_col_offset, n->n_end_lineno, n->n_end_col_offset, c->c_arena); if (!d) return NULL; name_expr = NULL; } else { - d = ast_for_call(c, CHILD(n, 3), name_expr, CHILD(n, 2)); + d = ast_for_call(c, CHILD(n, 3), name_expr, 
CHILD(n, 2), CHILD(n, 4)); if (!d) return NULL; name_expr = NULL; @@ -1573,6 +1583,7 @@ ast_for_funcdef_impl(struct compiling *c, const node *n0, asdl_seq *body; expr_ty returns = NULL; int name_i = 1; + int end_lineno, end_col_offset; REQ(n, funcdef); @@ -1593,13 +1604,14 @@ ast_for_funcdef_impl(struct compiling *c, const node *n0, body = ast_for_suite(c, CHILD(n, name_i + 3)); if (!body) return NULL; + get_last_end_pos(body, &end_lineno, &end_col_offset); if (is_async) return AsyncFunctionDef(name, args, body, decorator_seq, returns, - LINENO(n0), n0->n_col_offset, c->c_arena); + LINENO(n0), n0->n_col_offset, end_lineno, end_col_offset, c->c_arena); else return FunctionDef(name, args, body, decorator_seq, returns, - LINENO(n), n->n_col_offset, c->c_arena); + LINENO(n), n->n_col_offset, end_lineno, end_col_offset, c->c_arena); } static stmt_ty @@ -1704,7 +1716,8 @@ ast_for_lambdef(struct compiling *c, const node *n) return NULL; } - return Lambda(args, expression, LINENO(n), n->n_col_offset, c->c_arena); + return Lambda(args, expression, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } static expr_ty @@ -1724,6 +1737,7 @@ ast_for_ifexpr(struct compiling *c, const node *n) if (!orelse) return NULL; return IfExp(expression, body, orelse, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } @@ -1852,8 +1866,9 @@ ast_for_comprehension(struct compiling *c, const node *n) comp = comprehension(first, expression, NULL, is_async, c->c_arena); else - comp = comprehension(Tuple(t, Store, first->lineno, - first->col_offset, c->c_arena), + comp = comprehension(Tuple(t, Store, first->lineno, first->col_offset, + for_ch->n_end_lineno, for_ch->n_end_col_offset, + c->c_arena), expression, NULL, is_async, c->c_arena); if (!comp) return NULL; @@ -1918,11 +1933,14 @@ ast_for_itercomp(struct compiling *c, const node *n, int type) return NULL; if (type == COMP_GENEXP) - return GeneratorExp(elt, comps, LINENO(n), n->n_col_offset, c->c_arena); + return GeneratorExp(elt, comps, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); else if (type == COMP_LISTCOMP) - return ListComp(elt, comps, LINENO(n), n->n_col_offset, c->c_arena); + return ListComp(elt, comps, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); else if (type == COMP_SETCOMP) - return SetComp(elt, comps, LINENO(n), n->n_col_offset, c->c_arena); + return SetComp(elt, comps, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); else /* Should never happen */ return NULL; @@ -1984,7 +2002,8 @@ ast_for_dictcomp(struct compiling *c, const node *n) if (!comps) return NULL; - return DictComp(key, value, comps, LINENO(n), n->n_col_offset, c->c_arena); + return DictComp(key, value, comps, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } static expr_ty @@ -2017,7 +2036,8 @@ ast_for_dictdisplay(struct compiling *c, const node *n) } keys->size = j; values->size = j; - return Dict(keys, values, LINENO(n), n->n_col_offset, c->c_arena); + return Dict(keys, values, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } static expr_ty @@ -2060,7 +2080,8 @@ ast_for_setdisplay(struct compiling *c, const node *n) return NULL; asdl_seq_SET(elts, i / 2, expression); } - return Set(elts, LINENO(n), n->n_col_offset, c->c_arena); + return Set(elts, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } static expr_ty @@ -2079,17 +2100,21 
@@ ast_for_atom(struct compiling *c, const node *n) size_t len = strlen(s); if (len >= 4 && len <= 5) { if (!strcmp(s, "None")) - return Constant(Py_None, LINENO(n), n->n_col_offset, c->c_arena); + return Constant(Py_None, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); if (!strcmp(s, "True")) - return Constant(Py_True, LINENO(n), n->n_col_offset, c->c_arena); + return Constant(Py_True, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); if (!strcmp(s, "False")) - return Constant(Py_False, LINENO(n), n->n_col_offset, c->c_arena); + return Constant(Py_False, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } name = new_identifier(s, c); if (!name) return NULL; /* All names start in Load context, but may later be changed. */ - return Name(name, Load, LINENO(n), n->n_col_offset, c->c_arena); + return Name(name, Load, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } case STRING: { expr_ty str = parsestrplus(c, n); @@ -2128,15 +2153,18 @@ ast_for_atom(struct compiling *c, const node *n) Py_DECREF(pynum); return NULL; } - return Constant(pynum, LINENO(n), n->n_col_offset, c->c_arena); + return Constant(pynum, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } case ELLIPSIS: /* Ellipsis */ - return Constant(Py_Ellipsis, LINENO(n), n->n_col_offset, c->c_arena); + return Constant(Py_Ellipsis, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); case LPAR: /* some parenthesized expressions */ ch = CHILD(n, 1); if (TYPE(ch) == RPAR) - return Tuple(NULL, Load, LINENO(n), n->n_col_offset, c->c_arena); + return Tuple(NULL, Load, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); if (TYPE(ch) == yield_expr) return ast_for_expr(c, ch); @@ -2156,7 +2184,8 @@ ast_for_atom(struct compiling *c, const node *n) ch = CHILD(n, 1); if (TYPE(ch) == RSQB) - return List(NULL, Load, LINENO(n), n->n_col_offset, c->c_arena); + return List(NULL, Load, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); REQ(ch, testlist_comp); if (NCH(ch) == 1 || TYPE(CHILD(ch, 1)) == COMMA) { @@ -2164,7 +2193,8 @@ ast_for_atom(struct compiling *c, const node *n) if (!elts) return NULL; - return List(elts, Load, LINENO(n), n->n_col_offset, c->c_arena); + return List(elts, Load, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } else { return copy_location(ast_for_listcomp(c, ch), n); @@ -2178,7 +2208,8 @@ ast_for_atom(struct compiling *c, const node *n) ch = CHILD(n, 1); if (TYPE(ch) == RBRACE) { /* It's an empty dict. 
*/ - return Dict(NULL, NULL, LINENO(n), n->n_col_offset, c->c_arena); + return Dict(NULL, NULL, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } else { int is_dict = (TYPE(CHILD(ch, 0)) == DOUBLESTAR); @@ -2306,6 +2337,7 @@ ast_for_binop(struct compiling *c, const node *n) return NULL; result = BinOp(expr1, newoperator, expr2, LINENO(n), n->n_col_offset, + CHILD(n, 2)->n_end_lineno, CHILD(n, 2)->n_end_col_offset, c->c_arena); if (!result) return NULL; @@ -2325,6 +2357,8 @@ ast_for_binop(struct compiling *c, const node *n) tmp_result = BinOp(result, newoperator, tmp, LINENO(next_oper), next_oper->n_col_offset, + CHILD(n, i * 2 + 2)->n_end_lineno, + CHILD(n, i * 2 + 2)->n_end_col_offset, c->c_arena); if (!tmp_result) return NULL; @@ -2340,20 +2374,22 @@ ast_for_trailer(struct compiling *c, const node *n, expr_ty left_expr) subscriptlist: subscript (',' subscript)* [','] subscript: '.' '.' '.' | test | [test] ':' [test] [sliceop] */ + const node *n_copy = n; REQ(n, trailer); if (TYPE(CHILD(n, 0)) == LPAR) { if (NCH(n) == 2) - return Call(left_expr, NULL, NULL, LINENO(n), - n->n_col_offset, c->c_arena); + return Call(left_expr, NULL, NULL, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); else - return ast_for_call(c, CHILD(n, 1), left_expr, CHILD(n, 0)); + return ast_for_call(c, CHILD(n, 1), left_expr, CHILD(n, 0), CHILD(n, 2)); } else if (TYPE(CHILD(n, 0)) == DOT) { PyObject *attr_id = NEW_IDENTIFIER(CHILD(n, 1)); if (!attr_id) return NULL; return Attribute(left_expr, attr_id, Load, - LINENO(n), n->n_col_offset, c->c_arena); + LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } else { REQ(CHILD(n, 0), LSQB); @@ -2364,6 +2400,7 @@ ast_for_trailer(struct compiling *c, const node *n, expr_ty left_expr) if (!slc) return NULL; return Subscript(left_expr, slc, Load, LINENO(n), n->n_col_offset, + n_copy->n_end_lineno, n_copy->n_end_col_offset, c->c_arena); } else { @@ -2389,7 +2426,8 @@ ast_for_trailer(struct compiling *c, const node *n, expr_ty left_expr) } if (!simple) { return Subscript(left_expr, ExtSlice(slices, c->c_arena), - Load, LINENO(n), n->n_col_offset, c->c_arena); + Load, LINENO(n), n->n_col_offset, + n_copy->n_end_lineno, n_copy->n_end_col_offset, c->c_arena); } /* extract Index values and put them in a Tuple */ elts = _Py_asdl_seq_new(asdl_seq_LEN(slices), c->c_arena); @@ -2400,11 +2438,13 @@ ast_for_trailer(struct compiling *c, const node *n, expr_ty left_expr) assert(slc->kind == Index_kind && slc->v.Index.value); asdl_seq_SET(elts, j, slc->v.Index.value); } - e = Tuple(elts, Load, LINENO(n), n->n_col_offset, c->c_arena); + e = Tuple(elts, Load, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); if (!e) return NULL; return Subscript(left_expr, Index(e, c->c_arena), - Load, LINENO(n), n->n_col_offset, c->c_arena); + Load, LINENO(n), n->n_col_offset, + n_copy->n_end_lineno, n_copy->n_end_col_offset, c->c_arena); } } } @@ -2421,13 +2461,16 @@ ast_for_factor(struct compiling *c, const node *n) switch (TYPE(CHILD(n, 0))) { case PLUS: return UnaryOp(UAdd, expression, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); case MINUS: return UnaryOp(USub, expression, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); case TILDE: - return UnaryOp(Invert, expression, LINENO(n), - n->n_col_offset, c->c_arena); + return UnaryOp(Invert, expression, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, 
+ c->c_arena); } PyErr_Format(PyExc_SystemError, "unhandled factor: %d", TYPE(CHILD(n, 0))); @@ -2454,7 +2497,8 @@ ast_for_atom_expr(struct compiling *c, const node *n) if (nch == 1) return e; if (start && nch == 2) { - return Await(e, LINENO(n), n->n_col_offset, c->c_arena); + return Await(e, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } for (i = start + 1; i < nch; i++) { @@ -2471,7 +2515,8 @@ ast_for_atom_expr(struct compiling *c, const node *n) if (start) { /* there was an 'await' */ - return Await(e, LINENO(n), n->n_col_offset, c->c_arena); + return Await(e, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } else { return e; @@ -2494,7 +2539,8 @@ ast_for_power(struct compiling *c, const node *n) expr_ty f = ast_for_expr(c, CHILD(n, NCH(n) - 1)); if (!f) return NULL; - e = BinOp(e, Pow, f, LINENO(n), n->n_col_offset, c->c_arena); + e = BinOp(e, Pow, f, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } return e; } @@ -2510,7 +2556,8 @@ ast_for_starred(struct compiling *c, const node *n) return NULL; /* The Load context is changed later. */ - return Starred(tmp, Load, LINENO(n), n->n_col_offset, c->c_arena); + return Starred(tmp, Load, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } @@ -2569,9 +2616,11 @@ ast_for_expr(struct compiling *c, const node *n) } if (!strcmp(STR(CHILD(n, 1)), "and")) return BoolOp(And, seq, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); assert(!strcmp(STR(CHILD(n, 1)), "or")); - return BoolOp(Or, seq, LINENO(n), n->n_col_offset, c->c_arena); + return BoolOp(Or, seq, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); case not_test: if (NCH(n) == 1) { n = CHILD(n, 0); @@ -2583,6 +2632,7 @@ ast_for_expr(struct compiling *c, const node *n) return NULL; return UnaryOp(Not, expression, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } case comparison: @@ -2622,8 +2672,8 @@ ast_for_expr(struct compiling *c, const node *n) return NULL; } - return Compare(expression, ops, cmps, LINENO(n), - n->n_col_offset, c->c_arena); + return Compare(expression, ops, cmps, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } break; @@ -2663,8 +2713,10 @@ ast_for_expr(struct compiling *c, const node *n) return NULL; } if (is_from) - return YieldFrom(exp, LINENO(n), n->n_col_offset, c->c_arena); - return Yield(exp, LINENO(n), n->n_col_offset, c->c_arena); + return YieldFrom(exp, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); + return Yield(exp, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } case factor: if (NCH(n) == 1) { @@ -2684,7 +2736,7 @@ ast_for_expr(struct compiling *c, const node *n) static expr_ty ast_for_call(struct compiling *c, const node *n, expr_ty func, - const node *maybegenbeg) + const node *maybegenbeg, const node *closepar) { /* arglist: argument (',' argument)* [','] @@ -2773,6 +2825,7 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func, return NULL; starred = Starred(e, Load, LINENO(chch), chch->n_col_offset, + chch->n_end_lineno, chch->n_end_col_offset, c->c_arena); if (!starred) return NULL; @@ -2864,7 +2917,8 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func, } } - return Call(func, args, keywords, func->lineno, func->col_offset, c->c_arena); + return Call(func, args, keywords, func->lineno, 
func->col_offset, + closepar->n_end_lineno, closepar->n_end_col_offset, c->c_arena); } static expr_ty @@ -2887,7 +2941,8 @@ ast_for_testlist(struct compiling *c, const node* n) asdl_seq *tmp = seq_for_testlist(c, n); if (!tmp) return NULL; - return Tuple(tmp, Load, LINENO(n), n->n_col_offset, c->c_arena); + return Tuple(tmp, Load, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } } @@ -2909,7 +2964,8 @@ ast_for_expr_stmt(struct compiling *c, const node *n) if (!e) return NULL; - return Expr(e, LINENO(n), n->n_col_offset, c->c_arena); + return Expr(e, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } else if (TYPE(CHILD(n, 1)) == augassign) { expr_ty expr1, expr2; @@ -2947,7 +3003,8 @@ ast_for_expr_stmt(struct compiling *c, const node *n) if (!newoperator) return NULL; - return AugAssign(expr1, newoperator, expr2, LINENO(n), n->n_col_offset, c->c_arena); + return AugAssign(expr1, newoperator, expr2, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } else if (TYPE(CHILD(n, 1)) == annassign) { expr_ty expr1, expr2, expr3; @@ -3007,7 +3064,8 @@ ast_for_expr_stmt(struct compiling *c, const node *n) } if (NCH(ann) == 2) { return AnnAssign(expr1, expr2, NULL, simple, - LINENO(n), n->n_col_offset, c->c_arena); + LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } else { ch = CHILD(ann, 3); @@ -3016,7 +3074,8 @@ ast_for_expr_stmt(struct compiling *c, const node *n) return NULL; } return AnnAssign(expr1, expr2, expr3, simple, - LINENO(n), n->n_col_offset, c->c_arena); + LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } } else { @@ -3054,7 +3113,8 @@ ast_for_expr_stmt(struct compiling *c, const node *n) expression = ast_for_expr(c, value); if (!expression) return NULL; - return Assign(targets, expression, LINENO(n), n->n_col_offset, c->c_arena); + return Assign(targets, expression, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } } @@ -3093,7 +3153,8 @@ ast_for_del_stmt(struct compiling *c, const node *n) expr_list = ast_for_exprlist(c, CHILD(n, 1), Del); if (!expr_list) return NULL; - return Delete(expr_list, LINENO(n), n->n_col_offset, c->c_arena); + return Delete(expr_list, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } static stmt_ty @@ -3115,27 +3176,33 @@ ast_for_flow_stmt(struct compiling *c, const node *n) ch = CHILD(n, 0); switch (TYPE(ch)) { case break_stmt: - return Break(LINENO(n), n->n_col_offset, c->c_arena); + return Break(LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); case continue_stmt: - return Continue(LINENO(n), n->n_col_offset, c->c_arena); + return Continue(LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); case yield_stmt: { /* will reduce to yield_expr */ expr_ty exp = ast_for_expr(c, CHILD(ch, 0)); if (!exp) return NULL; - return Expr(exp, LINENO(n), n->n_col_offset, c->c_arena); + return Expr(exp, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } case return_stmt: if (NCH(ch) == 1) - return Return(NULL, LINENO(n), n->n_col_offset, c->c_arena); + return Return(NULL, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); else { expr_ty expression = ast_for_testlist(c, CHILD(ch, 1)); if (!expression) return NULL; - return Return(expression, LINENO(n), n->n_col_offset, c->c_arena); + return Return(expression, LINENO(n), 
n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } case raise_stmt: if (NCH(ch) == 1) - return Raise(NULL, NULL, LINENO(n), n->n_col_offset, c->c_arena); + return Raise(NULL, NULL, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); else if (NCH(ch) >= 2) { expr_ty cause = NULL; expr_ty expression = ast_for_expr(c, CHILD(ch, 1)); @@ -3146,7 +3213,8 @@ ast_for_flow_stmt(struct compiling *c, const node *n) if (!cause) return NULL; } - return Raise(expression, cause, LINENO(n), n->n_col_offset, c->c_arena); + return Raise(expression, cause, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } /* fall through */ default: @@ -3307,11 +3375,14 @@ ast_for_import_stmt(struct compiling *c, const node *n) return NULL; asdl_seq_SET(aliases, i / 2, import_alias); } - return Import(aliases, lineno, col_offset, c->c_arena); + // Even though n is modified above, the end position is not changed + return Import(aliases, lineno, col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } else if (TYPE(n) == import_from) { int n_children; int idx, ndots = 0; + const node *n_copy = n; alias_ty mod = NULL; identifier modname = NULL; @@ -3382,6 +3453,7 @@ ast_for_import_stmt(struct compiling *c, const node *n) if (mod != NULL) modname = mod->name; return ImportFrom(modname, aliases, ndots, lineno, col_offset, + n_copy->n_end_lineno, n_copy->n_end_col_offset, c->c_arena); } PyErr_Format(PyExc_SystemError, @@ -3408,7 +3480,8 @@ ast_for_global_stmt(struct compiling *c, const node *n) return NULL; asdl_seq_SET(s, i / 2, name); } - return Global(s, LINENO(n), n->n_col_offset, c->c_arena); + return Global(s, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } static stmt_ty @@ -3429,7 +3502,8 @@ ast_for_nonlocal_stmt(struct compiling *c, const node *n) return NULL; asdl_seq_SET(s, i / 2, name); } - return Nonlocal(s, LINENO(n), n->n_col_offset, c->c_arena); + return Nonlocal(s, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } static stmt_ty @@ -3441,7 +3515,8 @@ ast_for_assert_stmt(struct compiling *c, const node *n) expr_ty expression = ast_for_expr(c, CHILD(n, 1)); if (!expression) return NULL; - return Assert(expression, NULL, LINENO(n), n->n_col_offset, c->c_arena); + return Assert(expression, NULL, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } else if (NCH(n) == 4) { expr_ty expr1, expr2; @@ -3453,7 +3528,8 @@ ast_for_assert_stmt(struct compiling *c, const node *n) if (!expr2) return NULL; - return Assert(expr1, expr2, LINENO(n), n->n_col_offset, c->c_arena); + return Assert(expr1, expr2, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } PyErr_Format(PyExc_SystemError, "improper number of parts to 'assert' statement: %d", @@ -3527,6 +3603,20 @@ ast_for_suite(struct compiling *c, const node *n) return seq; } +static void +get_last_end_pos(asdl_seq *s, int *end_lineno, int *end_col_offset) +{ + int tot = asdl_seq_LEN(s); + // Suite should not be empty, but it is safe to just ignore it + // if it will ever occur. 
+ if (!tot) { + return; + } + stmt_ty last = asdl_seq_GET(s, tot - 1); + *end_lineno = last->end_lineno; + *end_col_offset = last->end_col_offset; +} + static stmt_ty ast_for_if_stmt(struct compiling *c, const node *n) { @@ -3534,6 +3624,7 @@ ast_for_if_stmt(struct compiling *c, const node *n) ['else' ':' suite] */ char *s; + int end_lineno, end_col_offset; REQ(n, if_stmt); @@ -3547,9 +3638,10 @@ ast_for_if_stmt(struct compiling *c, const node *n) suite_seq = ast_for_suite(c, CHILD(n, 3)); if (!suite_seq) return NULL; + get_last_end_pos(suite_seq, &end_lineno, &end_col_offset); return If(expression, suite_seq, NULL, LINENO(n), n->n_col_offset, - c->c_arena); + end_lineno, end_col_offset, c->c_arena); } s = STR(CHILD(n, 4)); @@ -3570,9 +3662,10 @@ ast_for_if_stmt(struct compiling *c, const node *n) seq2 = ast_for_suite(c, CHILD(n, 6)); if (!seq2) return NULL; + get_last_end_pos(seq2, &end_lineno, &end_col_offset); return If(expression, seq1, seq2, LINENO(n), n->n_col_offset, - c->c_arena); + end_lineno, end_col_offset, c->c_arena); } else if (s[2] == 'i') { int i, n_elif, has_else = 0; @@ -3604,12 +3697,13 @@ ast_for_if_stmt(struct compiling *c, const node *n) suite_seq2 = ast_for_suite(c, CHILD(n, NCH(n) - 1)); if (!suite_seq2) return NULL; + get_last_end_pos(suite_seq2, &end_lineno, &end_col_offset); asdl_seq_SET(orelse, 0, If(expression, suite_seq, suite_seq2, LINENO(CHILD(n, NCH(n) - 6)), CHILD(n, NCH(n) - 6)->n_col_offset, - c->c_arena)); + end_lineno, end_col_offset, c->c_arena)); /* the just-created orelse handled the last elif */ n_elif--; } @@ -3626,10 +3720,16 @@ ast_for_if_stmt(struct compiling *c, const node *n) if (!suite_seq) return NULL; + if (orelse != NULL) { + get_last_end_pos(orelse, &end_lineno, &end_col_offset); + } else { + get_last_end_pos(suite_seq, &end_lineno, &end_col_offset); + } asdl_seq_SET(newobj, 0, If(expression, suite_seq, orelse, LINENO(CHILD(n, off)), - CHILD(n, off)->n_col_offset, c->c_arena)); + CHILD(n, off)->n_col_offset, + end_lineno, end_col_offset, c->c_arena)); orelse = newobj; } expression = ast_for_expr(c, CHILD(n, 1)); @@ -3638,8 +3738,10 @@ ast_for_if_stmt(struct compiling *c, const node *n) suite_seq = ast_for_suite(c, CHILD(n, 3)); if (!suite_seq) return NULL; + get_last_end_pos(orelse, &end_lineno, &end_col_offset); return If(expression, suite_seq, orelse, - LINENO(n), n->n_col_offset, c->c_arena); + LINENO(n), n->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } PyErr_Format(PyExc_SystemError, @@ -3652,6 +3754,7 @@ ast_for_while_stmt(struct compiling *c, const node *n) { /* while_stmt: 'while' test ':' suite ['else' ':' suite] */ REQ(n, while_stmt); + int end_lineno, end_col_offset; if (NCH(n) == 4) { expr_ty expression; @@ -3663,7 +3766,9 @@ ast_for_while_stmt(struct compiling *c, const node *n) suite_seq = ast_for_suite(c, CHILD(n, 3)); if (!suite_seq) return NULL; - return While(expression, suite_seq, NULL, LINENO(n), n->n_col_offset, c->c_arena); + get_last_end_pos(suite_seq, &end_lineno, &end_col_offset); + return While(expression, suite_seq, NULL, LINENO(n), n->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } else if (NCH(n) == 7) { expr_ty expression; @@ -3678,8 +3783,10 @@ ast_for_while_stmt(struct compiling *c, const node *n) seq2 = ast_for_suite(c, CHILD(n, 6)); if (!seq2) return NULL; + get_last_end_pos(seq2, &end_lineno, &end_col_offset); - return While(expression, seq1, seq2, LINENO(n), n->n_col_offset, c->c_arena); + return While(expression, seq1, seq2, LINENO(n), n->n_col_offset, + end_lineno, 
end_col_offset, c->c_arena); } PyErr_Format(PyExc_SystemError, @@ -3696,6 +3803,7 @@ ast_for_for_stmt(struct compiling *c, const node *n0, bool is_async) expr_ty expression; expr_ty target, first; const node *node_target; + int end_lineno, end_col_offset; /* for_stmt: 'for' exprlist 'in' testlist ':' suite ['else' ':' suite] */ REQ(n, for_stmt); @@ -3715,7 +3823,9 @@ ast_for_for_stmt(struct compiling *c, const node *n0, bool is_async) if (NCH(node_target) == 1) target = first; else - target = Tuple(_target, Store, first->lineno, first->col_offset, c->c_arena); + target = Tuple(_target, Store, first->lineno, first->col_offset, + node_target->n_end_lineno, node_target->n_end_col_offset, + c->c_arena); expression = ast_for_testlist(c, CHILD(n, 3)); if (!expression) @@ -3724,20 +3834,26 @@ ast_for_for_stmt(struct compiling *c, const node *n0, bool is_async) if (!suite_seq) return NULL; + if (seq != NULL) { + get_last_end_pos(seq, &end_lineno, &end_col_offset); + } else { + get_last_end_pos(suite_seq, &end_lineno, &end_col_offset); + } if (is_async) return AsyncFor(target, expression, suite_seq, seq, LINENO(n0), n0->n_col_offset, - c->c_arena); + end_lineno, end_col_offset, c->c_arena); else return For(target, expression, suite_seq, seq, LINENO(n), n->n_col_offset, - c->c_arena); + end_lineno, end_col_offset, c->c_arena); } static excepthandler_ty ast_for_except_clause(struct compiling *c, const node *exc, node *body) { /* except_clause: 'except' [test ['as' test]] */ + int end_lineno, end_col_offset; REQ(exc, except_clause); REQ(body, suite); @@ -3745,9 +3861,11 @@ ast_for_except_clause(struct compiling *c, const node *exc, node *body) asdl_seq *suite_seq = ast_for_suite(c, body); if (!suite_seq) return NULL; + get_last_end_pos(suite_seq, &end_lineno, &end_col_offset); return ExceptHandler(NULL, NULL, suite_seq, LINENO(exc), - exc->n_col_offset, c->c_arena); + exc->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } else if (NCH(exc) == 2) { expr_ty expression; @@ -3759,9 +3877,11 @@ ast_for_except_clause(struct compiling *c, const node *exc, node *body) suite_seq = ast_for_suite(c, body); if (!suite_seq) return NULL; + get_last_end_pos(suite_seq, &end_lineno, &end_col_offset); return ExceptHandler(expression, NULL, suite_seq, LINENO(exc), - exc->n_col_offset, c->c_arena); + exc->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } else if (NCH(exc) == 4) { asdl_seq *suite_seq; @@ -3777,9 +3897,11 @@ ast_for_except_clause(struct compiling *c, const node *exc, node *body) suite_seq = ast_for_suite(c, body); if (!suite_seq) return NULL; + get_last_end_pos(suite_seq, &end_lineno, &end_col_offset); return ExceptHandler(expression, e, suite_seq, LINENO(exc), - exc->n_col_offset, c->c_arena); + exc->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } PyErr_Format(PyExc_SystemError, @@ -3792,8 +3914,9 @@ static stmt_ty ast_for_try_stmt(struct compiling *c, const node *n) { const int nch = NCH(n); - int n_except = (nch - 3)/3; + int end_lineno, end_col_offset, n_except = (nch - 3)/3; asdl_seq *body, *handlers = NULL, *orelse = NULL, *finally = NULL; + excepthandler_ty last_handler; REQ(n, try_stmt); @@ -3849,7 +3972,20 @@ ast_for_try_stmt(struct compiling *c, const node *n) } assert(finally != NULL || asdl_seq_LEN(handlers)); - return Try(body, handlers, orelse, finally, LINENO(n), n->n_col_offset, c->c_arena); + if (finally != NULL) { + // finally is always last + get_last_end_pos(finally, &end_lineno, &end_col_offset); + } else if (orelse != NULL) { + // otherwise else is last + 
get_last_end_pos(orelse, &end_lineno, &end_col_offset); + } else { + // inline the get_last_end_pos logic due to layout mismatch + last_handler = (excepthandler_ty) asdl_seq_GET(handlers, n_except - 1); + end_lineno = last_handler->end_lineno; + end_col_offset = last_handler->end_col_offset; + } + return Try(body, handlers, orelse, finally, LINENO(n), n->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } /* with_item: test ['as' expr] */ @@ -3881,7 +4017,7 @@ static stmt_ty ast_for_with_stmt(struct compiling *c, const node *n0, bool is_async) { const node * const n = is_async ? CHILD(n0, 1) : n0; - int i, n_items; + int i, n_items, end_lineno, end_col_offset; asdl_seq *items, *body; REQ(n, with_stmt); @@ -3900,11 +4036,14 @@ ast_for_with_stmt(struct compiling *c, const node *n0, bool is_async) body = ast_for_suite(c, CHILD(n, NCH(n) - 1)); if (!body) return NULL; + get_last_end_pos(body, &end_lineno, &end_col_offset); if (is_async) - return AsyncWith(items, body, LINENO(n0), n0->n_col_offset, c->c_arena); + return AsyncWith(items, body, LINENO(n0), n0->n_col_offset, + end_lineno, end_col_offset, c->c_arena); else - return With(items, body, LINENO(n), n->n_col_offset, c->c_arena); + return With(items, body, LINENO(n), n->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } static stmt_ty @@ -3914,6 +4053,7 @@ ast_for_classdef(struct compiling *c, const node *n, asdl_seq *decorator_seq) PyObject *classname; asdl_seq *s; expr_ty call; + int end_lineno, end_col_offset; REQ(n, classdef); @@ -3921,26 +4061,32 @@ ast_for_classdef(struct compiling *c, const node *n, asdl_seq *decorator_seq) s = ast_for_suite(c, CHILD(n, 3)); if (!s) return NULL; + get_last_end_pos(s, &end_lineno, &end_col_offset); + classname = NEW_IDENTIFIER(CHILD(n, 1)); if (!classname) return NULL; if (forbidden_name(c, classname, CHILD(n, 3), 0)) return NULL; return ClassDef(classname, NULL, NULL, s, decorator_seq, - LINENO(n), n->n_col_offset, c->c_arena); + LINENO(n), n->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } if (TYPE(CHILD(n, 3)) == RPAR) { /* class NAME '(' ')' ':' suite */ s = ast_for_suite(c, CHILD(n, 5)); if (!s) return NULL; + get_last_end_pos(s, &end_lineno, &end_col_offset); + classname = NEW_IDENTIFIER(CHILD(n, 1)); if (!classname) return NULL; if (forbidden_name(c, classname, CHILD(n, 3), 0)) return NULL; return ClassDef(classname, NULL, NULL, s, decorator_seq, - LINENO(n), n->n_col_offset, c->c_arena); + LINENO(n), n->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } /* class NAME '(' arglist ')' ':' suite */ @@ -3951,14 +4097,18 @@ ast_for_classdef(struct compiling *c, const node *n, asdl_seq *decorator_seq) dummy_name = NEW_IDENTIFIER(CHILD(n, 1)); if (!dummy_name) return NULL; - dummy = Name(dummy_name, Load, LINENO(n), n->n_col_offset, c->c_arena); - call = ast_for_call(c, CHILD(n, 3), dummy, NULL); + dummy = Name(dummy_name, Load, LINENO(n), n->n_col_offset, + CHILD(n, 1)->n_end_lineno, CHILD(n, 1)->n_end_col_offset, + c->c_arena); + call = ast_for_call(c, CHILD(n, 3), dummy, NULL, CHILD(n, 4)); if (!call) return NULL; } s = ast_for_suite(c, CHILD(n, 6)); if (!s) return NULL; + get_last_end_pos(s, &end_lineno, &end_col_offset); + classname = NEW_IDENTIFIER(CHILD(n, 1)); if (!classname) return NULL; @@ -3966,7 +4116,8 @@ ast_for_classdef(struct compiling *c, const node *n, asdl_seq *decorator_seq) return NULL; return ClassDef(classname, call->v.Call.args, call->v.Call.keywords, s, - decorator_seq, LINENO(n), n->n_col_offset, c->c_arena); + decorator_seq, LINENO(n), 
n->n_col_offset, + end_lineno, end_col_offset, c->c_arena); } static stmt_ty @@ -3991,7 +4142,8 @@ ast_for_stmt(struct compiling *c, const node *n) case del_stmt: return ast_for_del_stmt(c, n); case pass_stmt: - return Pass(LINENO(n), n->n_col_offset, c->c_arena); + return Pass(LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); case flow_stmt: return ast_for_flow_stmt(c, n); case import_stmt: @@ -4248,6 +4400,7 @@ decode_bytes_with_escapes(struct compiling *c, const node *n, const char *s, static void fstring_shift_node_locations(node *n, int lineno, int col_offset) { n->n_col_offset = n->n_col_offset + col_offset; + n->n_end_col_offset = n->n_end_col_offset + col_offset; for (int i = 0; i < NCH(n); ++i) { if (n->n_lineno && n->n_lineno < CHILD(n, i)->n_lineno) { /* Shifting column offsets unnecessary if there's been newlines. */ @@ -4256,6 +4409,7 @@ static void fstring_shift_node_locations(node *n, int lineno, int col_offset) fstring_shift_node_locations(CHILD(n, i), lineno, col_offset); } n->n_lineno = n->n_lineno + lineno; + n->n_end_lineno = n->n_end_lineno + lineno; } /* Fix locations for the given node and its children. @@ -4672,6 +4826,7 @@ fstring_find_expr(const char **str, const char *end, int raw, int recurse_lvl, entire expression with the conversion and format spec. */ *expression = FormattedValue(simple_expression, conversion, format_spec, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); if (!*expression) return -1; @@ -4918,7 +5073,8 @@ make_str_node_and_del(PyObject **str, struct compiling *c, const node* n) Py_DECREF(s); return NULL; } - return Constant(s, LINENO(n), n->n_col_offset, c->c_arena); + return Constant(s, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } /* Add a non-f-string (that is, a regular literal string). str is @@ -5073,7 +5229,8 @@ FstringParser_Finish(FstringParser *state, struct compiling *c, if (!seq) goto error; - return JoinedStr(seq, LINENO(n), n->n_col_offset, c->c_arena); + return JoinedStr(seq, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); error: FstringParser_Dealloc(state); @@ -5283,7 +5440,8 @@ parsestrplus(struct compiling *c, const node *n) /* Just return the bytes object and we're done. */ if (PyArena_AddPyObject(c->c_arena, bytes_str) < 0) goto error; - return Constant(bytes_str, LINENO(n), n->n_col_offset, c->c_arena); + return Constant(bytes_str, LINENO(n), n->n_col_offset, + n->n_end_lineno, n->n_end_col_offset, c->c_arena); } /* We're not a bytes string, bytes_str should never have been set. 
*/ diff --git a/Python/ast_opt.c b/Python/ast_opt.c index 6f72a7f63bf9..96c766fc0957 100644 --- a/Python/ast_opt.c +++ b/Python/ast_opt.c @@ -439,7 +439,8 @@ astfold_body(asdl_seq *stmts, PyArena *ctx_, int optimize_) return 0; } asdl_seq_SET(values, 0, st->v.Expr.value); - expr_ty expr = JoinedStr(values, st->lineno, st->col_offset, ctx_); + expr_ty expr = JoinedStr(values, st->lineno, st->col_offset, + st->end_lineno, st->end_col_offset, ctx_); if (!expr) { return 0; } diff --git a/Python/compile.c b/Python/compile.c index 5aebda0da4d1..9713bfc9e9b7 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -4757,7 +4757,8 @@ compiler_augassign(struct compiler *c, stmt_ty s) switch (e->kind) { case Attribute_kind: auge = Attribute(e->v.Attribute.value, e->v.Attribute.attr, - AugLoad, e->lineno, e->col_offset, c->c_arena); + AugLoad, e->lineno, e->col_offset, + e->end_lineno, e->end_col_offset, c->c_arena); if (auge == NULL) return 0; VISIT(c, expr, auge); @@ -4768,7 +4769,8 @@ compiler_augassign(struct compiler *c, stmt_ty s) break; case Subscript_kind: auge = Subscript(e->v.Subscript.value, e->v.Subscript.slice, - AugLoad, e->lineno, e->col_offset, c->c_arena); + AugLoad, e->lineno, e->col_offset, + e->end_lineno, e->end_col_offset, c->c_arena); if (auge == NULL) return 0; VISIT(c, expr, auge); From webhook-mailer at python.org Tue Jan 22 11:15:06 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 22 Jan 2019 16:15:06 -0000 Subject: [Python-checkins] bpo-35720: Fixing a memory leak in pymain_parse_cmdline_impl() (GH-11528) Message-ID: https://github.com/python/cpython/commit/35ca1820e19f81f69073f294503cdcd708fe490f commit: 35ca1820e19f81f69073f294503cdcd708fe490f branch: master author: Lucas Cimon committer: Victor Stinner date: 2019-01-22T17:15:01+01:00 summary: bpo-35720: Fixing a memory leak in pymain_parse_cmdline_impl() (GH-11528) When the loop in the pymain_read_conf function in this same file calls pymain_init_cmdline_argv() a 2nd time, the pymain->command buffer of wchar_t is overriden and the previously allocated memory is never freed. 
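The leak described above follows a common C pattern: a loop re-runs an initialization step that allocates into a struct field, overwriting the previously stored pointer without freeing it. Below is a minimal standalone sketch of that pattern and of the fix (clearing the previously owned buffer before the next pass); it is illustrative only and does not reproduce the actual Modules/main.c code, so all names here are hypothetical.

    #include <stdlib.h>
    #include <wchar.h>

    struct state {
        wchar_t *command;   /* heap-allocated buffer, owned by the struct */
    };

    /* Hypothetical per-pass init: allocates a fresh copy of src.
       BUG pattern: if st->command was already set by an earlier pass,
       the old buffer is overwritten here and never freed. */
    static int init_command(struct state *st, const wchar_t *src)
    {
        size_t n = wcslen(src) + 1;
        st->command = malloc(n * sizeof(wchar_t));
        if (st->command == NULL)
            return -1;
        wmemcpy(st->command, src, n);
        return 0;
    }

    /* Fix pattern: release what the struct already owns before the
       initialization runs again, analogous to clearing the state
       between configuration passes in the commit above. */
    static void clear_command(struct state *st)
    {
        free(st->command);
        st->command = NULL;
    }

    int main(void)
    {
        struct state st = { NULL };
        for (int pass = 0; pass < 2; pass++) {
            clear_command(&st);              /* without this call, pass 2 leaks */
            if (init_command(&st, L"-c pass") < 0)
                return 1;
        }
        clear_command(&st);
        return 0;
    }
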
files: A Misc/NEWS.d/next/Core and Builtins/2019-01-12-23-33-04.bpo-35720.LELKQx.rst M Modules/main.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2019-01-12-23-33-04.bpo-35720.LELKQx.rst b/Misc/NEWS.d/next/Core and Builtins/2019-01-12-23-33-04.bpo-35720.LELKQx.rst new file mode 100644 index 000000000000..9c57ebcb625e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2019-01-12-23-33-04.bpo-35720.LELKQx.rst @@ -0,0 +1 @@ +Fixed a minor memory leak in pymain_parse_cmdline_impl function in Modules/main.c \ No newline at end of file diff --git a/Modules/main.c b/Modules/main.c index 8e66ddded419..da79a6397b38 100644 --- a/Modules/main.c +++ b/Modules/main.c @@ -1376,6 +1376,7 @@ pymain_read_conf(_PyMain *pymain, _PyCoreConfig *config, goto done; } pymain_clear_cmdline(pymain, cmdline); + pymain_clear_pymain(pymain); memset(cmdline, 0, sizeof(*cmdline)); config->utf8_mode = new_utf8_mode; config->coerce_c_locale = new_coerce_c_locale; From webhook-mailer at python.org Tue Jan 22 11:39:07 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 22 Jan 2019 16:39:07 -0000 Subject: [Python-checkins] bpo-35713: Rework Python initialization (GH-11647) Message-ID: https://github.com/python/cpython/commit/bf4ac2d2fd520c61306b2676db488adab9b5d8c5 commit: bf4ac2d2fd520c61306b2676db488adab9b5d8c5 branch: master author: Victor Stinner committer: GitHub date: 2019-01-22T17:39:03+01:00 summary: bpo-35713: Rework Python initialization (GH-11647) * The PyByteArray_Init() and PyByteArray_Fini() functions have been removed. They did nothing since Python 2.7.4 and Python 3.2.0, were excluded from the limited API (stable ABI), and were not documented. * Move "_PyXXX_Init()" and "_PyXXX_Fini()" declarations from Include/cpython/pylifecycle.h to Include/internal/pycore_pylifecycle.h. Replace "PyAPI_FUNC(TYPE)" with "extern TYPE". * _PyExc_Init() now returns an error on failure rather than calling Py_FatalError(). Move macros inside _PyExc_Init() and undefine them when done. Rewrite macros to make them look more like statement: add ";" when using them, add "do { ... } while (0)". * _PyUnicode_Init() now returns a _PyInitError error rather than call Py_FatalError(). * Move stdin check from _PySys_BeginInit() to init_sys_streams(). * _Py_ReadyTypes() now returns a _PyInitError error rather than calling Py_FatalError(). files: A Misc/NEWS.d/next/C API/2019-01-22-17-04-10.bpo-35713.fmehdG.rst M Doc/whatsnew/3.8.rst M Include/cpython/pylifecycle.h M Include/internal/pycore_pylifecycle.h M Objects/bytearrayobject.c M Objects/exceptions.c M Objects/unicodeobject.c M Python/pylifecycle.c M Python/sysmodule.c diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index 0360be604f56..8b38cce35368 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -328,6 +328,10 @@ Optimizations Build and C API Changes ======================= +* The :c:func:`PyByteArray_Init` and :c:func:`PyByteArray_Fini` functions have + been removed. They did nothing since Python 2.7.4 and Python 3.2.0, were + excluded from the limited API (stable ABI), and were not documented. + * The result of :c:func:`PyExceptionClass_Name` is now of type ``const char *`` rather of ``char *``. (Contributed by Serhiy Storchaka in :issue:`33818`.) 
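The bpo-35713 summary above notes that the _PyExc_Init() helper macros were rewritten as do { ... } while (0) blocks so that each use reads like an ordinary statement terminated by ";". A minimal sketch of why that idiom matters is shown here; the macros are illustrative only, not the CPython ones.

    #include <stdio.h>

    /* Naive multi-statement macro: it expands to two statements, so an
       if without braces only guards the first one. */
    #define REGISTER_BAD(name) \
        printf("registering %s\n", name); \
        printf("registered %s\n", name)

    /* do { ... } while (0) turns the expansion into a single statement
       and requires the trailing ';' at the call site. */
    #define REGISTER_GOOD(name) \
        do { \
            printf("registering %s\n", name); \
            printf("registered %s\n", name); \
        } while (0)

    int main(void)
    {
        int enabled = 0;

        if (enabled)
            REGISTER_BAD("TypeError");   /* second printf runs unconditionally */

        if (enabled)
            REGISTER_GOOD("TypeError");  /* whole block is correctly skipped */

        return 0;
    }
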
diff --git a/Include/cpython/pylifecycle.h b/Include/cpython/pylifecycle.h index 3009c4f10d3a..a3fdeefde01b 100644 --- a/Include/cpython/pylifecycle.h +++ b/Include/cpython/pylifecycle.h @@ -56,33 +56,6 @@ PyAPI_FUNC(void) _Py_SetProgramFullPath(const wchar_t *); PyAPI_FUNC(const char *) _Py_gitidentifier(void); PyAPI_FUNC(const char *) _Py_gitversion(void); -/* Internal -- various one-time initializations */ -PyAPI_FUNC(PyObject *) _PyBuiltin_Init(void); -PyAPI_FUNC(_PyInitError) _PySys_BeginInit(PyObject **sysmod); -PyAPI_FUNC(int) _PySys_EndInit(PyObject *sysdict, PyInterpreterState *interp); -PyAPI_FUNC(_PyInitError) _PyImport_Init(PyInterpreterState *interp); -PyAPI_FUNC(void) _PyExc_Init(PyObject * bltinmod); -PyAPI_FUNC(_PyInitError) _PyImportHooks_Init(void); -PyAPI_FUNC(int) _PyFloat_Init(void); -PyAPI_FUNC(int) PyByteArray_Init(void); -PyAPI_FUNC(_PyInitError) _Py_HashRandomization_Init(const _PyCoreConfig *); - -/* Various internal finalizers */ - -PyAPI_FUNC(void) PyMethod_Fini(void); -PyAPI_FUNC(void) PyFrame_Fini(void); -PyAPI_FUNC(void) PyCFunction_Fini(void); -PyAPI_FUNC(void) PyDict_Fini(void); -PyAPI_FUNC(void) PyTuple_Fini(void); -PyAPI_FUNC(void) PyList_Fini(void); -PyAPI_FUNC(void) PySet_Fini(void); -PyAPI_FUNC(void) PyBytes_Fini(void); -PyAPI_FUNC(void) PyByteArray_Fini(void); -PyAPI_FUNC(void) PyFloat_Fini(void); -PyAPI_FUNC(void) PyOS_FiniInterrupts(void); -PyAPI_FUNC(void) PySlice_Fini(void); -PyAPI_FUNC(void) PyAsyncGen_Fini(void); - PyAPI_FUNC(int) _Py_IsFinalizing(void); /* Random */ diff --git a/Include/internal/pycore_pylifecycle.h b/Include/internal/pycore_pylifecycle.h index e10431690cbd..de70199dae57 100644 --- a/Include/internal/pycore_pylifecycle.h +++ b/Include/internal/pycore_pylifecycle.h @@ -19,20 +19,45 @@ PyAPI_FUNC(void) _Py_ClearStandardStreamEncoding(void); PyAPI_FUNC(int) _Py_IsLocaleCoercionTarget(const char *ctype_loc); -extern int _PyUnicode_Init(void); +/* Various one-time initializers */ + +extern _PyInitError _PyUnicode_Init(void); extern int _PyStructSequence_Init(void); extern int _PyLong_Init(void); extern _PyInitError _PyFaulthandler_Init(int enable); extern int _PyTraceMalloc_Init(int enable); +extern PyObject * _PyBuiltin_Init(void); +extern _PyInitError _PySys_BeginInit(PyObject **sysmod); +extern int _PySys_EndInit(PyObject *sysdict, PyInterpreterState *interp); +extern _PyInitError _PyImport_Init(PyInterpreterState *interp); +extern _PyInitError _PyExc_Init(PyObject * bltinmod); +extern _PyInitError _PyImportHooks_Init(void); +extern int _PyFloat_Init(void); +extern _PyInitError _Py_HashRandomization_Init(const _PyCoreConfig *); extern void _Py_ReadyTypes(void); -PyAPI_FUNC(void) _PyExc_Fini(void); -PyAPI_FUNC(void) _PyImport_Fini(void); -PyAPI_FUNC(void) _PyImport_Fini2(void); -PyAPI_FUNC(void) _PyGC_Fini(void); -PyAPI_FUNC(void) _PyType_Fini(void); -PyAPI_FUNC(void) _Py_HashRandomization_Fini(void); +/* Various internal finalizers */ + +extern void PyMethod_Fini(void); +extern void PyFrame_Fini(void); +extern void PyCFunction_Fini(void); +extern void PyDict_Fini(void); +extern void PyTuple_Fini(void); +extern void PyList_Fini(void); +extern void PySet_Fini(void); +extern void PyBytes_Fini(void); +extern void PyFloat_Fini(void); +extern void PyOS_FiniInterrupts(void); +extern void PySlice_Fini(void); +extern void PyAsyncGen_Fini(void); + +extern void _PyExc_Fini(void); +extern void _PyImport_Fini(void); +extern void _PyImport_Fini2(void); +extern void _PyGC_Fini(void); +extern void _PyType_Fini(void); +extern void 
_Py_HashRandomization_Fini(void); extern void _PyUnicode_Fini(void); extern void PyLong_Fini(void); extern void _PyFaulthandler_Fini(void); diff --git a/Misc/NEWS.d/next/C API/2019-01-22-17-04-10.bpo-35713.fmehdG.rst b/Misc/NEWS.d/next/C API/2019-01-22-17-04-10.bpo-35713.fmehdG.rst new file mode 100644 index 000000000000..f95ceca47fdf --- /dev/null +++ b/Misc/NEWS.d/next/C API/2019-01-22-17-04-10.bpo-35713.fmehdG.rst @@ -0,0 +1,3 @@ +The :c:func:`PyByteArray_Init` and :c:func:`PyByteArray_Fini` functions have +been removed. They did nothing since Python 2.7.4 and Python 3.2.0, were +excluded from the limited API (stable ABI), and were not documented. diff --git a/Objects/bytearrayobject.c b/Objects/bytearrayobject.c index 3926095a1812..667213619344 100644 --- a/Objects/bytearrayobject.c +++ b/Objects/bytearrayobject.c @@ -17,17 +17,6 @@ class bytearray "PyByteArrayObject *" "&PyByteArray_Type" char _PyByteArray_empty_string[] = ""; -void -PyByteArray_Fini(void) -{ -} - -int -PyByteArray_Init(void) -{ - return 1; -} - /* end nullbytes support */ /* Helpers */ diff --git a/Objects/exceptions.c b/Objects/exceptions.c index 002a602373d7..8d81566c7f15 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -2299,7 +2299,7 @@ MemoryError_dealloc(PyBaseExceptionObject *self) } } -static void +static int preallocate_memerrors(void) { /* We create enough MemoryErrors and then decref them, which will fill @@ -2309,12 +2309,14 @@ preallocate_memerrors(void) for (i = 0; i < MEMERRORS_SAVE; i++) { errors[i] = MemoryError_new((PyTypeObject *) PyExc_MemoryError, NULL, NULL); - if (!errors[i]) - Py_FatalError("Could not preallocate MemoryError object"); + if (!errors[i]) { + return -1; + } } for (i = 0; i < MEMERRORS_SAVE; i++) { Py_DECREF(errors[i]); } + return 0; } static void @@ -2433,31 +2435,6 @@ SimpleExtendsException(PyExc_Warning, ResourceWarning, -#define PRE_INIT(TYPE) \ - if (!(_PyExc_ ## TYPE.tp_flags & Py_TPFLAGS_READY)) { \ - if (PyType_Ready(&_PyExc_ ## TYPE) < 0) \ - Py_FatalError("exceptions bootstrapping error."); \ - Py_INCREF(PyExc_ ## TYPE); \ - } - -#define POST_INIT(TYPE) \ - if (PyDict_SetItemString(bdict, # TYPE, PyExc_ ## TYPE)) \ - Py_FatalError("Module dictionary insertion problem."); - -#define INIT_ALIAS(NAME, TYPE) Py_INCREF(PyExc_ ## TYPE); \ - Py_XDECREF(PyExc_ ## NAME); \ - PyExc_ ## NAME = PyExc_ ## TYPE; \ - if (PyDict_SetItemString(bdict, # NAME, PyExc_ ## NAME)) \ - Py_FatalError("Module dictionary insertion problem."); - -#define ADD_ERRNO(TYPE, CODE) { \ - PyObject *_code = PyLong_FromLong(CODE); \ - assert(_PyObject_RealIsSubclass(PyExc_ ## TYPE, PyExc_OSError)); \ - if (!_code || PyDict_SetItem(errnomap, _code, PyExc_ ## TYPE)) \ - Py_FatalError("errmap insertion problem."); \ - Py_DECREF(_code); \ - } - #ifdef MS_WINDOWS #include /* The following constants were added to errno.h in VS2010 but have @@ -2514,184 +2491,226 @@ SimpleExtendsException(PyExc_Warning, ResourceWarning, #endif #endif /* MS_WINDOWS */ -void +_PyInitError _PyExc_Init(PyObject *bltinmod) { +#define PRE_INIT(TYPE) \ + if (!(_PyExc_ ## TYPE.tp_flags & Py_TPFLAGS_READY)) { \ + if (PyType_Ready(&_PyExc_ ## TYPE) < 0) { \ + return _Py_INIT_ERR("exceptions bootstrapping error."); \ + } \ + Py_INCREF(PyExc_ ## TYPE); \ + } + +#define POST_INIT(TYPE) \ + if (PyDict_SetItemString(bdict, # TYPE, PyExc_ ## TYPE)) { \ + return _Py_INIT_ERR("Module dictionary insertion problem."); \ + } + +#define INIT_ALIAS(NAME, TYPE) \ + do { \ + Py_INCREF(PyExc_ ## TYPE); \ + Py_XDECREF(PyExc_ ## NAME); \ + 
PyExc_ ## NAME = PyExc_ ## TYPE; \ + if (PyDict_SetItemString(bdict, # NAME, PyExc_ ## NAME)) { \ + return _Py_INIT_ERR("Module dictionary insertion problem."); \ + } \ + } while (0) + +#define ADD_ERRNO(TYPE, CODE) \ + do { \ + PyObject *_code = PyLong_FromLong(CODE); \ + assert(_PyObject_RealIsSubclass(PyExc_ ## TYPE, PyExc_OSError)); \ + if (!_code || PyDict_SetItem(errnomap, _code, PyExc_ ## TYPE)) \ + return _Py_INIT_ERR("errmap insertion problem."); \ + Py_DECREF(_code); \ + } while (0) + PyObject *bdict; - PRE_INIT(BaseException) - PRE_INIT(Exception) - PRE_INIT(TypeError) - PRE_INIT(StopAsyncIteration) - PRE_INIT(StopIteration) - PRE_INIT(GeneratorExit) - PRE_INIT(SystemExit) - PRE_INIT(KeyboardInterrupt) - PRE_INIT(ImportError) - PRE_INIT(ModuleNotFoundError) - PRE_INIT(OSError) - PRE_INIT(EOFError) - PRE_INIT(RuntimeError) - PRE_INIT(RecursionError) - PRE_INIT(NotImplementedError) - PRE_INIT(NameError) - PRE_INIT(UnboundLocalError) - PRE_INIT(AttributeError) - PRE_INIT(SyntaxError) - PRE_INIT(IndentationError) - PRE_INIT(TabError) - PRE_INIT(LookupError) - PRE_INIT(IndexError) - PRE_INIT(KeyError) - PRE_INIT(ValueError) - PRE_INIT(UnicodeError) - PRE_INIT(UnicodeEncodeError) - PRE_INIT(UnicodeDecodeError) - PRE_INIT(UnicodeTranslateError) - PRE_INIT(AssertionError) - PRE_INIT(ArithmeticError) - PRE_INIT(FloatingPointError) - PRE_INIT(OverflowError) - PRE_INIT(ZeroDivisionError) - PRE_INIT(SystemError) - PRE_INIT(ReferenceError) - PRE_INIT(MemoryError) - PRE_INIT(BufferError) - PRE_INIT(Warning) - PRE_INIT(UserWarning) - PRE_INIT(DeprecationWarning) - PRE_INIT(PendingDeprecationWarning) - PRE_INIT(SyntaxWarning) - PRE_INIT(RuntimeWarning) - PRE_INIT(FutureWarning) - PRE_INIT(ImportWarning) - PRE_INIT(UnicodeWarning) - PRE_INIT(BytesWarning) - PRE_INIT(ResourceWarning) + PRE_INIT(BaseException); + PRE_INIT(Exception); + PRE_INIT(TypeError); + PRE_INIT(StopAsyncIteration); + PRE_INIT(StopIteration); + PRE_INIT(GeneratorExit); + PRE_INIT(SystemExit); + PRE_INIT(KeyboardInterrupt); + PRE_INIT(ImportError); + PRE_INIT(ModuleNotFoundError); + PRE_INIT(OSError); + PRE_INIT(EOFError); + PRE_INIT(RuntimeError); + PRE_INIT(RecursionError); + PRE_INIT(NotImplementedError); + PRE_INIT(NameError); + PRE_INIT(UnboundLocalError); + PRE_INIT(AttributeError); + PRE_INIT(SyntaxError); + PRE_INIT(IndentationError); + PRE_INIT(TabError); + PRE_INIT(LookupError); + PRE_INIT(IndexError); + PRE_INIT(KeyError); + PRE_INIT(ValueError); + PRE_INIT(UnicodeError); + PRE_INIT(UnicodeEncodeError); + PRE_INIT(UnicodeDecodeError); + PRE_INIT(UnicodeTranslateError); + PRE_INIT(AssertionError); + PRE_INIT(ArithmeticError); + PRE_INIT(FloatingPointError); + PRE_INIT(OverflowError); + PRE_INIT(ZeroDivisionError); + PRE_INIT(SystemError); + PRE_INIT(ReferenceError); + PRE_INIT(MemoryError); + PRE_INIT(BufferError); + PRE_INIT(Warning); + PRE_INIT(UserWarning); + PRE_INIT(DeprecationWarning); + PRE_INIT(PendingDeprecationWarning); + PRE_INIT(SyntaxWarning); + PRE_INIT(RuntimeWarning); + PRE_INIT(FutureWarning); + PRE_INIT(ImportWarning); + PRE_INIT(UnicodeWarning); + PRE_INIT(BytesWarning); + PRE_INIT(ResourceWarning); /* OSError subclasses */ - PRE_INIT(ConnectionError) - - PRE_INIT(BlockingIOError) - PRE_INIT(BrokenPipeError) - PRE_INIT(ChildProcessError) - PRE_INIT(ConnectionAbortedError) - PRE_INIT(ConnectionRefusedError) - PRE_INIT(ConnectionResetError) - PRE_INIT(FileExistsError) - PRE_INIT(FileNotFoundError) - PRE_INIT(IsADirectoryError) - PRE_INIT(NotADirectoryError) - PRE_INIT(InterruptedError) - 
PRE_INIT(PermissionError) - PRE_INIT(ProcessLookupError) - PRE_INIT(TimeoutError) + PRE_INIT(ConnectionError); + + PRE_INIT(BlockingIOError); + PRE_INIT(BrokenPipeError); + PRE_INIT(ChildProcessError); + PRE_INIT(ConnectionAbortedError); + PRE_INIT(ConnectionRefusedError); + PRE_INIT(ConnectionResetError); + PRE_INIT(FileExistsError); + PRE_INIT(FileNotFoundError); + PRE_INIT(IsADirectoryError); + PRE_INIT(NotADirectoryError); + PRE_INIT(InterruptedError); + PRE_INIT(PermissionError); + PRE_INIT(ProcessLookupError); + PRE_INIT(TimeoutError); bdict = PyModule_GetDict(bltinmod); - if (bdict == NULL) - Py_FatalError("exceptions bootstrapping error."); - - POST_INIT(BaseException) - POST_INIT(Exception) - POST_INIT(TypeError) - POST_INIT(StopAsyncIteration) - POST_INIT(StopIteration) - POST_INIT(GeneratorExit) - POST_INIT(SystemExit) - POST_INIT(KeyboardInterrupt) - POST_INIT(ImportError) - POST_INIT(ModuleNotFoundError) - POST_INIT(OSError) - INIT_ALIAS(EnvironmentError, OSError) - INIT_ALIAS(IOError, OSError) + if (bdict == NULL) { + return _Py_INIT_ERR("exceptions bootstrapping error."); + } + + POST_INIT(BaseException); + POST_INIT(Exception); + POST_INIT(TypeError); + POST_INIT(StopAsyncIteration); + POST_INIT(StopIteration); + POST_INIT(GeneratorExit); + POST_INIT(SystemExit); + POST_INIT(KeyboardInterrupt); + POST_INIT(ImportError); + POST_INIT(ModuleNotFoundError); + POST_INIT(OSError); + INIT_ALIAS(EnvironmentError, OSError); + INIT_ALIAS(IOError, OSError); #ifdef MS_WINDOWS - INIT_ALIAS(WindowsError, OSError) + INIT_ALIAS(WindowsError, OSError); #endif - POST_INIT(EOFError) - POST_INIT(RuntimeError) - POST_INIT(RecursionError) - POST_INIT(NotImplementedError) - POST_INIT(NameError) - POST_INIT(UnboundLocalError) - POST_INIT(AttributeError) - POST_INIT(SyntaxError) - POST_INIT(IndentationError) - POST_INIT(TabError) - POST_INIT(LookupError) - POST_INIT(IndexError) - POST_INIT(KeyError) - POST_INIT(ValueError) - POST_INIT(UnicodeError) - POST_INIT(UnicodeEncodeError) - POST_INIT(UnicodeDecodeError) - POST_INIT(UnicodeTranslateError) - POST_INIT(AssertionError) - POST_INIT(ArithmeticError) - POST_INIT(FloatingPointError) - POST_INIT(OverflowError) - POST_INIT(ZeroDivisionError) - POST_INIT(SystemError) - POST_INIT(ReferenceError) - POST_INIT(MemoryError) - POST_INIT(BufferError) - POST_INIT(Warning) - POST_INIT(UserWarning) - POST_INIT(DeprecationWarning) - POST_INIT(PendingDeprecationWarning) - POST_INIT(SyntaxWarning) - POST_INIT(RuntimeWarning) - POST_INIT(FutureWarning) - POST_INIT(ImportWarning) - POST_INIT(UnicodeWarning) - POST_INIT(BytesWarning) - POST_INIT(ResourceWarning) + POST_INIT(EOFError); + POST_INIT(RuntimeError); + POST_INIT(RecursionError); + POST_INIT(NotImplementedError); + POST_INIT(NameError); + POST_INIT(UnboundLocalError); + POST_INIT(AttributeError); + POST_INIT(SyntaxError); + POST_INIT(IndentationError); + POST_INIT(TabError); + POST_INIT(LookupError); + POST_INIT(IndexError); + POST_INIT(KeyError); + POST_INIT(ValueError); + POST_INIT(UnicodeError); + POST_INIT(UnicodeEncodeError); + POST_INIT(UnicodeDecodeError); + POST_INIT(UnicodeTranslateError); + POST_INIT(AssertionError); + POST_INIT(ArithmeticError); + POST_INIT(FloatingPointError); + POST_INIT(OverflowError); + POST_INIT(ZeroDivisionError); + POST_INIT(SystemError); + POST_INIT(ReferenceError); + POST_INIT(MemoryError); + POST_INIT(BufferError); + POST_INIT(Warning); + POST_INIT(UserWarning); + POST_INIT(DeprecationWarning); + POST_INIT(PendingDeprecationWarning); + POST_INIT(SyntaxWarning); + 
POST_INIT(RuntimeWarning); + POST_INIT(FutureWarning); + POST_INIT(ImportWarning); + POST_INIT(UnicodeWarning); + POST_INIT(BytesWarning); + POST_INIT(ResourceWarning); if (!errnomap) { errnomap = PyDict_New(); - if (!errnomap) - Py_FatalError("Cannot allocate map from errnos to OSError subclasses"); + if (!errnomap) { + return _Py_INIT_ERR("Cannot allocate map from errnos to OSError subclasses"); + } } /* OSError subclasses */ - POST_INIT(ConnectionError) - - POST_INIT(BlockingIOError) - ADD_ERRNO(BlockingIOError, EAGAIN) - ADD_ERRNO(BlockingIOError, EALREADY) - ADD_ERRNO(BlockingIOError, EINPROGRESS) - ADD_ERRNO(BlockingIOError, EWOULDBLOCK) - POST_INIT(BrokenPipeError) - ADD_ERRNO(BrokenPipeError, EPIPE) + POST_INIT(ConnectionError); + + POST_INIT(BlockingIOError); + ADD_ERRNO(BlockingIOError, EAGAIN); + ADD_ERRNO(BlockingIOError, EALREADY); + ADD_ERRNO(BlockingIOError, EINPROGRESS); + ADD_ERRNO(BlockingIOError, EWOULDBLOCK); + POST_INIT(BrokenPipeError); + ADD_ERRNO(BrokenPipeError, EPIPE); #ifdef ESHUTDOWN - ADD_ERRNO(BrokenPipeError, ESHUTDOWN) + ADD_ERRNO(BrokenPipeError, ESHUTDOWN); #endif - POST_INIT(ChildProcessError) - ADD_ERRNO(ChildProcessError, ECHILD) - POST_INIT(ConnectionAbortedError) - ADD_ERRNO(ConnectionAbortedError, ECONNABORTED) - POST_INIT(ConnectionRefusedError) - ADD_ERRNO(ConnectionRefusedError, ECONNREFUSED) - POST_INIT(ConnectionResetError) - ADD_ERRNO(ConnectionResetError, ECONNRESET) - POST_INIT(FileExistsError) - ADD_ERRNO(FileExistsError, EEXIST) - POST_INIT(FileNotFoundError) - ADD_ERRNO(FileNotFoundError, ENOENT) - POST_INIT(IsADirectoryError) - ADD_ERRNO(IsADirectoryError, EISDIR) - POST_INIT(NotADirectoryError) - ADD_ERRNO(NotADirectoryError, ENOTDIR) - POST_INIT(InterruptedError) - ADD_ERRNO(InterruptedError, EINTR) - POST_INIT(PermissionError) - ADD_ERRNO(PermissionError, EACCES) - ADD_ERRNO(PermissionError, EPERM) - POST_INIT(ProcessLookupError) - ADD_ERRNO(ProcessLookupError, ESRCH) - POST_INIT(TimeoutError) - ADD_ERRNO(TimeoutError, ETIMEDOUT) - - preallocate_memerrors(); + POST_INIT(ChildProcessError); + ADD_ERRNO(ChildProcessError, ECHILD); + POST_INIT(ConnectionAbortedError); + ADD_ERRNO(ConnectionAbortedError, ECONNABORTED); + POST_INIT(ConnectionRefusedError); + ADD_ERRNO(ConnectionRefusedError, ECONNREFUSED); + POST_INIT(ConnectionResetError); + ADD_ERRNO(ConnectionResetError, ECONNRESET); + POST_INIT(FileExistsError); + ADD_ERRNO(FileExistsError, EEXIST); + POST_INIT(FileNotFoundError); + ADD_ERRNO(FileNotFoundError, ENOENT); + POST_INIT(IsADirectoryError); + ADD_ERRNO(IsADirectoryError, EISDIR); + POST_INIT(NotADirectoryError); + ADD_ERRNO(NotADirectoryError, ENOTDIR); + POST_INIT(InterruptedError); + ADD_ERRNO(InterruptedError, EINTR); + POST_INIT(PermissionError); + ADD_ERRNO(PermissionError, EACCES); + ADD_ERRNO(PermissionError, EPERM); + POST_INIT(ProcessLookupError); + ADD_ERRNO(ProcessLookupError, ESRCH); + POST_INIT(TimeoutError); + ADD_ERRNO(TimeoutError, ETIMEDOUT); + + if (preallocate_memerrors() < 0) { + return _Py_INIT_ERR("Could not preallocate MemoryError object"); + } + return _Py_INIT_OK(); + +#undef PRE_INIT +#undef POST_INIT +#undef INIT_ALIAS +#undef ADD_ERRNO } void diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index f1d23b66fa14..ea7bcabfc64f 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -15199,7 +15199,8 @@ PyTypeObject PyUnicode_Type = { /* Initialize the Unicode implementation */ -int _PyUnicode_Init(void) +_PyInitError +_PyUnicode_Init(void) { /* XXX - move this array to 
unicodectype.c ? */ Py_UCS2 linebreak[] = { @@ -15215,28 +15216,31 @@ int _PyUnicode_Init(void) /* Init the implementation */ _Py_INCREF_UNICODE_EMPTY(); - if (!unicode_empty) - Py_FatalError("Can't create empty string"); + if (!unicode_empty) { + return _Py_INIT_ERR("Can't create empty string"); + } Py_DECREF(unicode_empty); - if (PyType_Ready(&PyUnicode_Type) < 0) - Py_FatalError("Can't initialize 'unicode'"); + if (PyType_Ready(&PyUnicode_Type) < 0) { + return _Py_INIT_ERR("Can't initialize unicode type"); + } /* initialize the linebreak bloom filter */ bloom_linebreak = make_bloom_mask( PyUnicode_2BYTE_KIND, linebreak, Py_ARRAY_LENGTH(linebreak)); - if (PyType_Ready(&EncodingMapType) < 0) - Py_FatalError("Can't initialize encoding map type"); - - if (PyType_Ready(&PyFieldNameIter_Type) < 0) - Py_FatalError("Can't initialize field name iterator type"); - - if (PyType_Ready(&PyFormatterIter_Type) < 0) - Py_FatalError("Can't initialize formatter iter type"); + if (PyType_Ready(&EncodingMapType) < 0) { + return _Py_INIT_ERR("Can't initialize encoding map type"); + } + if (PyType_Ready(&PyFieldNameIter_Type) < 0) { + return _Py_INIT_ERR("Can't initialize field name iterator type"); + } + if (PyType_Ready(&PyFormatterIter_Type) < 0) { + return _Py_INIT_ERR("Can't initialize formatter iter type"); + } - return 0; + return _Py_INIT_OK(); } /* Finalize the Unicode implementation */ diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 37ecc510c8fb..f0e00ea48d48 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -608,9 +608,6 @@ _Py_InitializeCore_impl(PyInterpreterState **interp_p, if (!_PyLong_Init()) return _Py_INIT_ERR("can't init longs"); - if (!PyByteArray_Init()) - return _Py_INIT_ERR("can't init bytearray"); - if (!_PyFloat_Init()) return _Py_INIT_ERR("can't init float"); @@ -634,9 +631,10 @@ _Py_InitializeCore_impl(PyInterpreterState **interp_p, PyDict_SetItemString(interp->sysdict, "modules", modules); _PyImport_FixupBuiltin(sysmod, "sys", modules); - /* Init Unicode implementation; relies on the codec registry */ - if (_PyUnicode_Init() < 0) - return _Py_INIT_ERR("can't initialize unicode"); + err = _PyUnicode_Init(); + if (_Py_INIT_FAILED(err)) { + return err; + } if (_PyStructSequence_Init() < 0) return _Py_INIT_ERR("can't initialize structseq"); @@ -651,7 +649,10 @@ _Py_InitializeCore_impl(PyInterpreterState **interp_p, Py_INCREF(interp->builtins); /* initialize builtin exceptions */ - _PyExc_Init(bimod); + err = _PyExc_Init(bimod); + if (_Py_INIT_FAILED(err)) { + return err; + } /* Set up a preliminary stderr printer until we have enough infrastructure for the io module in place. */ @@ -1146,7 +1147,6 @@ Py_FinalizeEx(void) PyList_Fini(); PySet_Fini(); PyBytes_Fini(); - PyByteArray_Fini(); PyLong_Fini(); PyFloat_Fini(); PyDict_Fini(); @@ -1302,7 +1302,10 @@ new_interpreter(PyThreadState **tstate_p) } /* initialize builtin exceptions */ - _PyExc_Init(bimod); + err = _PyExc_Init(bimod); + if (_Py_INIT_FAILED(err)) { + return err; + } if (bimod != NULL && sysmod != NULL) { PyObject *pstderr; @@ -1682,6 +1685,20 @@ init_sys_streams(PyInterpreterState *interp) _PyInitError res = _Py_INIT_OK(); _PyCoreConfig *config = &interp->core_config; + /* Check that stdin is not a directory + Using shell redirection, you can redirect stdin to a directory, + crashing the Python interpreter. Catch this common mistake here + and output a useful error message. Note that under MS Windows, + the shell already prevents that. 
*/ +#ifndef MS_WINDOWS + struct _Py_stat_struct sb; + if (_Py_fstat_noraise(fileno(stdin), &sb) == 0 && + S_ISDIR(sb.st_mode)) { + return _Py_INIT_USER_ERR(" is a directory, " + "cannot continue"); + } +#endif + char *codec_name = get_codec_name(config->stdio_encoding); if (codec_name == NULL) { return _Py_INIT_ERR("failed to get the Python codec name " diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 5ea3772efded..8efe1699422c 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -2381,22 +2381,6 @@ _PySys_BeginInit(PyObject **sysmod) } sysdict = PyModule_GetDict(m); - /* Check that stdin is not a directory - Using shell redirection, you can redirect stdin to a directory, - crashing the Python interpreter. Catch this common mistake here - and output a useful error message. Note that under MS Windows, - the shell already prevents that. */ -#ifndef MS_WINDOWS - { - struct _Py_stat_struct sb; - if (_Py_fstat_noraise(fileno(stdin), &sb) == 0 && - S_ISDIR(sb.st_mode)) { - return _Py_INIT_USER_ERR(" is a directory, " - "cannot continue"); - } - } -#endif - /* stdin/stdout/stderr are set in pylifecycle.c */ SET_SYS_FROM_STRING_BORROW("__displayhook__", From webhook-mailer at python.org Tue Jan 22 11:42:17 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 22 Jan 2019 16:42:17 -0000 Subject: [Python-checkins] bpo-35720: Fixing a memory leak in pymain_parse_cmdline_impl() (GH-11528) Message-ID: https://github.com/python/cpython/commit/f71e7433ebccb2e3a2665b93bb84de38f9c581c8 commit: f71e7433ebccb2e3a2665b93bb84de38f9c581c8 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-22T08:42:13-08:00 summary: bpo-35720: Fixing a memory leak in pymain_parse_cmdline_impl() (GH-11528) When the loop in the pymain_read_conf function in this same file calls pymain_init_cmdline_argv() a 2nd time, the pymain->command buffer of wchar_t is overriden and the previously allocated memory is never freed. 
(cherry picked from commit 35ca1820e19f81f69073f294503cdcd708fe490f) Co-authored-by: Lucas Cimon files: A Misc/NEWS.d/next/Core and Builtins/2019-01-12-23-33-04.bpo-35720.LELKQx.rst M Modules/main.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2019-01-12-23-33-04.bpo-35720.LELKQx.rst b/Misc/NEWS.d/next/Core and Builtins/2019-01-12-23-33-04.bpo-35720.LELKQx.rst new file mode 100644 index 000000000000..9c57ebcb625e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2019-01-12-23-33-04.bpo-35720.LELKQx.rst @@ -0,0 +1 @@ +Fixed a minor memory leak in pymain_parse_cmdline_impl function in Modules/main.c \ No newline at end of file diff --git a/Modules/main.c b/Modules/main.c index af2c191b9b9b..a745381109d3 100644 --- a/Modules/main.c +++ b/Modules/main.c @@ -2165,6 +2165,7 @@ pymain_read_conf(_PyMain *pymain, _PyCoreConfig *config, _PyCmdline *cmdline) goto done; } pymain_clear_cmdline(pymain, cmdline); + pymain_clear_pymain(pymain); memset(cmdline, 0, sizeof(*cmdline)); cmdline_get_global_config(cmdline); From webhook-mailer at python.org Tue Jan 22 13:50:07 2019 From: webhook-mailer at python.org (Steve Dower) Date: Tue, 22 Jan 2019 18:50:07 -0000 Subject: [Python-checkins] bpo-35683: Improve Azure Pipelines steps (GH-11493) Message-ID: https://github.com/python/cpython/commit/28f6cb34f602b9796987904a607dceaf2e4a9e78 commit: 28f6cb34f602b9796987904a607dceaf2e4a9e78 branch: master author: Steve Dower committer: GitHub date: 2019-01-22T10:49:52-08:00 summary: bpo-35683: Improve Azure Pipelines steps (GH-11493) files: A .azure-pipelines/posix-deps-apt.sh A Misc/NEWS.d/next/Build/2019-01-10-11-37-18.bpo-35683.pf5Oos.rst D .azure-pipelines/docker-steps.yml D .azure-pipelines/posix-deps.sh D .azure-pipelines/windows-appx-test.yml M .azure-pipelines/ci.yml M .azure-pipelines/posix-steps.yml M .azure-pipelines/pr.yml M .azure-pipelines/windows-layout-steps.yml M .azure-pipelines/windows-steps.yml M Lib/idlelib/idle_test/test_help_about.py M Lib/test/libregrtest/main.py M Lib/test/test_symbol.py M PC/layout/main.py diff --git a/.azure-pipelines/ci.yml b/.azure-pipelines/ci.yml index 78075bcfc147..15a83dd0370e 100644 --- a/.azure-pipelines/ci.yml +++ b/.azure-pipelines/ci.yml @@ -2,6 +2,11 @@ variables: manylinux: false coverage: false +resources: + containers: + - container: manylinux1 + image: pyca/cryptography-manylinux1:x86_64 + jobs: - job: Prebuild displayName: Pre-build checks @@ -54,10 +59,12 @@ jobs: variables: testRunTitle: '$(build.sourceBranchName)-linux' testRunPlatform: linux - openssl_version: 1.1.0g + openssl_version: 1.1.0j steps: - template: ./posix-steps.yml + parameters: + dependencies: apt - job: ManyLinux1_CI_Tests @@ -75,13 +82,20 @@ jobs: pool: vmImage: ubuntu-16.04 + container: manylinux1 + variables: testRunTitle: '$(build.sourceBranchName)-manylinux1' testRunPlatform: manylinux1 - imageName: 'dockcross/manylinux-x64' + openssl_version: '' steps: - - template: ./docker-steps.yml + - template: ./posix-steps.yml + parameters: + dependencies: yum + sudo_dependencies: '' + xvfb: false + patchcheck: false - job: Ubuntu_Coverage_CI_Tests @@ -102,11 +116,12 @@ jobs: variables: testRunTitle: '$(Build.SourceBranchName)-linux-coverage' testRunPlatform: linux-coverage - openssl_version: 1.1.0g + openssl_version: 1.1.0j steps: - template: ./posix-steps.yml parameters: + dependencies: apt coverage: true @@ -144,3 +159,4 @@ jobs: - template: ./windows-layout-steps.yml parameters: kind: appx + fulltest: true diff --git a/.azure-pipelines/docker-steps.yml 
b/.azure-pipelines/docker-steps.yml deleted file mode 100644 index ba4dfd72dd8b..000000000000 --- a/.azure-pipelines/docker-steps.yml +++ /dev/null @@ -1,76 +0,0 @@ -steps: -- checkout: self - clean: true - fetchDepth: 5 - -- ${{ if ne(parameters.targetBranch, '') }}: - - script: | - git fetch -q origin ${{ parameters.targetbranch }} - if ! git diff --name-only HEAD $(git merge-base HEAD FETCH_HEAD) | grep -qvE '(\.rst$|^Doc|^Misc)' - then - echo "Only docs were updated, stopping build process." - echo "##vso[task.setvariable variable=DocOnly]true" - exit - fi - displayName: Detect doc-only changes - -- task: docker at 0 - displayName: 'Configure CPython (debug)' - inputs: - action: 'Run an image' - imageName: $(imageName) - volumes: | - $(build.sourcesDirectory):/src - $(build.binariesDirectory):/build - workDir: '/src' - containerCommand: './configure --with-pydebug' - detached: false - condition: and(succeeded(), ne(variables['DocOnly'], 'true')) - -- task: docker at 0 - displayName: 'Build CPython' - inputs: - action: 'Run an image' - imageName: $(imageName) - volumes: | - $(build.sourcesDirectory):/src - $(build.binariesDirectory):/build - workDir: '/src' - containerCommand: 'make -s -j4' - detached: false - condition: and(succeeded(), ne(variables['DocOnly'], 'true')) - -- task: docker at 0 - displayName: 'Display build info' - inputs: - action: 'Run an image' - imageName: $(imageName) - volumes: | - $(build.sourcesDirectory):/src - $(build.binariesDirectory):/build - workDir: '/src' - containerCommand: 'make pythoninfo' - detached: false - condition: and(succeeded(), ne(variables['DocOnly'], 'true')) - -- task: docker at 0 - displayName: 'Tests' - inputs: - action: 'Run an image' - imageName: $(imageName) - volumes: | - $(build.sourcesDirectory):/src - $(build.binariesDirectory):/build - workDir: '/src' - containerCommand: 'make buildbottest TESTOPTS="-j4 -uall,-cpu --junit-xml=/build/test-results.xml"' - detached: false - condition: and(succeeded(), ne(variables['DocOnly'], 'true')) - -- task: PublishTestResults at 2 - displayName: 'Publish Test Results' - inputs: - testResultsFiles: '$(build.binariesDirectory)/test-results.xml' - mergeTestResults: true - testRunTitle: $(testRunTitle) - platform: $(testRunPlatform) - condition: and(succeededOrFailed(), ne(variables['DocOnly'], 'true')) diff --git a/.azure-pipelines/posix-deps.sh b/.azure-pipelines/posix-deps-apt.sh similarity index 92% rename from .azure-pipelines/posix-deps.sh rename to .azure-pipelines/posix-deps-apt.sh index a57210756601..4f489903ab56 100755 --- a/.azure-pipelines/posix-deps.sh +++ b/.azure-pipelines/posix-deps-apt.sh @@ -1,6 +1,6 @@ -sudo apt-get update +apt-get update -sudo apt-get -yq install \ +apt-get -yq install \ build-essential \ zlib1g-dev \ libbz2-dev \ diff --git a/.azure-pipelines/posix-steps.yml b/.azure-pipelines/posix-steps.yml index 6e2606fff7bc..2affb50dc10e 100644 --- a/.azure-pipelines/posix-steps.yml +++ b/.azure-pipelines/posix-steps.yml @@ -1,12 +1,16 @@ parameters: coverage: false + sudo_dependencies: sudo + dependencies: apt + patchcheck: true + xvfb: true steps: - checkout: self clean: true fetchDepth: 5 -- script: ./.azure-pipelines/posix-deps.sh $(openssl_version) +- script: ${{ parameters.sudo_dependencies }} ./.azure-pipelines/posix-deps-${{ parameters.dependencies }}.sh $(openssl_version) displayName: 'Install dependencies' - script: ./configure --with-pydebug @@ -23,7 +27,7 @@ steps: displayName: 'Display build info' - script: | - xvfb-run ./venv/bin/python -m coverage run --pylib 
-m test \ + $COMMAND -m coverage run --pylib -m test \ --fail-env-changed \ -uall,-cpu \ --junit-xml=$(build.binariesDirectory)/test-results.xml \ @@ -32,6 +36,11 @@ steps: -x test_multiprocessing_spawn \ -x test_concurrent_futures displayName: 'Tests with coverage' + env: + ${{ if eq(parameters.xvfb, 'true') }}: + COMMAND: xvfb-run ./venv/bin/python + ${{ if ne(parameters.xvfb, 'true') }}: + COMMAND: ./venv/bin/python - script: ./venv/bin/python -m coverage xml displayName: 'Generate coverage.xml' @@ -44,13 +53,18 @@ steps: - script: make pythoninfo displayName: 'Display build info' - - script: xvfb-run make buildbottest TESTOPTS="-j4 -uall,-cpu --junit-xml=$(build.binariesDirectory)/test-results.xml" + - script: $COMMAND buildbottest TESTOPTS="-j4 -uall,-cpu --junit-xml=$(build.binariesDirectory)/test-results.xml" displayName: 'Tests' - - -- script: ./python Tools/scripts/patchcheck.py --travis true - displayName: 'Run patchcheck.py' - condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest')) + env: + ${{ if eq(parameters.xvfb, 'true') }}: + COMMAND: xvfb-run make + ${{ if ne(parameters.xvfb, 'true') }}: + COMMAND: make + +- ${{ if eq(parameters.patchcheck, 'true') }}: + - script: ./python Tools/scripts/patchcheck.py --travis true + displayName: 'Run patchcheck.py' + condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest')) - task: PublishTestResults at 2 diff --git a/.azure-pipelines/pr.yml b/.azure-pipelines/pr.yml index 2d7fba9cf328..0bd7921bcbef 100644 --- a/.azure-pipelines/pr.yml +++ b/.azure-pipelines/pr.yml @@ -1,3 +1,12 @@ +variables: + manylinux: false + coverage: false + +resources: + containers: + - container: manylinux1 + image: pyca/cryptography-manylinux1:x86_64 + jobs: - job: Prebuild displayName: Pre-build checks @@ -50,12 +59,70 @@ jobs: variables: testRunTitle: '$(system.pullRequest.TargetBranch)-linux' testRunPlatform: linux - openssl_version: 1.1.0g + openssl_version: 1.1.0j steps: - template: ./posix-steps.yml parameters: - targetBranch: $(System.PullRequest.TargetBranch) + dependencies: apt + + +- job: ManyLinux1_PR_Tests + displayName: ManyLinux1 PR Tests + dependsOn: Prebuild + condition: | + and( + and( + succeeded(), + eq(variables['manylinux'], 'true') + ), + eq(dependencies.Prebuild.outputs['tests.run'], 'true') + ) + + pool: + vmImage: ubuntu-16.04 + + container: manylinux1 + + variables: + testRunTitle: '$(system.pullRequest.TargetBranch)-manylinux1' + testRunPlatform: manylinux1 + openssl_version: '' + + steps: + - template: ./posix-steps.yml + parameters: + dependencies: yum + sudo_dependencies: '' + xvfb: false + patchcheck: false + + +- job: Ubuntu_Coverage_PR_Tests + displayName: Ubuntu PR Tests (coverage) + dependsOn: Prebuild + condition: | + and( + and( + succeeded(), + eq(variables['coverage'], 'true') + ), + eq(dependencies.Prebuild.outputs['tests.run'], 'true') + ) + + pool: + vmImage: ubuntu-16.04 + + variables: + testRunTitle: '$(Build.SourceBranchName)-linux-coverage' + testRunPlatform: linux-coverage + openssl_version: 1.1.0j + + steps: + - template: ./posix-steps.yml + parameters: + dependencies: apt + coverage: true - job: Windows_PR_Tests diff --git a/.azure-pipelines/windows-appx-test.yml b/.azure-pipelines/windows-appx-test.yml deleted file mode 100644 index cad752b0a1e7..000000000000 --- a/.azure-pipelines/windows-appx-test.yml +++ /dev/null @@ -1,67 +0,0 @@ -jobs: -- job: Prebuild - displayName: Pre-build checks - - pool: - vmImage: ubuntu-16.04 - - steps: - - template: ./prebuild-checks.yml - - -- job: 
Windows_Appx_Tests - displayName: Windows Appx Tests - dependsOn: Prebuild - condition: and(succeeded(), eq(dependencies.Prebuild.outputs['tests.run'], 'true')) - - pool: - vmImage: vs2017-win2016 - - strategy: - matrix: - win64: - arch: amd64 - buildOpt: '-p x64' - testRunTitle: '$(Build.SourceBranchName)-win64-appx' - testRunPlatform: win64 - maxParallel: 2 - - steps: - - checkout: self - clean: true - fetchDepth: 5 - - - powershell: | - # Relocate build outputs outside of source directory to make cleaning faster - Write-Host '##vso[task.setvariable variable=Py_IntDir]$(Build.BinariesDirectory)\obj' - # UNDONE: Do not build to a different directory because of broken tests - Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.SourcesDirectory)\PCbuild' - Write-Host '##vso[task.setvariable variable=EXTERNALS_DIR]$(Build.BinariesDirectory)\externals' - displayName: Update build locations - - - script: PCbuild\build.bat -e $(buildOpt) - displayName: 'Build CPython' - env: - IncludeUwp: true - - - script: python.bat PC\layout -vv -s "$(Build.SourcesDirectory)" -b "$(Py_OutDir)\$(arch)" -t "$(Py_IntDir)\layout-tmp-$(arch)" --copy "$(Py_IntDir)\layout-$(arch)" --precompile --preset-appx --include-tests - displayName: 'Create APPX layout' - - - script: .\python.exe -m test.pythoninfo - workingDirectory: $(Py_IntDir)\layout-$(arch) - displayName: 'Display build info' - - - script: .\python.exe -m test -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 --junit-xml="$(Build.BinariesDirectory)\test-results.xml" --tempdir "$(Py_IntDir)\tmp-$(arch)" - workingDirectory: $(Py_IntDir)\layout-$(arch) - displayName: 'Tests' - env: - PREFIX: $(Py_IntDir)\layout-$(arch) - - - task: PublishTestResults at 2 - displayName: 'Publish Test Results' - inputs: - testResultsFiles: '$(Build.BinariesDirectory)\test-results.xml' - mergeTestResults: true - testRunTitle: $(testRunTitle) - platform: $(testRunPlatform) - condition: succeededOrFailed() diff --git a/.azure-pipelines/windows-layout-steps.yml b/.azure-pipelines/windows-layout-steps.yml index 62e5259375f5..e15729fac344 100644 --- a/.azure-pipelines/windows-layout-steps.yml +++ b/.azure-pipelines/windows-layout-steps.yml @@ -1,11 +1,28 @@ parameters: kind: nuget extraOpts: --precompile + fulltest: false steps: -- script: .\python.bat PC\layout -vv -s "$(Build.SourcesDirectory)" -b "$(Py_OutDir)\$(arch)" -t "$(Py_IntDir)\layout-tmp-${{ parameters['kind'] }}-$(arch)" --copy "$(Py_OutDir)\layout-${{ parameters['kind'] }}-$(arch)" ${{ parameters['extraOpts'] }} --preset-${{ parameters['kind'] }} --include-tests - displayName: Create ${{ parameters['kind'] }} layout +- script: .\python.bat PC\layout -vv -s "$(Build.SourcesDirectory)" -b "$(Py_OutDir)\$(arch)" -t "$(Build.BinariesDirectory)\layout-tmp-${{ parameters.kind }}-$(arch)" --copy "$(Build.BinariesDirectory)\layout-${{ parameters.kind }}-$(arch)" ${{ parameters.extraOpts }} --preset-${{ parameters.kind }} --include-tests + displayName: Create ${{ parameters.kind }} layout - script: .\python.exe -m test.pythoninfo - workingDirectory: $(Py_OutDir)\layout-${{ parameters['kind'] }}-$(arch) - displayName: Show layout info (${{ parameters['kind'] }}) + workingDirectory: $(Build.BinariesDirectory)\layout-${{ parameters.kind }}-$(arch) + displayName: Show layout info (${{ parameters.kind }}) + +- ${{ if eq(parameters.fulltest, 'true') }}: + - script: .\python.exe -m test -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 --junit-xml="$(Build.BinariesDirectory)\test-results-${{ parameters.kind }}.xml" --tempdir 
"$(Build.BinariesDirectory)\tmp-${{ parameters.kind }}-$(arch)" + workingDirectory: $(Build.BinariesDirectory)\layout-${{ parameters.kind }}-$(arch) + displayName: ${{ parameters.kind }} Tests + env: + PREFIX: $(Build.BinariesDirectory)\layout-${{ parameters.kind }}-$(arch) + + - task: PublishTestResults at 2 + displayName: Publish ${{ parameters.kind }} Test Results + inputs: + testResultsFiles: $(Build.BinariesDirectory)\test-results-${{ parameters.kind }}.xml + mergeTestResults: true + testRunTitle: ${{ parameters.kind }}-$(testRunTitle) + platform: $(testRunPlatform) + condition: succeededOrFailed() diff --git a/.azure-pipelines/windows-steps.yml b/.azure-pipelines/windows-steps.yml index 3651ae03bc1d..794a23a5d77e 100644 --- a/.azure-pipelines/windows-steps.yml +++ b/.azure-pipelines/windows-steps.yml @@ -1,6 +1,6 @@ steps: - checkout: self - clean: true + clean: false fetchDepth: 5 - powershell: | @@ -8,6 +8,7 @@ steps: Write-Host '##vso[task.setvariable variable=Py_IntDir]$(Build.BinariesDirectory)\obj' # UNDONE: Do not build to a different directory because of broken tests Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.SourcesDirectory)\PCbuild' + #Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.BinariesDirectory)\bin' Write-Host '##vso[task.setvariable variable=EXTERNALS_DIR]$(Build.BinariesDirectory)\externals' displayName: Update build locations diff --git a/Lib/idlelib/idle_test/test_help_about.py b/Lib/idlelib/idle_test/test_help_about.py index 5839b5d045d8..7c148d23a135 100644 --- a/Lib/idlelib/idle_test/test_help_about.py +++ b/Lib/idlelib/idle_test/test_help_about.py @@ -61,6 +61,8 @@ def test_printer_buttons(self): button.invoke() get = dialog._current_textview.viewframe.textframe.text.get lines = printer._Printer__lines + if len(lines) < 2: + self.fail(name + ' full text was not found') self.assertEqual(lines[0], get('1.0', '1.end')) self.assertEqual(lines[1], get('2.0', '2.end')) dialog._current_textview.destroy() diff --git a/Lib/test/libregrtest/main.py b/Lib/test/libregrtest/main.py index 8d44caf2999a..32ac44029bc3 100644 --- a/Lib/test/libregrtest/main.py +++ b/Lib/test/libregrtest/main.py @@ -1,5 +1,6 @@ import datetime import faulthandler +import json import locale import os import platform @@ -565,6 +566,9 @@ def main(self, tests=None, **kwargs): if self.ns.tempdir: TEMPDIR = self.ns.tempdir + elif self.ns.worker_args: + ns_dict, _ = json.loads(self.ns.worker_args) + TEMPDIR = ns_dict.get("tempdir") or TEMPDIR os.makedirs(TEMPDIR, exist_ok=True) diff --git a/Lib/test/test_symbol.py b/Lib/test/test_symbol.py index ed86aec36b87..645d8f43b6cd 100644 --- a/Lib/test/test_symbol.py +++ b/Lib/test/test_symbol.py @@ -2,6 +2,7 @@ from test import support import os import sys +import sysconfig import subprocess @@ -38,8 +39,8 @@ def compare_files(self, file1, file2): lines2 = fp.readlines() self.assertEqual(lines1, lines2) - @unittest.skipIf(not os.path.exists(GRAMMAR_FILE), - 'test only works from source build directory') + @unittest.skipUnless(sysconfig.is_python_build(), + 'test only works from source build directory') def test_real_grammar_and_symbol_file(self): output = support.TESTFN self.addCleanup(support.unlink, output) diff --git a/Misc/NEWS.d/next/Build/2019-01-10-11-37-18.bpo-35683.pf5Oos.rst b/Misc/NEWS.d/next/Build/2019-01-10-11-37-18.bpo-35683.pf5Oos.rst new file mode 100644 index 000000000000..f39610169370 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2019-01-10-11-37-18.bpo-35683.pf5Oos.rst @@ -0,0 +1 @@ +Improved Azure Pipelines 
build steps and now verifying layouts correctly diff --git a/PC/layout/main.py b/PC/layout/main.py index d372fe50df32..910085c01bb2 100644 --- a/PC/layout/main.py +++ b/PC/layout/main.py @@ -156,6 +156,8 @@ def in_build(f, dest="", new_name=None): for dest, src in rglob(ns.build, "vcruntime*.dll"): yield dest, src + yield "LICENSE.txt", ns.source / "LICENSE" + for dest, src in rglob(ns.build, ("*.pyd", "*.dll")): if src.stem.endswith("_d") != bool(ns.debug) and src not in REQUIRED_DLLS: continue From webhook-mailer at python.org Tue Jan 22 15:18:22 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Tue, 22 Jan 2019 20:18:22 -0000 Subject: [Python-checkins] bpo-35713: Split _Py_InitializeCore into subfunctions (GH-11650) Message-ID: https://github.com/python/cpython/commit/6d43f6f081023b680d9db4542d19b9e382149f0a commit: 6d43f6f081023b680d9db4542d19b9e382149f0a branch: master author: Victor Stinner committer: GitHub date: 2019-01-22T21:18:05+01:00 summary: bpo-35713: Split _Py_InitializeCore into subfunctions (GH-11650) * Split _Py_InitializeCore_impl() into subfunctions: add multiple pycore_init_xxx() functions * Preliminary sys.stderr is now set earlier to get an usable sys.stderr ealier. * Move code into _Py_Initialize_ReconfigureCore() to be able to call it from _Py_InitializeCore(). * Split _PyExc_Init(): create a new _PyBuiltins_AddExceptions() function. * Call _PyExc_Init() earlier in _Py_InitializeCore_impl() and new_interpreter() to get working exceptions earlier. * _Py_ReadyTypes() now returns _PyInitError rather than calling Py_FatalError(). * Misc code cleanup files: A Misc/NEWS.d/next/Core and Builtins/2019-01-22-18-50-21.bpo-35713.bTeUsa.rst M Include/internal/pycore_pylifecycle.h M Objects/exceptions.c M Objects/fileobject.c M Objects/floatobject.c M Objects/longobject.c M Objects/object.c M Objects/unicodeobject.c M Python/pylifecycle.c diff --git a/Include/internal/pycore_pylifecycle.h b/Include/internal/pycore_pylifecycle.h index de70199dae57..6f5c54422330 100644 --- a/Include/internal/pycore_pylifecycle.h +++ b/Include/internal/pycore_pylifecycle.h @@ -30,12 +30,13 @@ extern PyObject * _PyBuiltin_Init(void); extern _PyInitError _PySys_BeginInit(PyObject **sysmod); extern int _PySys_EndInit(PyObject *sysdict, PyInterpreterState *interp); extern _PyInitError _PyImport_Init(PyInterpreterState *interp); -extern _PyInitError _PyExc_Init(PyObject * bltinmod); +extern _PyInitError _PyExc_Init(void); +extern _PyInitError _PyBuiltins_AddExceptions(PyObject * bltinmod); extern _PyInitError _PyImportHooks_Init(void); extern int _PyFloat_Init(void); extern _PyInitError _Py_HashRandomization_Init(const _PyCoreConfig *); -extern void _Py_ReadyTypes(void); +extern _PyInitError _Py_ReadyTypes(void); /* Various internal finalizers */ diff --git a/Misc/NEWS.d/next/Core and Builtins/2019-01-22-18-50-21.bpo-35713.bTeUsa.rst b/Misc/NEWS.d/next/Core and Builtins/2019-01-22-18-50-21.bpo-35713.bTeUsa.rst new file mode 100644 index 000000000000..67e766fa075b --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2019-01-22-18-50-21.bpo-35713.bTeUsa.rst @@ -0,0 +1,2 @@ +Reorganize Python initialization to get working exceptions and sys.stderr +earlier. 
diff --git a/Objects/exceptions.c b/Objects/exceptions.c index 8d81566c7f15..35e1df3ca1fa 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -2492,7 +2492,7 @@ SimpleExtendsException(PyExc_Warning, ResourceWarning, #endif /* MS_WINDOWS */ _PyInitError -_PyExc_Init(PyObject *bltinmod) +_PyExc_Init(void) { #define PRE_INIT(TYPE) \ if (!(_PyExc_ ## TYPE.tp_flags & Py_TPFLAGS_READY)) { \ @@ -2502,21 +2502,6 @@ _PyExc_Init(PyObject *bltinmod) Py_INCREF(PyExc_ ## TYPE); \ } -#define POST_INIT(TYPE) \ - if (PyDict_SetItemString(bdict, # TYPE, PyExc_ ## TYPE)) { \ - return _Py_INIT_ERR("Module dictionary insertion problem."); \ - } - -#define INIT_ALIAS(NAME, TYPE) \ - do { \ - Py_INCREF(PyExc_ ## TYPE); \ - Py_XDECREF(PyExc_ ## NAME); \ - PyExc_ ## NAME = PyExc_ ## TYPE; \ - if (PyDict_SetItemString(bdict, # NAME, PyExc_ ## NAME)) { \ - return _Py_INIT_ERR("Module dictionary insertion problem."); \ - } \ - } while (0) - #define ADD_ERRNO(TYPE, CODE) \ do { \ PyObject *_code = PyLong_FromLong(CODE); \ @@ -2526,8 +2511,6 @@ _PyExc_Init(PyObject *bltinmod) Py_DECREF(_code); \ } while (0) - PyObject *bdict; - PRE_INIT(BaseException); PRE_INIT(Exception); PRE_INIT(TypeError); @@ -2596,6 +2579,68 @@ _PyExc_Init(PyObject *bltinmod) PRE_INIT(ProcessLookupError); PRE_INIT(TimeoutError); + if (preallocate_memerrors() < 0) { + return _Py_INIT_ERR("Could not preallocate MemoryError object"); + } + + /* Add exceptions to errnomap */ + if (!errnomap) { + errnomap = PyDict_New(); + if (!errnomap) { + return _Py_INIT_ERR("Cannot allocate map from errnos to OSError subclasses"); + } + } + + ADD_ERRNO(BlockingIOError, EAGAIN); + ADD_ERRNO(BlockingIOError, EALREADY); + ADD_ERRNO(BlockingIOError, EINPROGRESS); + ADD_ERRNO(BlockingIOError, EWOULDBLOCK); + ADD_ERRNO(BrokenPipeError, EPIPE); +#ifdef ESHUTDOWN + ADD_ERRNO(BrokenPipeError, ESHUTDOWN); +#endif + ADD_ERRNO(ChildProcessError, ECHILD); + ADD_ERRNO(ConnectionAbortedError, ECONNABORTED); + ADD_ERRNO(ConnectionRefusedError, ECONNREFUSED); + ADD_ERRNO(ConnectionResetError, ECONNRESET); + ADD_ERRNO(FileExistsError, EEXIST); + ADD_ERRNO(FileNotFoundError, ENOENT); + ADD_ERRNO(IsADirectoryError, EISDIR); + ADD_ERRNO(NotADirectoryError, ENOTDIR); + ADD_ERRNO(InterruptedError, EINTR); + ADD_ERRNO(PermissionError, EACCES); + ADD_ERRNO(PermissionError, EPERM); + ADD_ERRNO(ProcessLookupError, ESRCH); + ADD_ERRNO(TimeoutError, ETIMEDOUT); + + return _Py_INIT_OK(); + +#undef PRE_INIT +#undef ADD_ERRNO +} + + +/* Add exception types to the builtins module */ +_PyInitError +_PyBuiltins_AddExceptions(PyObject *bltinmod) +{ +#define POST_INIT(TYPE) \ + if (PyDict_SetItemString(bdict, # TYPE, PyExc_ ## TYPE)) { \ + return _Py_INIT_ERR("Module dictionary insertion problem."); \ + } + +#define INIT_ALIAS(NAME, TYPE) \ + do { \ + Py_INCREF(PyExc_ ## TYPE); \ + Py_XDECREF(PyExc_ ## NAME); \ + PyExc_ ## NAME = PyExc_ ## TYPE; \ + if (PyDict_SetItemString(bdict, # NAME, PyExc_ ## NAME)) { \ + return _Py_INIT_ERR("Module dictionary insertion problem."); \ + } \ + } while (0) + + PyObject *bdict; + bdict = PyModule_GetDict(bltinmod); if (bdict == NULL) { return _Py_INIT_ERR("exceptions bootstrapping error."); @@ -2656,61 +2701,28 @@ _PyExc_Init(PyObject *bltinmod) POST_INIT(BytesWarning); POST_INIT(ResourceWarning); - if (!errnomap) { - errnomap = PyDict_New(); - if (!errnomap) { - return _Py_INIT_ERR("Cannot allocate map from errnos to OSError subclasses"); - } - } - /* OSError subclasses */ POST_INIT(ConnectionError); POST_INIT(BlockingIOError); - 
ADD_ERRNO(BlockingIOError, EAGAIN); - ADD_ERRNO(BlockingIOError, EALREADY); - ADD_ERRNO(BlockingIOError, EINPROGRESS); - ADD_ERRNO(BlockingIOError, EWOULDBLOCK); POST_INIT(BrokenPipeError); - ADD_ERRNO(BrokenPipeError, EPIPE); -#ifdef ESHUTDOWN - ADD_ERRNO(BrokenPipeError, ESHUTDOWN); -#endif POST_INIT(ChildProcessError); - ADD_ERRNO(ChildProcessError, ECHILD); POST_INIT(ConnectionAbortedError); - ADD_ERRNO(ConnectionAbortedError, ECONNABORTED); POST_INIT(ConnectionRefusedError); - ADD_ERRNO(ConnectionRefusedError, ECONNREFUSED); POST_INIT(ConnectionResetError); - ADD_ERRNO(ConnectionResetError, ECONNRESET); POST_INIT(FileExistsError); - ADD_ERRNO(FileExistsError, EEXIST); POST_INIT(FileNotFoundError); - ADD_ERRNO(FileNotFoundError, ENOENT); POST_INIT(IsADirectoryError); - ADD_ERRNO(IsADirectoryError, EISDIR); POST_INIT(NotADirectoryError); - ADD_ERRNO(NotADirectoryError, ENOTDIR); POST_INIT(InterruptedError); - ADD_ERRNO(InterruptedError, EINTR); POST_INIT(PermissionError); - ADD_ERRNO(PermissionError, EACCES); - ADD_ERRNO(PermissionError, EPERM); POST_INIT(ProcessLookupError); - ADD_ERRNO(ProcessLookupError, ESRCH); POST_INIT(TimeoutError); - ADD_ERRNO(TimeoutError, ETIMEDOUT); - if (preallocate_memerrors() < 0) { - return _Py_INIT_ERR("Could not preallocate MemoryError object"); - } return _Py_INIT_OK(); -#undef PRE_INIT #undef POST_INIT #undef INIT_ALIAS -#undef ADD_ERRNO } void diff --git a/Objects/fileobject.c b/Objects/fileobject.c index 313f1d0ac55c..babaa05bdbc4 100644 --- a/Objects/fileobject.c +++ b/Objects/fileobject.c @@ -359,6 +359,9 @@ stdprinter_write(PyStdPrinter_Object *self, PyObject *args) Py_ssize_t n; int err; + /* The function can clear the current exception */ + assert(!PyErr_Occurred()); + if (self->fd < 0) { /* fd might be invalid on Windows * I can't raise an exception here. 
It may lead to an @@ -367,10 +370,11 @@ stdprinter_write(PyStdPrinter_Object *self, PyObject *args) Py_RETURN_NONE; } - if (!PyArg_ParseTuple(args, "U", &unicode)) + if (!PyArg_ParseTuple(args, "U", &unicode)) { return NULL; + } - /* encode Unicode to UTF-8 */ + /* Encode Unicode to UTF-8/surrogateescape */ str = PyUnicode_AsUTF8AndSize(unicode, &n); if (str == NULL) { PyErr_Clear(); diff --git a/Objects/floatobject.c b/Objects/floatobject.c index 67f9e5d5b4ef..b952df880722 100644 --- a/Objects/floatobject.c +++ b/Objects/floatobject.c @@ -1999,8 +1999,9 @@ _PyFloat_Init(void) /* Init float info */ if (FloatInfoType.tp_name == NULL) { - if (PyStructSequence_InitType2(&FloatInfoType, &floatinfo_desc) < 0) + if (PyStructSequence_InitType2(&FloatInfoType, &floatinfo_desc) < 0) { return 0; + } } return 1; } diff --git a/Objects/longobject.c b/Objects/longobject.c index f42683e9d024..3c98385f5396 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -5635,8 +5635,9 @@ _PyLong_Init(void) /* initialize int_info */ if (Int_InfoType.tp_name == NULL) { - if (PyStructSequence_InitType2(&Int_InfoType, &int_info_desc) < 0) + if (PyStructSequence_InitType2(&Int_InfoType, &int_info_desc) < 0) { return 0; + } } return 1; diff --git a/Objects/object.c b/Objects/object.c index 6c2bd7717c01..2171d53523f2 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -1716,200 +1716,83 @@ PyObject _Py_NotImplementedStruct = { 1, &_PyNotImplemented_Type }; -void +_PyInitError _Py_ReadyTypes(void) { - if (PyType_Ready(&PyBaseObject_Type) < 0) - Py_FatalError("Can't initialize object type"); - - if (PyType_Ready(&PyType_Type) < 0) - Py_FatalError("Can't initialize type type"); - - if (PyType_Ready(&_PyWeakref_RefType) < 0) - Py_FatalError("Can't initialize weakref type"); - - if (PyType_Ready(&_PyWeakref_CallableProxyType) < 0) - Py_FatalError("Can't initialize callable weakref proxy type"); - - if (PyType_Ready(&_PyWeakref_ProxyType) < 0) - Py_FatalError("Can't initialize weakref proxy type"); - - if (PyType_Ready(&PyLong_Type) < 0) - Py_FatalError("Can't initialize int type"); - - if (PyType_Ready(&PyBool_Type) < 0) - Py_FatalError("Can't initialize bool type"); - - if (PyType_Ready(&PyByteArray_Type) < 0) - Py_FatalError("Can't initialize bytearray type"); - - if (PyType_Ready(&PyBytes_Type) < 0) - Py_FatalError("Can't initialize 'str'"); - - if (PyType_Ready(&PyList_Type) < 0) - Py_FatalError("Can't initialize list type"); - - if (PyType_Ready(&_PyNone_Type) < 0) - Py_FatalError("Can't initialize None type"); - - if (PyType_Ready(&_PyNotImplemented_Type) < 0) - Py_FatalError("Can't initialize NotImplemented type"); - - if (PyType_Ready(&PyTraceBack_Type) < 0) - Py_FatalError("Can't initialize traceback type"); - - if (PyType_Ready(&PySuper_Type) < 0) - Py_FatalError("Can't initialize super type"); - - if (PyType_Ready(&PyRange_Type) < 0) - Py_FatalError("Can't initialize range type"); - - if (PyType_Ready(&PyDict_Type) < 0) - Py_FatalError("Can't initialize dict type"); - - if (PyType_Ready(&PyDictKeys_Type) < 0) - Py_FatalError("Can't initialize dict keys type"); - - if (PyType_Ready(&PyDictValues_Type) < 0) - Py_FatalError("Can't initialize dict values type"); - - if (PyType_Ready(&PyDictItems_Type) < 0) - Py_FatalError("Can't initialize dict items type"); - - if (PyType_Ready(&PyDictRevIterKey_Type) < 0) - Py_FatalError("Can't initialize reversed dict keys type"); - - if (PyType_Ready(&PyDictRevIterValue_Type) < 0) - Py_FatalError("Can't initialize reversed dict values type"); - - if 
(PyType_Ready(&PyDictRevIterItem_Type) < 0) - Py_FatalError("Can't initialize reversed dict items type"); - - if (PyType_Ready(&PyODict_Type) < 0) - Py_FatalError("Can't initialize OrderedDict type"); - - if (PyType_Ready(&PyODictKeys_Type) < 0) - Py_FatalError("Can't initialize odict_keys type"); - - if (PyType_Ready(&PyODictItems_Type) < 0) - Py_FatalError("Can't initialize odict_items type"); - - if (PyType_Ready(&PyODictValues_Type) < 0) - Py_FatalError("Can't initialize odict_values type"); - - if (PyType_Ready(&PyODictIter_Type) < 0) - Py_FatalError("Can't initialize odict_keyiterator type"); - - if (PyType_Ready(&PySet_Type) < 0) - Py_FatalError("Can't initialize set type"); - - if (PyType_Ready(&PyUnicode_Type) < 0) - Py_FatalError("Can't initialize str type"); - - if (PyType_Ready(&PySlice_Type) < 0) - Py_FatalError("Can't initialize slice type"); - - if (PyType_Ready(&PyStaticMethod_Type) < 0) - Py_FatalError("Can't initialize static method type"); - - if (PyType_Ready(&PyComplex_Type) < 0) - Py_FatalError("Can't initialize complex type"); - - if (PyType_Ready(&PyFloat_Type) < 0) - Py_FatalError("Can't initialize float type"); - - if (PyType_Ready(&PyFrozenSet_Type) < 0) - Py_FatalError("Can't initialize frozenset type"); - - if (PyType_Ready(&PyProperty_Type) < 0) - Py_FatalError("Can't initialize property type"); - - if (PyType_Ready(&_PyManagedBuffer_Type) < 0) - Py_FatalError("Can't initialize managed buffer type"); - - if (PyType_Ready(&PyMemoryView_Type) < 0) - Py_FatalError("Can't initialize memoryview type"); - - if (PyType_Ready(&PyTuple_Type) < 0) - Py_FatalError("Can't initialize tuple type"); - - if (PyType_Ready(&PyEnum_Type) < 0) - Py_FatalError("Can't initialize enumerate type"); - - if (PyType_Ready(&PyReversed_Type) < 0) - Py_FatalError("Can't initialize reversed type"); - - if (PyType_Ready(&PyStdPrinter_Type) < 0) - Py_FatalError("Can't initialize StdPrinter"); - - if (PyType_Ready(&PyCode_Type) < 0) - Py_FatalError("Can't initialize code type"); - - if (PyType_Ready(&PyFrame_Type) < 0) - Py_FatalError("Can't initialize frame type"); - - if (PyType_Ready(&PyCFunction_Type) < 0) - Py_FatalError("Can't initialize builtin function type"); - - if (PyType_Ready(&PyMethod_Type) < 0) - Py_FatalError("Can't initialize method type"); - - if (PyType_Ready(&PyFunction_Type) < 0) - Py_FatalError("Can't initialize function type"); - - if (PyType_Ready(&PyDictProxy_Type) < 0) - Py_FatalError("Can't initialize dict proxy type"); - - if (PyType_Ready(&PyGen_Type) < 0) - Py_FatalError("Can't initialize generator type"); - - if (PyType_Ready(&PyGetSetDescr_Type) < 0) - Py_FatalError("Can't initialize get-set descriptor type"); - - if (PyType_Ready(&PyWrapperDescr_Type) < 0) - Py_FatalError("Can't initialize wrapper type"); - - if (PyType_Ready(&_PyMethodWrapper_Type) < 0) - Py_FatalError("Can't initialize method wrapper type"); - - if (PyType_Ready(&PyEllipsis_Type) < 0) - Py_FatalError("Can't initialize ellipsis type"); - - if (PyType_Ready(&PyMemberDescr_Type) < 0) - Py_FatalError("Can't initialize member descriptor type"); - - if (PyType_Ready(&_PyNamespace_Type) < 0) - Py_FatalError("Can't initialize namespace type"); - - if (PyType_Ready(&PyCapsule_Type) < 0) - Py_FatalError("Can't initialize capsule type"); - - if (PyType_Ready(&PyLongRangeIter_Type) < 0) - Py_FatalError("Can't initialize long range iterator type"); - - if (PyType_Ready(&PyCell_Type) < 0) - Py_FatalError("Can't initialize cell type"); - - if (PyType_Ready(&PyInstanceMethod_Type) < 0) - Py_FatalError("Can't 
initialize instance method type"); - - if (PyType_Ready(&PyClassMethodDescr_Type) < 0) - Py_FatalError("Can't initialize class method descr type"); - - if (PyType_Ready(&PyMethodDescr_Type) < 0) - Py_FatalError("Can't initialize method descr type"); - - if (PyType_Ready(&PyCallIter_Type) < 0) - Py_FatalError("Can't initialize call iter type"); - - if (PyType_Ready(&PySeqIter_Type) < 0) - Py_FatalError("Can't initialize sequence iterator type"); - - if (PyType_Ready(&PyCoro_Type) < 0) - Py_FatalError("Can't initialize coroutine type"); - - if (PyType_Ready(&_PyCoroWrapper_Type) < 0) - Py_FatalError("Can't initialize coroutine wrapper type"); +#define INIT_TYPE(TYPE, NAME) \ + do { \ + if (PyType_Ready(TYPE) < 0) { \ + return _Py_INIT_ERR("Can't initialize " NAME " type"); \ + } \ + } while (0) + + INIT_TYPE(&PyBaseObject_Type, "object"); + INIT_TYPE(&PyType_Type, "type"); + INIT_TYPE(&_PyWeakref_RefType, "weakref"); + INIT_TYPE(&_PyWeakref_CallableProxyType, "callable weakref proxy"); + INIT_TYPE(&_PyWeakref_ProxyType, "weakref proxy"); + INIT_TYPE(&PyLong_Type, "int"); + INIT_TYPE(&PyBool_Type, "bool"); + INIT_TYPE(&PyByteArray_Type, "bytearray"); + INIT_TYPE(&PyBytes_Type, "str"); + INIT_TYPE(&PyList_Type, "list"); + INIT_TYPE(&_PyNone_Type, "None"); + INIT_TYPE(&_PyNotImplemented_Type, "NotImplemented"); + INIT_TYPE(&PyTraceBack_Type, "traceback"); + INIT_TYPE(&PySuper_Type, "super"); + INIT_TYPE(&PyRange_Type, "range"); + INIT_TYPE(&PyDict_Type, "dict"); + INIT_TYPE(&PyDictKeys_Type, "dict keys"); + INIT_TYPE(&PyDictValues_Type, "dict values"); + INIT_TYPE(&PyDictItems_Type, "dict items"); + INIT_TYPE(&PyDictRevIterKey_Type, "reversed dict keys"); + INIT_TYPE(&PyDictRevIterValue_Type, "reversed dict values"); + INIT_TYPE(&PyDictRevIterItem_Type, "reversed dict items"); + INIT_TYPE(&PyODict_Type, "OrderedDict"); + INIT_TYPE(&PyODictKeys_Type, "odict_keys"); + INIT_TYPE(&PyODictItems_Type, "odict_items"); + INIT_TYPE(&PyODictValues_Type, "odict_values"); + INIT_TYPE(&PyODictIter_Type, "odict_keyiterator"); + INIT_TYPE(&PySet_Type, "set"); + INIT_TYPE(&PyUnicode_Type, "str"); + INIT_TYPE(&PySlice_Type, "slice"); + INIT_TYPE(&PyStaticMethod_Type, "static method"); + INIT_TYPE(&PyComplex_Type, "complex"); + INIT_TYPE(&PyFloat_Type, "float"); + INIT_TYPE(&PyFrozenSet_Type, "frozenset"); + INIT_TYPE(&PyProperty_Type, "property"); + INIT_TYPE(&_PyManagedBuffer_Type, "managed buffer"); + INIT_TYPE(&PyMemoryView_Type, "memoryview"); + INIT_TYPE(&PyTuple_Type, "tuple"); + INIT_TYPE(&PyEnum_Type, "enumerate"); + INIT_TYPE(&PyReversed_Type, "reversed"); + INIT_TYPE(&PyStdPrinter_Type, "StdPrinter"); + INIT_TYPE(&PyCode_Type, "code"); + INIT_TYPE(&PyFrame_Type, "frame"); + INIT_TYPE(&PyCFunction_Type, "builtin function"); + INIT_TYPE(&PyMethod_Type, "method"); + INIT_TYPE(&PyFunction_Type, "function"); + INIT_TYPE(&PyDictProxy_Type, "dict proxy"); + INIT_TYPE(&PyGen_Type, "generator"); + INIT_TYPE(&PyGetSetDescr_Type, "get-set descriptor"); + INIT_TYPE(&PyWrapperDescr_Type, "wrapper"); + INIT_TYPE(&_PyMethodWrapper_Type, "method wrapper"); + INIT_TYPE(&PyEllipsis_Type, "ellipsis"); + INIT_TYPE(&PyMemberDescr_Type, "member descriptor"); + INIT_TYPE(&_PyNamespace_Type, "namespace"); + INIT_TYPE(&PyCapsule_Type, "capsule"); + INIT_TYPE(&PyLongRangeIter_Type, "long range iterator"); + INIT_TYPE(&PyCell_Type, "cell"); + INIT_TYPE(&PyInstanceMethod_Type, "instance method"); + INIT_TYPE(&PyClassMethodDescr_Type, "class method descr"); + INIT_TYPE(&PyMethodDescr_Type, "method descr"); + 
INIT_TYPE(&PyCallIter_Type, "call iter"); + INIT_TYPE(&PySeqIter_Type, "sequence iterator"); + INIT_TYPE(&PyCoro_Type, "coroutine"); + INIT_TYPE(&_PyCoroWrapper_Type, "coroutine wrapper"); + return _Py_INIT_OK(); + +#undef INIT_TYPE } diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index ea7bcabfc64f..8141ce757412 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -15239,7 +15239,6 @@ _PyUnicode_Init(void) if (PyType_Ready(&PyFormatterIter_Type) < 0) { return _Py_INIT_ERR("Can't initialize formatter iter type"); } - return _Py_INIT_OK(); } diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index f0e00ea48d48..86d87fb6030a 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -464,9 +464,22 @@ _Py_SetLocaleFromEnv(int category) */ static _PyInitError -_Py_Initialize_ReconfigureCore(PyInterpreterState *interp, +_Py_Initialize_ReconfigureCore(PyInterpreterState **interp_p, const _PyCoreConfig *core_config) { + PyThreadState *tstate = _PyThreadState_GET(); + if (!tstate) { + return _Py_INIT_ERR("failed to read thread state"); + } + + PyInterpreterState *interp = tstate->interp; + if (interp == NULL) { + return _Py_INIT_ERR("can't make main interpreter"); + } + *interp_p = interp; + + /* bpo-34008: For backward compatibility reasons, calling Py_Main() after + Py_Initialize() ignores the new configuration. */ if (core_config->allocator != NULL) { const char *allocator = _PyMem_GetAllocatorsName(); if (allocator == NULL || strcmp(core_config->allocator, allocator) != 0) { @@ -492,57 +505,16 @@ _Py_Initialize_ReconfigureCore(PyInterpreterState *interp, } -/* Begin interpreter initialization - * - * On return, the first thread and interpreter state have been created, - * but the compiler, signal handling, multithreading and - * multiple interpreter support, and codec infrastructure are not yet - * available. - * - * The import system will support builtin and frozen modules only. - * The only supported io is writing to sys.stderr - * - * If any operation invoked by this function fails, a fatal error is - * issued and the function does not return. - * - * Any code invoked from this function should *not* assume it has access - * to the Python C API (unless the API is explicitly listed as being - * safe to call without calling Py_Initialize first) - * - * The caller is responsible to call _PyCoreConfig_Read(). - */ - static _PyInitError -_Py_InitializeCore_impl(PyInterpreterState **interp_p, - const _PyCoreConfig *core_config) +pycore_init_runtime(const _PyCoreConfig *core_config) { - PyInterpreterState *interp; - _PyInitError err; - - /* bpo-34008: For backward compatibility reasons, calling Py_Main() after - Py_Initialize() ignores the new configuration. 
*/ - if (_PyRuntime.core_initialized) { - PyThreadState *tstate = _PyThreadState_GET(); - if (!tstate) { - return _Py_INIT_ERR("failed to read thread state"); - } - - interp = tstate->interp; - if (interp == NULL) { - return _Py_INIT_ERR("can't make main interpreter"); - } - *interp_p = interp; - - return _Py_Initialize_ReconfigureCore(interp, core_config); - } - if (_PyRuntime.initialized) { return _Py_INIT_ERR("main interpreter already initialized"); } _PyCoreConfig_SetGlobalConfig(core_config); - err = _PyRuntime_Initialize(); + _PyInitError err = _PyRuntime_Initialize(); if (_Py_INIT_FAILED(err)) { return err; } @@ -573,8 +545,15 @@ _Py_InitializeCore_impl(PyInterpreterState **interp_p, if (_Py_INIT_FAILED(err)) { return err; } + return _Py_INIT_OK(); +} - interp = PyInterpreterState_New(); + +static _PyInitError +pycore_create_interpreter(const _PyCoreConfig *core_config, + PyInterpreterState **interp_p) +{ + PyInterpreterState *interp = PyInterpreterState_New(); if (interp == NULL) { return _Py_INIT_ERR("can't make main interpreter"); } @@ -603,24 +582,61 @@ _Py_InitializeCore_impl(PyInterpreterState **interp_p, /* Create the GIL */ PyEval_InitThreads(); - _Py_ReadyTypes(); + return _Py_INIT_OK(); +} + + +static _PyInitError +pycore_init_types(void) +{ + _PyInitError err = _Py_ReadyTypes(); + if (_Py_INIT_FAILED(err)) { + return err; + } - if (!_PyLong_Init()) + err = _PyUnicode_Init(); + if (_Py_INIT_FAILED(err)) { + return err; + } + + if (_PyStructSequence_Init() < 0) { + return _Py_INIT_ERR("can't initialize structseq"); + } + + if (!_PyLong_Init()) { return _Py_INIT_ERR("can't init longs"); + } - if (!_PyFloat_Init()) + err = _PyExc_Init(); + if (_Py_INIT_FAILED(err)) { + return err; + } + + if (!_PyFloat_Init()) { return _Py_INIT_ERR("can't init float"); + } + if (!_PyContext_Init()) { + return _Py_INIT_ERR("can't init context"); + } + return _Py_INIT_OK(); +} + + +static _PyInitError +pycore_init_sys(PyInterpreterState *interp, PyObject **sysmod_p) +{ PyObject *modules = PyDict_New(); if (modules == NULL) return _Py_INIT_ERR("can't make modules dictionary"); interp->modules = modules; PyObject *sysmod; - err = _PySys_BeginInit(&sysmod); + _PyInitError err = _PySys_BeginInit(&sysmod); if (_Py_INIT_FAILED(err)) { return err; } + *sysmod_p = sysmod; interp->sysdict = PyModule_GetDict(sysmod); if (interp->sysdict == NULL) { @@ -631,39 +647,49 @@ _Py_InitializeCore_impl(PyInterpreterState **interp_p, PyDict_SetItemString(interp->sysdict, "modules", modules); _PyImport_FixupBuiltin(sysmod, "sys", modules); - err = _PyUnicode_Init(); - if (_Py_INIT_FAILED(err)) { - return err; + /* Set up a preliminary stderr printer until we have enough + infrastructure for the io module in place. + + Use UTF-8/surrogateescape and ignore EAGAIN errors. 
*/ + PyObject *pstderr = PyFile_NewStdPrinter(fileno(stderr)); + if (pstderr == NULL) { + return _Py_INIT_ERR("can't set preliminary stderr"); } + _PySys_SetObjectId(&PyId_stderr, pstderr); + PySys_SetObject("__stderr__", pstderr); + Py_DECREF(pstderr); + + return _Py_INIT_OK(); +} - if (_PyStructSequence_Init() < 0) - return _Py_INIT_ERR("can't initialize structseq"); +static _PyInitError +pycore_init_builtins(PyInterpreterState *interp) +{ PyObject *bimod = _PyBuiltin_Init(); - if (bimod == NULL) + if (bimod == NULL) { return _Py_INIT_ERR("can't initialize builtins modules"); - _PyImport_FixupBuiltin(bimod, "builtins", modules); + } + _PyImport_FixupBuiltin(bimod, "builtins", interp->modules); + interp->builtins = PyModule_GetDict(bimod); - if (interp->builtins == NULL) + if (interp->builtins == NULL) { return _Py_INIT_ERR("can't initialize builtins dict"); + } Py_INCREF(interp->builtins); - /* initialize builtin exceptions */ - err = _PyExc_Init(bimod); + _PyInitError err = _PyBuiltins_AddExceptions(bimod); if (_Py_INIT_FAILED(err)) { return err; } + return _Py_INIT_OK(); +} - /* Set up a preliminary stderr printer until we have enough - infrastructure for the io module in place. */ - PyObject *pstderr = PyFile_NewStdPrinter(fileno(stderr)); - if (pstderr == NULL) - return _Py_INIT_ERR("can't set preliminary stderr"); - _PySys_SetObjectId(&PyId_stderr, pstderr); - PySys_SetObject("__stderr__", pstderr); - Py_DECREF(pstderr); - err = _PyImport_Init(interp); +static _PyInitError +pycore_init_import_warnings(PyInterpreterState *interp, PyObject *sysmod) +{ + _PyInitError err = _PyImport_Init(interp); if (_Py_INIT_FAILED(err)) { return err; } @@ -678,29 +704,85 @@ _Py_InitializeCore_impl(PyInterpreterState **interp_p, return _Py_INIT_ERR("can't initialize warnings"); } - if (!_PyContext_Init()) - return _Py_INIT_ERR("can't init context"); - - if (core_config->_install_importlib) { - err = _PyCoreConfig_SetPathConfig(core_config); + if (interp->core_config._install_importlib) { + err = _PyCoreConfig_SetPathConfig(&interp->core_config); if (_Py_INIT_FAILED(err)) { return err; } } /* This call sets up builtin and frozen import support */ - if (core_config->_install_importlib) { + if (interp->core_config._install_importlib) { err = initimport(interp, sysmod); if (_Py_INIT_FAILED(err)) { return err; } } + return _Py_INIT_OK(); +} + + +static _PyInitError +_Py_InitializeCore_impl(PyInterpreterState **interp_p, + const _PyCoreConfig *core_config) +{ + PyInterpreterState *interp; + + _PyInitError err = pycore_init_runtime(core_config); + if (_Py_INIT_FAILED(err)) { + return err; + } + + err = pycore_create_interpreter(core_config, &interp); + if (_Py_INIT_FAILED(err)) { + return err; + } + core_config = &interp->core_config; + *interp_p = interp; + + err = pycore_init_types(); + if (_Py_INIT_FAILED(err)) { + return err; + } + + PyObject *sysmod; + err = pycore_init_sys(interp, &sysmod); + if (_Py_INIT_FAILED(err)) { + return err; + } + + err = pycore_init_builtins(interp); + if (_Py_INIT_FAILED(err)) { + return err; + } + + err = pycore_init_import_warnings(interp, sysmod); + if (_Py_INIT_FAILED(err)) { + return err; + } /* Only when we get here is the runtime core fully initialized */ _PyRuntime.core_initialized = 1; return _Py_INIT_OK(); } +/* Begin interpreter initialization + * + * On return, the first thread and interpreter state have been created, + * but the compiler, signal handling, multithreading and + * multiple interpreter support, and codec infrastructure are not yet + * available. 
+ * + * The import system will support builtin and frozen modules only. + * The only supported io is writing to sys.stderr + * + * If any operation invoked by this function fails, a fatal error is + * issued and the function does not return. + * + * Any code invoked from this function should *not* assume it has access + * to the Python C API (unless the API is explicitly listed as being + * safe to call without calling Py_Initialize first) + */ _PyInitError _Py_InitializeCore(PyInterpreterState **interp_p, const _PyCoreConfig *src_config) @@ -730,7 +812,12 @@ _Py_InitializeCore(PyInterpreterState **interp_p, goto done; } - err = _Py_InitializeCore_impl(interp_p, &config); + if (!_PyRuntime.core_initialized) { + err = _Py_InitializeCore_impl(interp_p, &config); + } + else { + err = _Py_Initialize_ReconfigureCore(interp_p, &config); + } done: _PyMem_SetDefaultAllocator(PYMEM_DOMAIN_RAW, &old_alloc); @@ -1270,6 +1357,11 @@ new_interpreter(PyThreadState **tstate_p) return _Py_INIT_ERR("failed to copy main interpreter config"); } + err = _PyExc_Init(); + if (_Py_INIT_FAILED(err)) { + return err; + } + /* XXX The following is lax in error checking */ PyObject *modules = PyDict_New(); if (modules == NULL) { @@ -1301,18 +1393,15 @@ new_interpreter(PyThreadState **tstate_p) goto handle_error; } - /* initialize builtin exceptions */ - err = _PyExc_Init(bimod); - if (_Py_INIT_FAILED(err)) { - return err; - } - if (bimod != NULL && sysmod != NULL) { - PyObject *pstderr; + err = _PyBuiltins_AddExceptions(bimod); + if (_Py_INIT_FAILED(err)) { + return err; + } /* Set up a preliminary stderr printer until we have enough infrastructure for the io module in place. */ - pstderr = PyFile_NewStdPrinter(fileno(stderr)); + PyObject *pstderr = PyFile_NewStdPrinter(fileno(stderr)); if (pstderr == NULL) { return _Py_INIT_ERR("can't set preliminary stderr"); } From webhook-mailer at python.org Tue Jan 22 15:31:35 2019 From: webhook-mailer at python.org (Steve Dower) Date: Tue, 22 Jan 2019 20:31:35 -0000 Subject: [Python-checkins] bpo-35683: Improve Azure Pipelines steps (GH-11493) Message-ID: https://github.com/python/cpython/commit/128efcade63480b5860a6d045a41ba4abf5eea2f commit: 128efcade63480b5860a6d045a41ba4abf5eea2f branch: 3.7 author: Steve Dower committer: GitHub date: 2019-01-22T12:31:30-08:00 summary: bpo-35683: Improve Azure Pipelines steps (GH-11493) files: A .azure-pipelines/posix-deps-apt.sh A Misc/NEWS.d/next/Build/2019-01-10-11-37-18.bpo-35683.pf5Oos.rst D .azure-pipelines/docker-steps.yml D .azure-pipelines/posix-deps.sh D .azure-pipelines/windows-appx-test.yml M .azure-pipelines/ci.yml M .azure-pipelines/posix-steps.yml M .azure-pipelines/pr.yml M .azure-pipelines/windows-layout-steps.yml M .azure-pipelines/windows-steps.yml M Lib/idlelib/idle_test/test_help_about.py M Lib/test/libregrtest/main.py M Lib/test/test_symbol.py M PC/layout/main.py diff --git a/.azure-pipelines/ci.yml b/.azure-pipelines/ci.yml index 78075bcfc147..15a83dd0370e 100644 --- a/.azure-pipelines/ci.yml +++ b/.azure-pipelines/ci.yml @@ -2,6 +2,11 @@ variables: manylinux: false coverage: false +resources: + containers: + - container: manylinux1 + image: pyca/cryptography-manylinux1:x86_64 + jobs: - job: Prebuild displayName: Pre-build checks @@ -54,10 +59,12 @@ jobs: variables: testRunTitle: '$(build.sourceBranchName)-linux' testRunPlatform: linux - openssl_version: 1.1.0g + openssl_version: 1.1.0j steps: - template: ./posix-steps.yml + parameters: + dependencies: apt - job: ManyLinux1_CI_Tests @@ -75,13 +82,20 @@ jobs: pool: 
vmImage: ubuntu-16.04 + container: manylinux1 + variables: testRunTitle: '$(build.sourceBranchName)-manylinux1' testRunPlatform: manylinux1 - imageName: 'dockcross/manylinux-x64' + openssl_version: '' steps: - - template: ./docker-steps.yml + - template: ./posix-steps.yml + parameters: + dependencies: yum + sudo_dependencies: '' + xvfb: false + patchcheck: false - job: Ubuntu_Coverage_CI_Tests @@ -102,11 +116,12 @@ jobs: variables: testRunTitle: '$(Build.SourceBranchName)-linux-coverage' testRunPlatform: linux-coverage - openssl_version: 1.1.0g + openssl_version: 1.1.0j steps: - template: ./posix-steps.yml parameters: + dependencies: apt coverage: true @@ -144,3 +159,4 @@ jobs: - template: ./windows-layout-steps.yml parameters: kind: appx + fulltest: true diff --git a/.azure-pipelines/docker-steps.yml b/.azure-pipelines/docker-steps.yml deleted file mode 100644 index ba4dfd72dd8b..000000000000 --- a/.azure-pipelines/docker-steps.yml +++ /dev/null @@ -1,76 +0,0 @@ -steps: -- checkout: self - clean: true - fetchDepth: 5 - -- ${{ if ne(parameters.targetBranch, '') }}: - - script: | - git fetch -q origin ${{ parameters.targetbranch }} - if ! git diff --name-only HEAD $(git merge-base HEAD FETCH_HEAD) | grep -qvE '(\.rst$|^Doc|^Misc)' - then - echo "Only docs were updated, stopping build process." - echo "##vso[task.setvariable variable=DocOnly]true" - exit - fi - displayName: Detect doc-only changes - -- task: docker at 0 - displayName: 'Configure CPython (debug)' - inputs: - action: 'Run an image' - imageName: $(imageName) - volumes: | - $(build.sourcesDirectory):/src - $(build.binariesDirectory):/build - workDir: '/src' - containerCommand: './configure --with-pydebug' - detached: false - condition: and(succeeded(), ne(variables['DocOnly'], 'true')) - -- task: docker at 0 - displayName: 'Build CPython' - inputs: - action: 'Run an image' - imageName: $(imageName) - volumes: | - $(build.sourcesDirectory):/src - $(build.binariesDirectory):/build - workDir: '/src' - containerCommand: 'make -s -j4' - detached: false - condition: and(succeeded(), ne(variables['DocOnly'], 'true')) - -- task: docker at 0 - displayName: 'Display build info' - inputs: - action: 'Run an image' - imageName: $(imageName) - volumes: | - $(build.sourcesDirectory):/src - $(build.binariesDirectory):/build - workDir: '/src' - containerCommand: 'make pythoninfo' - detached: false - condition: and(succeeded(), ne(variables['DocOnly'], 'true')) - -- task: docker at 0 - displayName: 'Tests' - inputs: - action: 'Run an image' - imageName: $(imageName) - volumes: | - $(build.sourcesDirectory):/src - $(build.binariesDirectory):/build - workDir: '/src' - containerCommand: 'make buildbottest TESTOPTS="-j4 -uall,-cpu --junit-xml=/build/test-results.xml"' - detached: false - condition: and(succeeded(), ne(variables['DocOnly'], 'true')) - -- task: PublishTestResults at 2 - displayName: 'Publish Test Results' - inputs: - testResultsFiles: '$(build.binariesDirectory)/test-results.xml' - mergeTestResults: true - testRunTitle: $(testRunTitle) - platform: $(testRunPlatform) - condition: and(succeededOrFailed(), ne(variables['DocOnly'], 'true')) diff --git a/.azure-pipelines/posix-deps.sh b/.azure-pipelines/posix-deps-apt.sh similarity index 92% rename from .azure-pipelines/posix-deps.sh rename to .azure-pipelines/posix-deps-apt.sh index a57210756601..4f489903ab56 100755 --- a/.azure-pipelines/posix-deps.sh +++ b/.azure-pipelines/posix-deps-apt.sh @@ -1,6 +1,6 @@ -sudo apt-get update +apt-get update -sudo apt-get -yq install \ +apt-get -yq 
install \ build-essential \ zlib1g-dev \ libbz2-dev \ diff --git a/.azure-pipelines/posix-steps.yml b/.azure-pipelines/posix-steps.yml index 9fec9be8014f..a4160e5a1bf5 100644 --- a/.azure-pipelines/posix-steps.yml +++ b/.azure-pipelines/posix-steps.yml @@ -1,12 +1,16 @@ parameters: coverage: false + sudo_dependencies: sudo + dependencies: apt + patchcheck: true + xvfb: true steps: - checkout: self clean: true fetchDepth: 5 -- script: ./.azure-pipelines/posix-deps.sh $(openssl_version) +- script: ${{ parameters.sudo_dependencies }} ./.azure-pipelines/posix-deps-${{ parameters.dependencies }}.sh $(openssl_version) displayName: 'Install dependencies' - script: ./configure --with-pydebug @@ -23,7 +27,7 @@ steps: displayName: 'Display build info' - script: | - xvfb-run ./venv/bin/python -m coverage run --pylib -m test \ + $COMMAND -m coverage run --pylib -m test \ --fail-env-changed \ -uall,-cpu \ --junit-xml=$(build.binariesDirectory)/test-results.xml" \ @@ -32,6 +36,11 @@ steps: -x test_multiprocessing_spawn \ -x test_concurrent_futures displayName: 'Tests with coverage' + env: + ${{ if eq(parameters.xvfb, 'true') }}: + COMMAND: xvfb-run ./venv/bin/python + ${{ if ne(parameters.xvfb, 'true') }}: + COMMAND: ./venv/bin/python - script: ./venv/bin/python -m coverage xml displayName: 'Generate coverage.xml' @@ -44,13 +53,18 @@ steps: - script: make pythoninfo displayName: 'Display build info' - - script: xvfb-run make buildbottest TESTOPTS="-j4 -uall,-cpu --junit-xml=$(build.binariesDirectory)/test-results.xml" + - script: $COMMAND buildbottest TESTOPTS="-j4 -uall,-cpu --junit-xml=$(build.binariesDirectory)/test-results.xml" displayName: 'Tests' - - -- script: ./python Tools/scripts/patchcheck.py --travis true - displayName: 'Run patchcheck.py' - condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest')) + env: + ${{ if eq(parameters.xvfb, 'true') }}: + COMMAND: xvfb-run make + ${{ if ne(parameters.xvfb, 'true') }}: + COMMAND: make + +- ${{ if eq(parameters.patchcheck, 'true') }}: + - script: ./python Tools/scripts/patchcheck.py --travis true + displayName: 'Run patchcheck.py' + condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest')) - task: PublishTestResults at 2 diff --git a/.azure-pipelines/pr.yml b/.azure-pipelines/pr.yml index 2d7fba9cf328..0bd7921bcbef 100644 --- a/.azure-pipelines/pr.yml +++ b/.azure-pipelines/pr.yml @@ -1,3 +1,12 @@ +variables: + manylinux: false + coverage: false + +resources: + containers: + - container: manylinux1 + image: pyca/cryptography-manylinux1:x86_64 + jobs: - job: Prebuild displayName: Pre-build checks @@ -50,12 +59,70 @@ jobs: variables: testRunTitle: '$(system.pullRequest.TargetBranch)-linux' testRunPlatform: linux - openssl_version: 1.1.0g + openssl_version: 1.1.0j steps: - template: ./posix-steps.yml parameters: - targetBranch: $(System.PullRequest.TargetBranch) + dependencies: apt + + +- job: ManyLinux1_PR_Tests + displayName: ManyLinux1 PR Tests + dependsOn: Prebuild + condition: | + and( + and( + succeeded(), + eq(variables['manylinux'], 'true') + ), + eq(dependencies.Prebuild.outputs['tests.run'], 'true') + ) + + pool: + vmImage: ubuntu-16.04 + + container: manylinux1 + + variables: + testRunTitle: '$(system.pullRequest.TargetBranch)-manylinux1' + testRunPlatform: manylinux1 + openssl_version: '' + + steps: + - template: ./posix-steps.yml + parameters: + dependencies: yum + sudo_dependencies: '' + xvfb: false + patchcheck: false + + +- job: Ubuntu_Coverage_PR_Tests + displayName: Ubuntu PR Tests (coverage) + dependsOn: 
Prebuild + condition: | + and( + and( + succeeded(), + eq(variables['coverage'], 'true') + ), + eq(dependencies.Prebuild.outputs['tests.run'], 'true') + ) + + pool: + vmImage: ubuntu-16.04 + + variables: + testRunTitle: '$(Build.SourceBranchName)-linux-coverage' + testRunPlatform: linux-coverage + openssl_version: 1.1.0j + + steps: + - template: ./posix-steps.yml + parameters: + dependencies: apt + coverage: true - job: Windows_PR_Tests diff --git a/.azure-pipelines/windows-appx-test.yml b/.azure-pipelines/windows-appx-test.yml deleted file mode 100644 index cad752b0a1e7..000000000000 --- a/.azure-pipelines/windows-appx-test.yml +++ /dev/null @@ -1,67 +0,0 @@ -jobs: -- job: Prebuild - displayName: Pre-build checks - - pool: - vmImage: ubuntu-16.04 - - steps: - - template: ./prebuild-checks.yml - - -- job: Windows_Appx_Tests - displayName: Windows Appx Tests - dependsOn: Prebuild - condition: and(succeeded(), eq(dependencies.Prebuild.outputs['tests.run'], 'true')) - - pool: - vmImage: vs2017-win2016 - - strategy: - matrix: - win64: - arch: amd64 - buildOpt: '-p x64' - testRunTitle: '$(Build.SourceBranchName)-win64-appx' - testRunPlatform: win64 - maxParallel: 2 - - steps: - - checkout: self - clean: true - fetchDepth: 5 - - - powershell: | - # Relocate build outputs outside of source directory to make cleaning faster - Write-Host '##vso[task.setvariable variable=Py_IntDir]$(Build.BinariesDirectory)\obj' - # UNDONE: Do not build to a different directory because of broken tests - Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.SourcesDirectory)\PCbuild' - Write-Host '##vso[task.setvariable variable=EXTERNALS_DIR]$(Build.BinariesDirectory)\externals' - displayName: Update build locations - - - script: PCbuild\build.bat -e $(buildOpt) - displayName: 'Build CPython' - env: - IncludeUwp: true - - - script: python.bat PC\layout -vv -s "$(Build.SourcesDirectory)" -b "$(Py_OutDir)\$(arch)" -t "$(Py_IntDir)\layout-tmp-$(arch)" --copy "$(Py_IntDir)\layout-$(arch)" --precompile --preset-appx --include-tests - displayName: 'Create APPX layout' - - - script: .\python.exe -m test.pythoninfo - workingDirectory: $(Py_IntDir)\layout-$(arch) - displayName: 'Display build info' - - - script: .\python.exe -m test -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 --junit-xml="$(Build.BinariesDirectory)\test-results.xml" --tempdir "$(Py_IntDir)\tmp-$(arch)" - workingDirectory: $(Py_IntDir)\layout-$(arch) - displayName: 'Tests' - env: - PREFIX: $(Py_IntDir)\layout-$(arch) - - - task: PublishTestResults at 2 - displayName: 'Publish Test Results' - inputs: - testResultsFiles: '$(Build.BinariesDirectory)\test-results.xml' - mergeTestResults: true - testRunTitle: $(testRunTitle) - platform: $(testRunPlatform) - condition: succeededOrFailed() diff --git a/.azure-pipelines/windows-layout-steps.yml b/.azure-pipelines/windows-layout-steps.yml index 62e5259375f5..e15729fac344 100644 --- a/.azure-pipelines/windows-layout-steps.yml +++ b/.azure-pipelines/windows-layout-steps.yml @@ -1,11 +1,28 @@ parameters: kind: nuget extraOpts: --precompile + fulltest: false steps: -- script: .\python.bat PC\layout -vv -s "$(Build.SourcesDirectory)" -b "$(Py_OutDir)\$(arch)" -t "$(Py_IntDir)\layout-tmp-${{ parameters['kind'] }}-$(arch)" --copy "$(Py_OutDir)\layout-${{ parameters['kind'] }}-$(arch)" ${{ parameters['extraOpts'] }} --preset-${{ parameters['kind'] }} --include-tests - displayName: Create ${{ parameters['kind'] }} layout +- script: .\python.bat PC\layout -vv -s "$(Build.SourcesDirectory)" -b 
"$(Py_OutDir)\$(arch)" -t "$(Build.BinariesDirectory)\layout-tmp-${{ parameters.kind }}-$(arch)" --copy "$(Build.BinariesDirectory)\layout-${{ parameters.kind }}-$(arch)" ${{ parameters.extraOpts }} --preset-${{ parameters.kind }} --include-tests + displayName: Create ${{ parameters.kind }} layout - script: .\python.exe -m test.pythoninfo - workingDirectory: $(Py_OutDir)\layout-${{ parameters['kind'] }}-$(arch) - displayName: Show layout info (${{ parameters['kind'] }}) + workingDirectory: $(Build.BinariesDirectory)\layout-${{ parameters.kind }}-$(arch) + displayName: Show layout info (${{ parameters.kind }}) + +- ${{ if eq(parameters.fulltest, 'true') }}: + - script: .\python.exe -m test -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 --junit-xml="$(Build.BinariesDirectory)\test-results-${{ parameters.kind }}.xml" --tempdir "$(Build.BinariesDirectory)\tmp-${{ parameters.kind }}-$(arch)" + workingDirectory: $(Build.BinariesDirectory)\layout-${{ parameters.kind }}-$(arch) + displayName: ${{ parameters.kind }} Tests + env: + PREFIX: $(Build.BinariesDirectory)\layout-${{ parameters.kind }}-$(arch) + + - task: PublishTestResults at 2 + displayName: Publish ${{ parameters.kind }} Test Results + inputs: + testResultsFiles: $(Build.BinariesDirectory)\test-results-${{ parameters.kind }}.xml + mergeTestResults: true + testRunTitle: ${{ parameters.kind }}-$(testRunTitle) + platform: $(testRunPlatform) + condition: succeededOrFailed() diff --git a/.azure-pipelines/windows-steps.yml b/.azure-pipelines/windows-steps.yml index 3651ae03bc1d..794a23a5d77e 100644 --- a/.azure-pipelines/windows-steps.yml +++ b/.azure-pipelines/windows-steps.yml @@ -1,6 +1,6 @@ steps: - checkout: self - clean: true + clean: false fetchDepth: 5 - powershell: | @@ -8,6 +8,7 @@ steps: Write-Host '##vso[task.setvariable variable=Py_IntDir]$(Build.BinariesDirectory)\obj' # UNDONE: Do not build to a different directory because of broken tests Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.SourcesDirectory)\PCbuild' + #Write-Host '##vso[task.setvariable variable=Py_OutDir]$(Build.BinariesDirectory)\bin' Write-Host '##vso[task.setvariable variable=EXTERNALS_DIR]$(Build.BinariesDirectory)\externals' displayName: Update build locations diff --git a/Lib/idlelib/idle_test/test_help_about.py b/Lib/idlelib/idle_test/test_help_about.py index 5839b5d045d8..7c148d23a135 100644 --- a/Lib/idlelib/idle_test/test_help_about.py +++ b/Lib/idlelib/idle_test/test_help_about.py @@ -61,6 +61,8 @@ def test_printer_buttons(self): button.invoke() get = dialog._current_textview.viewframe.textframe.text.get lines = printer._Printer__lines + if len(lines) < 2: + self.fail(name + ' full text was not found') self.assertEqual(lines[0], get('1.0', '1.end')) self.assertEqual(lines[1], get('2.0', '2.end')) dialog._current_textview.destroy() diff --git a/Lib/test/libregrtest/main.py b/Lib/test/libregrtest/main.py index 8d44caf2999a..32ac44029bc3 100644 --- a/Lib/test/libregrtest/main.py +++ b/Lib/test/libregrtest/main.py @@ -1,5 +1,6 @@ import datetime import faulthandler +import json import locale import os import platform @@ -565,6 +566,9 @@ def main(self, tests=None, **kwargs): if self.ns.tempdir: TEMPDIR = self.ns.tempdir + elif self.ns.worker_args: + ns_dict, _ = json.loads(self.ns.worker_args) + TEMPDIR = ns_dict.get("tempdir") or TEMPDIR os.makedirs(TEMPDIR, exist_ok=True) diff --git a/Lib/test/test_symbol.py b/Lib/test/test_symbol.py index c1306f54327f..32564f3df214 100644 --- a/Lib/test/test_symbol.py +++ b/Lib/test/test_symbol.py @@ -2,6 
+2,7 @@ from test import support import os import sys +import sysconfig import subprocess @@ -35,8 +36,8 @@ def compare_files(self, file1, file2): lines2 = fp.readlines() self.assertEqual(lines1, lines2) - @unittest.skipIf(not os.path.exists(GRAMMAR_FILE), - 'test only works from source build directory') + @unittest.skipUnless(sysconfig.is_python_build(), + 'test only works from source build directory') def test_real_grammar_and_symbol_file(self): output = support.TESTFN self.addCleanup(support.unlink, output) diff --git a/Misc/NEWS.d/next/Build/2019-01-10-11-37-18.bpo-35683.pf5Oos.rst b/Misc/NEWS.d/next/Build/2019-01-10-11-37-18.bpo-35683.pf5Oos.rst new file mode 100644 index 000000000000..f39610169370 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2019-01-10-11-37-18.bpo-35683.pf5Oos.rst @@ -0,0 +1 @@ +Improved Azure Pipelines build steps and now verifying layouts correctly diff --git a/PC/layout/main.py b/PC/layout/main.py index d372fe50df32..910085c01bb2 100644 --- a/PC/layout/main.py +++ b/PC/layout/main.py @@ -156,6 +156,8 @@ def in_build(f, dest="", new_name=None): for dest, src in rglob(ns.build, "vcruntime*.dll"): yield dest, src + yield "LICENSE.txt", ns.source / "LICENSE" + for dest, src in rglob(ns.build, ("*.pyd", "*.dll")): if src.stem.endswith("_d") != bool(ns.debug) and src not in REQUIRED_DLLS: continue From webhook-mailer at python.org Wed Jan 23 02:08:42 2019 From: webhook-mailer at python.org (Vinay Sajip) Date: Wed, 23 Jan 2019 07:08:42 -0000 Subject: [Python-checkins] bpo-35726: Prevented QueueHandler formatting from affecting other handlers (GH-11537) Message-ID: https://github.com/python/cpython/commit/da6424e96ada72c15c91bddb0a411acf7119e10a commit: da6424e96ada72c15c91bddb0a411acf7119e10a branch: master author: Manjusaka committer: Vinay Sajip date: 2019-01-23T07:08:38Z summary: bpo-35726: Prevented QueueHandler formatting from affecting other handlers (GH-11537) QueueHandler.prepare() now makes a copy of the record before modifying and enqueueing it, to avoid affecting other handlers in the chain. files: A Misc/NEWS.d/next/Library/2019-01-13-01-33-00.bpo-35726.dasdas.rst M Lib/logging/handlers.py diff --git a/Lib/logging/handlers.py b/Lib/logging/handlers.py index e213e438c31a..3727bf0677cd 100644 --- a/Lib/logging/handlers.py +++ b/Lib/logging/handlers.py @@ -27,6 +27,7 @@ from stat import ST_DEV, ST_INO, ST_MTIME import queue import threading +import copy # # Some constants... @@ -1377,6 +1378,8 @@ def prepare(self, record): # exc_info and exc_text attributes, as they are no longer # needed and, if not None, will typically not be pickleable. msg = self.format(record) + # bpo-35726: make copy of record to avoid affecting other handlers in the chain. + record = copy.copy(record) record.message = msg record.msg = msg record.args = None diff --git a/Misc/NEWS.d/next/Library/2019-01-13-01-33-00.bpo-35726.dasdas.rst b/Misc/NEWS.d/next/Library/2019-01-13-01-33-00.bpo-35726.dasdas.rst new file mode 100644 index 000000000000..f47cdc128e85 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-13-01-33-00.bpo-35726.dasdas.rst @@ -0,0 +1 @@ +QueueHandler.prepare() now makes a copy of the record before modifying and enqueueing it, to avoid affecting other handlers in the chain. 
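In practice the change above means QueueHandler.prepare() still formats the record, but it now assigns the formatted message to a copy before enqueueing, so any other handlers attached to the same logger keep seeing the original msg and args. A minimal standalone sketch of that behaviour (illustrative only, not the stdlib code; the record values are invented):

    import copy
    import logging

    # A record as a logger would create it.
    record = logging.LogRecord(
        name="demo", level=logging.INFO, pathname=__file__, lineno=1,
        msg="value=%s", args=(42,), exc_info=None,
    )

    def prepare(record):
        # Mirrors the fixed behaviour: format first, then mutate a copy
        # so the original record stays intact for other handlers.
        msg = logging.Formatter().format(record)
        record = copy.copy(record)
        record.message = msg
        record.msg = msg
        record.args = None
        record.exc_info = None
        return record

    queued = prepare(record)
    assert record.msg == "value=%s" and record.args == (42,)   # original unchanged
    assert queued.msg == "value=42" and queued.args is None    # copy carries the text

Without the copy, those attribute assignments landed on the shared record, which is how a QueueHandler could affect formatting for handlers later in the chain.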
From webhook-mailer at python.org Wed Jan 23 02:12:43 2019 From: webhook-mailer at python.org (Vinay Sajip) Date: Wed, 23 Jan 2019 07:12:43 -0000 Subject: [Python-checkins] bpo-35722: Updated the documentation for the 'disable_existing_loggers' parameter (GH-11525) Message-ID: https://github.com/python/cpython/commit/f0c743604fc841d35a48822b936ef2e5919e43c1 commit: f0c743604fc841d35a48822b936ef2e5919e43c1 branch: master author: G?ry Ogam committer: Vinay Sajip date: 2019-01-23T07:12:39Z summary: bpo-35722: Updated the documentation for the 'disable_existing_loggers' parameter (GH-11525) files: M Doc/howto/logging.rst M Doc/library/logging.config.rst diff --git a/Doc/howto/logging.rst b/Doc/howto/logging.rst index 2a2282e9ecf0..c90df4301fa6 100644 --- a/Doc/howto/logging.rst +++ b/Doc/howto/logging.rst @@ -695,15 +695,15 @@ noncoders to easily modify the logging properties. .. warning:: The :func:`fileConfig` function takes a default parameter, ``disable_existing_loggers``, which defaults to ``True`` for reasons of backward compatibility. This may or may not be what you want, since it - will cause any loggers existing before the :func:`fileConfig` call to - be disabled unless they (or an ancestor) are explicitly named in the - configuration. Please refer to the reference documentation for more + will cause any non-root loggers existing before the :func:`fileConfig` + call to be disabled unless they (or an ancestor) are explicitly named in + the configuration. Please refer to the reference documentation for more information, and specify ``False`` for this parameter if you wish. The dictionary passed to :func:`dictConfig` can also specify a Boolean value with key ``disable_existing_loggers``, which if not specified explicitly in the dictionary also defaults to being interpreted as - ``True``. This leads to the logger-disabling behaviour described above, + ``True``. This leads to the logger-disabling behaviour described above, which may not be what you want - in which case, provide the key explicitly with a value of ``False``. @@ -802,7 +802,7 @@ the best default behaviour. If for some reason you *don't* want these messages printed in the absence of any logging configuration, you can attach a do-nothing handler to the top-level logger for your library. This avoids the message being printed, since a handler -will be always be found for the library's events: it just doesn't produce any +will always be found for the library's events: it just doesn't produce any output. If the library user configures logging for application use, presumably that configuration will add some handlers, and if levels are suitably configured then logging calls made in library code will send output to those diff --git a/Doc/library/logging.config.rst b/Doc/library/logging.config.rst index 7f6c3c69739d..683d6ed5e8ba 100644 --- a/Doc/library/logging.config.rst +++ b/Doc/library/logging.config.rst @@ -107,9 +107,9 @@ in :mod:`logging` itself) and defining handlers which are declared either in enabled. The default is ``True`` because this enables old behaviour in a backward-compatible way. This behaviour is to - disable any existing loggers unless they or - their ancestors are explicitly named in the - logging configuration. + disable any existing non-root loggers unless + they or their ancestors are explicitly named + in the logging configuration. .. versionchanged:: 3.4 An instance of a subclass of :class:`~configparser.RawConfigParser` is @@ -313,8 +313,8 @@ otherwise, the context is used to determine what to instantiate. 
If the specified value is ``True``, the configuration is processed as described in the section on :ref:`logging-config-dict-incremental`. -* *disable_existing_loggers* - whether any existing loggers are to be - disabled. This setting mirrors the parameter of the same name in +* *disable_existing_loggers* - whether any existing non-root loggers are + to be disabled. This setting mirrors the parameter of the same name in :func:`fileConfig`. If absent, this parameter defaults to ``True``. This value is ignored if *incremental* is ``True``. From webhook-mailer at python.org Wed Jan 23 02:21:39 2019 From: webhook-mailer at python.org (Vinay Sajip) Date: Wed, 23 Jan 2019 07:21:39 -0000 Subject: [Python-checkins] bpo-35722: Updated the documentation for the 'disable_existing_loggers' parameter (GH-11525) (GH-11655) Message-ID: https://github.com/python/cpython/commit/552478bb1086ef371e4f1da0b430b90eba4785d5 commit: 552478bb1086ef371e4f1da0b430b90eba4785d5 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Vinay Sajip date: 2019-01-23T07:21:32Z summary: bpo-35722: Updated the documentation for the 'disable_existing_loggers' parameter (GH-11525) (GH-11655) (cherry picked from commit f0c743604fc841d35a48822b936ef2e5919e43c1) Co-authored-by: G?ry Ogam files: M Doc/howto/logging.rst M Doc/library/logging.config.rst diff --git a/Doc/howto/logging.rst b/Doc/howto/logging.rst index 2a2282e9ecf0..c90df4301fa6 100644 --- a/Doc/howto/logging.rst +++ b/Doc/howto/logging.rst @@ -695,15 +695,15 @@ noncoders to easily modify the logging properties. .. warning:: The :func:`fileConfig` function takes a default parameter, ``disable_existing_loggers``, which defaults to ``True`` for reasons of backward compatibility. This may or may not be what you want, since it - will cause any loggers existing before the :func:`fileConfig` call to - be disabled unless they (or an ancestor) are explicitly named in the - configuration. Please refer to the reference documentation for more + will cause any non-root loggers existing before the :func:`fileConfig` + call to be disabled unless they (or an ancestor) are explicitly named in + the configuration. Please refer to the reference documentation for more information, and specify ``False`` for this parameter if you wish. The dictionary passed to :func:`dictConfig` can also specify a Boolean value with key ``disable_existing_loggers``, which if not specified explicitly in the dictionary also defaults to being interpreted as - ``True``. This leads to the logger-disabling behaviour described above, + ``True``. This leads to the logger-disabling behaviour described above, which may not be what you want - in which case, provide the key explicitly with a value of ``False``. @@ -802,7 +802,7 @@ the best default behaviour. If for some reason you *don't* want these messages printed in the absence of any logging configuration, you can attach a do-nothing handler to the top-level logger for your library. This avoids the message being printed, since a handler -will be always be found for the library's events: it just doesn't produce any +will always be found for the library's events: it just doesn't produce any output. 
If the library user configures logging for application use, presumably that configuration will add some handlers, and if levels are suitably configured then logging calls made in library code will send output to those diff --git a/Doc/library/logging.config.rst b/Doc/library/logging.config.rst index de805eb955df..cd62adb39a36 100644 --- a/Doc/library/logging.config.rst +++ b/Doc/library/logging.config.rst @@ -107,9 +107,9 @@ in :mod:`logging` itself) and defining handlers which are declared either in enabled. The default is ``True`` because this enables old behaviour in a backward-compatible way. This behaviour is to - disable any existing loggers unless they or - their ancestors are explicitly named in the - logging configuration. + disable any existing non-root loggers unless + they or their ancestors are explicitly named + in the logging configuration. .. versionchanged:: 3.4 An instance of a subclass of :class:`~configparser.RawConfigParser` is @@ -308,8 +308,8 @@ otherwise, the context is used to determine what to instantiate. If the specified value is ``True``, the configuration is processed as described in the section on :ref:`logging-config-dict-incremental`. -* *disable_existing_loggers* - whether any existing loggers are to be - disabled. This setting mirrors the parameter of the same name in +* *disable_existing_loggers* - whether any existing non-root loggers are + to be disabled. This setting mirrors the parameter of the same name in :func:`fileConfig`. If absent, this parameter defaults to ``True``. This value is ignored if *incremental* is ``True``. From webhook-mailer at python.org Wed Jan 23 02:27:18 2019 From: webhook-mailer at python.org (Vinay Sajip) Date: Wed, 23 Jan 2019 07:27:18 -0000 Subject: [Python-checkins] bpo-35781: Changed references to deprecated 'warn' method in logging documentation in favour of 'warning' (GH-11654) Message-ID: https://github.com/python/cpython/commit/cda73a5af2ff064ca82140342b3158851d43868f commit: cda73a5af2ff064ca82140342b3158851d43868f branch: master author: yuji38kwmt committer: Vinay Sajip date: 2019-01-23T07:27:13Z summary: bpo-35781: Changed references to deprecated 'warn' method in logging documentation in favour of 'warning' (GH-11654) files: M Doc/howto/logging-cookbook.rst M Doc/howto/logging.rst diff --git a/Doc/howto/logging-cookbook.rst b/Doc/howto/logging-cookbook.rst index faf2ed15a066..4956aa0dd957 100644 --- a/Doc/howto/logging-cookbook.rst +++ b/Doc/howto/logging-cookbook.rst @@ -186,7 +186,7 @@ previous simple module-based configuration example:: # 'application' code logger.debug('debug message') logger.info('info message') - logger.warn('warn message') + logger.warning('warn message') logger.error('error message') logger.critical('critical message') @@ -295,7 +295,7 @@ Here is an example of a module using the logging configuration server:: while True: logger.debug('debug message') logger.info('info message') - logger.warn('warn message') + logger.warning('warn message') logger.error('error message') logger.critical('critical message') time.sleep(5) diff --git a/Doc/howto/logging.rst b/Doc/howto/logging.rst index c90df4301fa6..7a68ca89199c 100644 --- a/Doc/howto/logging.rst +++ b/Doc/howto/logging.rst @@ -610,7 +610,7 @@ logger, a console handler, and a simple formatter using Python code:: # 'application' code logger.debug('debug message') logger.info('info message') - logger.warn('warn message') + logger.warning('warn message') logger.error('error message') logger.critical('critical message') @@ -640,7 +640,7 @@ the 
names of the objects:: # 'application' code logger.debug('debug message') logger.info('info message') - logger.warn('warn message') + logger.warning('warn message') logger.error('error message') logger.critical('critical message') From webhook-mailer at python.org Wed Jan 23 02:43:42 2019 From: webhook-mailer at python.org (Vinay Sajip) Date: Wed, 23 Jan 2019 07:43:42 -0000 Subject: [Python-checkins] bpo-35781: Changed references to deprecated 'warn' method in logging documentation in favour of 'warning' (GH-11654) (GH-11657) Message-ID: https://github.com/python/cpython/commit/3be19c082b7f0ba3cf6c28922d3577126871788e commit: 3be19c082b7f0ba3cf6c28922d3577126871788e branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Vinay Sajip date: 2019-01-23T07:43:37Z summary: bpo-35781: Changed references to deprecated 'warn' method in logging documentation in favour of 'warning' (GH-11654) (GH-11657) (cherry picked from commit cda73a5af2ff064ca82140342b3158851d43868f) Co-authored-by: yuji38kwmt files: M Doc/howto/logging-cookbook.rst M Doc/howto/logging.rst diff --git a/Doc/howto/logging-cookbook.rst b/Doc/howto/logging-cookbook.rst index b1930a791fca..e391506ce2e4 100644 --- a/Doc/howto/logging-cookbook.rst +++ b/Doc/howto/logging-cookbook.rst @@ -186,7 +186,7 @@ previous simple module-based configuration example:: # 'application' code logger.debug('debug message') logger.info('info message') - logger.warn('warn message') + logger.warning('warn message') logger.error('error message') logger.critical('critical message') @@ -295,7 +295,7 @@ Here is an example of a module using the logging configuration server:: while True: logger.debug('debug message') logger.info('info message') - logger.warn('warn message') + logger.warning('warn message') logger.error('error message') logger.critical('critical message') time.sleep(5) diff --git a/Doc/howto/logging.rst b/Doc/howto/logging.rst index c90df4301fa6..7a68ca89199c 100644 --- a/Doc/howto/logging.rst +++ b/Doc/howto/logging.rst @@ -610,7 +610,7 @@ logger, a console handler, and a simple formatter using Python code:: # 'application' code logger.debug('debug message') logger.info('info message') - logger.warn('warn message') + logger.warning('warn message') logger.error('error message') logger.critical('critical message') @@ -640,7 +640,7 @@ the names of the objects:: # 'application' code logger.debug('debug message') logger.info('info message') - logger.warn('warn message') + logger.warning('warn message') logger.error('error message') logger.critical('critical message') From solipsis at pitrou.net Wed Jan 23 04:10:04 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 23 Jan 2019 09:10:04 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=2 Message-ID: <20190123091004.1.38F2D8A6E76E28F5@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_asyncio leaked [0, 0, 3] memory blocks, sum=3 test_collections leaked [-7, 8, -7] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [1, -2, 1] memory blocks, sum=0 test_multiprocessing_forkserver leaked [-1, 2, 0] memory blocks, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogoKPKir', '--timeout', '7200'] From webhook-mailer at python.org Wed Jan 23 09:04:43 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Wed, 23 Jan 2019 14:04:43 -0000 
Subject: [Python-checkins] bpo-35713: Reorganize sys module initialization (GH-11658) Message-ID: https://github.com/python/cpython/commit/ab67281e95de1a88c4379a75a547f19a8ba5ec30 commit: ab67281e95de1a88c4379a75a547f19a8ba5ec30 branch: master author: Victor Stinner committer: GitHub date: 2019-01-23T15:04:40+01:00 summary: bpo-35713: Reorganize sys module initialization (GH-11658) * Rename _PySys_BeginInit() to _PySys_InitCore(). * Rename _PySys_EndInit() to _PySys_InitMain(). * Add _PySys_Create(). It calls _PySys_InitCore() which becomes private. * Add _PySys_SetPreliminaryStderr(). * Rename _Py_ReadyTypes() to _PyTypes_Init(). * Misc code cleanup. files: M Include/internal/pycore_pylifecycle.h M Objects/object.c M Python/pylifecycle.c M Python/sysmodule.c diff --git a/Include/internal/pycore_pylifecycle.h b/Include/internal/pycore_pylifecycle.h index 6f5c54422330..acb7391c0a70 100644 --- a/Include/internal/pycore_pylifecycle.h +++ b/Include/internal/pycore_pylifecycle.h @@ -27,8 +27,11 @@ extern int _PyLong_Init(void); extern _PyInitError _PyFaulthandler_Init(int enable); extern int _PyTraceMalloc_Init(int enable); extern PyObject * _PyBuiltin_Init(void); -extern _PyInitError _PySys_BeginInit(PyObject **sysmod); -extern int _PySys_EndInit(PyObject *sysdict, PyInterpreterState *interp); +extern _PyInitError _PySys_Create( + PyInterpreterState *interp, + PyObject **sysmod_p); +extern _PyInitError _PySys_SetPreliminaryStderr(PyObject *sysdict); +extern int _PySys_InitMain(PyInterpreterState *interp); extern _PyInitError _PyImport_Init(PyInterpreterState *interp); extern _PyInitError _PyExc_Init(void); extern _PyInitError _PyBuiltins_AddExceptions(PyObject * bltinmod); @@ -36,7 +39,7 @@ extern _PyInitError _PyImportHooks_Init(void); extern int _PyFloat_Init(void); extern _PyInitError _Py_HashRandomization_Init(const _PyCoreConfig *); -extern _PyInitError _Py_ReadyTypes(void); +extern _PyInitError _PyTypes_Init(void); /* Various internal finalizers */ diff --git a/Objects/object.c b/Objects/object.c index 2171d53523f2..044342f24745 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -1717,7 +1717,7 @@ PyObject _Py_NotImplementedStruct = { }; _PyInitError -_Py_ReadyTypes(void) +_PyTypes_Init(void) { #define INIT_TYPE(TYPE, NAME) \ do { \ diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 86d87fb6030a..5d5ec4a63200 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -589,7 +589,7 @@ pycore_create_interpreter(const _PyCoreConfig *core_config, static _PyInitError pycore_init_types(void) { - _PyInitError err = _Py_ReadyTypes(); + _PyInitError err = _PyTypes_Init(); if (_Py_INIT_FAILED(err)) { return err; } @@ -623,46 +623,6 @@ pycore_init_types(void) } -static _PyInitError -pycore_init_sys(PyInterpreterState *interp, PyObject **sysmod_p) -{ - PyObject *modules = PyDict_New(); - if (modules == NULL) - return _Py_INIT_ERR("can't make modules dictionary"); - interp->modules = modules; - - PyObject *sysmod; - _PyInitError err = _PySys_BeginInit(&sysmod); - if (_Py_INIT_FAILED(err)) { - return err; - } - *sysmod_p = sysmod; - - interp->sysdict = PyModule_GetDict(sysmod); - if (interp->sysdict == NULL) { - return _Py_INIT_ERR("can't initialize sys dict"); - } - - Py_INCREF(interp->sysdict); - PyDict_SetItemString(interp->sysdict, "modules", modules); - _PyImport_FixupBuiltin(sysmod, "sys", modules); - - /* Set up a preliminary stderr printer until we have enough - infrastructure for the io module in place. - - Use UTF-8/surrogateescape and ignore EAGAIN errors. 
*/ - PyObject *pstderr = PyFile_NewStdPrinter(fileno(stderr)); - if (pstderr == NULL) { - return _Py_INIT_ERR("can't set preliminary stderr"); - } - _PySys_SetObjectId(&PyId_stderr, pstderr); - PySys_SetObject("__stderr__", pstderr); - Py_DECREF(pstderr); - - return _Py_INIT_OK(); -} - - static _PyInitError pycore_init_builtins(PyInterpreterState *interp) { @@ -746,7 +706,7 @@ _Py_InitializeCore_impl(PyInterpreterState **interp_p, } PyObject *sysmod; - err = pycore_init_sys(interp, &sysmod); + err = _PySys_Create(interp, &sysmod); if (_Py_INIT_FAILED(err)) { return err; } @@ -887,7 +847,7 @@ _Py_InitializeMainInterpreter(PyInterpreterState *interp, return _Py_INIT_ERR("can't initialize time"); } - if (_PySys_EndInit(interp->sysdict, interp) < 0) { + if (_PySys_InitMain(interp) < 0) { return _Py_INIT_ERR("can't finish initializing sys"); } @@ -1376,7 +1336,9 @@ new_interpreter(PyThreadState **tstate_p) goto handle_error; Py_INCREF(interp->sysdict); PyDict_SetItemString(interp->sysdict, "modules", modules); - _PySys_EndInit(interp->sysdict, interp); + if (_PySys_InitMain(interp) < 0) { + return _Py_INIT_ERR("can't finish initializing sys"); + } } else if (PyErr_Occurred()) { goto handle_error; @@ -1399,15 +1361,10 @@ new_interpreter(PyThreadState **tstate_p) return err; } - /* Set up a preliminary stderr printer until we have enough - infrastructure for the io module in place. */ - PyObject *pstderr = PyFile_NewStdPrinter(fileno(stderr)); - if (pstderr == NULL) { - return _Py_INIT_ERR("can't set preliminary stderr"); + err = _PySys_SetPreliminaryStderr(interp->sysdict); + if (_Py_INIT_FAILED(err)) { + return err; } - _PySys_SetObjectId(&PyId_stderr, pstderr); - PySys_SetObject("__stderr__", pstderr); - Py_DECREF(pstderr); err = _PyImportHooks_Init(); if (_Py_INIT_FAILED(err)) { diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 8efe1699422c..f1cd74ebeccf 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -2368,19 +2368,12 @@ static struct PyModuleDef sysmodule = { } \ } while (0) - -_PyInitError -_PySys_BeginInit(PyObject **sysmod) +static _PyInitError +_PySys_InitCore(PyObject *sysdict) { - PyObject *m, *sysdict, *version_info; + PyObject *version_info; int res; - m = _PyModule_CreateInitialized(&sysmodule, PYTHON_API_VERSION); - if (m == NULL) { - return _Py_INIT_ERR("failed to create a module object"); - } - sysdict = PyModule_GetDict(m); - /* stdin/stdout/stderr are set in pylifecycle.c */ SET_SYS_FROM_STRING_BORROW("__displayhook__", @@ -2508,9 +2501,6 @@ _PySys_BeginInit(PyObject **sysmod) if (PyErr_Occurred()) { goto err_occurred; } - - *sysmod = m; - return _Py_INIT_OK(); type_init_failed: @@ -2536,8 +2526,9 @@ _PySys_BeginInit(PyObject **sysmod) } while (0) int -_PySys_EndInit(PyObject *sysdict, PyInterpreterState *interp) +_PySys_InitMain(PyInterpreterState *interp) { + PyObject *sysdict = interp->sysdict; const _PyCoreConfig *core_config = &interp->core_config; const _PyMainInterpreterConfig *config = &interp->config; int res; @@ -2552,9 +2543,8 @@ _PySys_EndInit(PyObject *sysdict, PyInterpreterState *interp) #define COPY_LIST(KEY, ATTR) \ do { \ - assert(PyList_Check(config->ATTR)); \ - PyObject *list = PyList_GetSlice(config->ATTR, \ - 0, PyList_GET_SIZE(config->ATTR)); \ + assert(PyList_Check(ATTR)); \ + PyObject *list = PyList_GetSlice(ATTR, 0, PyList_GET_SIZE(ATTR)); \ if (list == NULL) { \ return -1; \ } \ @@ -2562,7 +2552,7 @@ _PySys_EndInit(PyObject *sysdict, PyInterpreterState *interp) Py_DECREF(list); \ } while (0) - COPY_LIST("path", 
module_search_path); + COPY_LIST("path", config->module_search_path); SET_SYS_FROM_STRING_BORROW("executable", config->executable); SET_SYS_FROM_STRING_BORROW("prefix", config->prefix); @@ -2580,7 +2570,7 @@ _PySys_EndInit(PyObject *sysdict, PyInterpreterState *interp) SET_SYS_FROM_STRING_BORROW("argv", config->argv); } if (config->warnoptions != NULL) { - COPY_LIST("warnoptions", warnoptions); + COPY_LIST("warnoptions", config->warnoptions); } if (config->xoptions != NULL) { PyObject *dict = PyDict_Copy(config->xoptions); @@ -2631,6 +2621,77 @@ _PySys_EndInit(PyObject *sysdict, PyInterpreterState *interp) #undef SET_SYS_FROM_STRING_BORROW #undef SET_SYS_FROM_STRING_INT_RESULT + +/* Set up a preliminary stderr printer until we have enough + infrastructure for the io module in place. + + Use UTF-8/surrogateescape and ignore EAGAIN errors. */ +_PyInitError +_PySys_SetPreliminaryStderr(PyObject *sysdict) +{ + PyObject *pstderr = PyFile_NewStdPrinter(fileno(stderr)); + if (pstderr == NULL) { + goto error; + } + if (_PyDict_SetItemId(sysdict, &PyId_stderr, pstderr) < 0) { + goto error; + } + if (PyDict_SetItemString(sysdict, "__stderr__", pstderr) < 0) { + goto error; + } + Py_DECREF(pstderr); + return _Py_INIT_OK(); + +error: + Py_XDECREF(pstderr); + return _Py_INIT_ERR("can't set preliminary stderr"); +} + + +/* Create sys module without all attributes: _PySys_InitMain() should be called + later to add remaining attributes. */ +_PyInitError +_PySys_Create(PyInterpreterState *interp, PyObject **sysmod_p) +{ + PyObject *modules = PyDict_New(); + if (modules == NULL) { + return _Py_INIT_ERR("can't make modules dictionary"); + } + interp->modules = modules; + + PyObject *sysmod = _PyModule_CreateInitialized(&sysmodule, PYTHON_API_VERSION); + if (sysmod == NULL) { + return _Py_INIT_ERR("failed to create a module object"); + } + + PyObject *sysdict = PyModule_GetDict(sysmod); + if (sysdict == NULL) { + return _Py_INIT_ERR("can't initialize sys dict"); + } + Py_INCREF(sysdict); + interp->sysdict = sysdict; + + if (PyDict_SetItemString(sysdict, "modules", interp->modules) < 0) { + return _Py_INIT_ERR("can't initialize sys module"); + } + + _PyInitError err = _PySys_SetPreliminaryStderr(sysdict); + if (_Py_INIT_FAILED(err)) { + return err; + } + + err = _PySys_InitCore(sysdict); + if (_Py_INIT_FAILED(err)) { + return err; + } + + _PyImport_FixupBuiltin(sysmod, "sys", interp->modules); + + *sysmod_p = sysmod; + return _Py_INIT_OK(); +} + + static PyObject * makepathobject(const wchar_t *path, wchar_t delim) { From webhook-mailer at python.org Wed Jan 23 13:00:45 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Wed, 23 Jan 2019 18:00:45 -0000 Subject: [Python-checkins] bpo-35537: subprocess can use posix_spawn with pipes (GH-11575) Message-ID: https://github.com/python/cpython/commit/f6243ac1e4828299fe5a8e943d7bd41cab1f34cd commit: f6243ac1e4828299fe5a8e943d7bd41cab1f34cd branch: master author: Victor Stinner committer: GitHub date: 2019-01-23T19:00:39+01:00 summary: bpo-35537: subprocess can use posix_spawn with pipes (GH-11575) * subprocess.Popen can now also use os.posix_spawn() with pipes, but only if pipe file descriptors are greater than 2. * Fix Popen._posix_spawn(): set '_child_created' attribute to True. * Add Popen._close_pipe_fds() helper function to factorize the code. 
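A pipe-using call that satisfies the conditions above can now be served by os.posix_spawn() on Linux (glibc 2.24 or newer) and macOS: close_fds must be false, none of preexec_fn, pass_fds, cwd or start_new_session may be given, and the executable path must contain a directory (internally the pipe file descriptors must also be greater than 2, which is the usual case). A minimal sketch with an arbitrary command, not taken from the commit:

    import subprocess

    # Candidate for the os.posix_spawn() fast path: the executable path contains
    # a directory, close_fds is False, and preexec_fn/pass_fds/cwd/
    # start_new_session are not used; stdout/stderr pipes no longer disqualify it.
    proc = subprocess.Popen(
        ["/bin/echo", "hello"],
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        close_fds=False,
    )
    out, err = proc.communicate()
    print(out.decode().strip())
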
files: M Doc/whatsnew/3.8.rst M Lib/subprocess.py diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index 8b38cce35368..ce46afd48158 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -280,8 +280,8 @@ Optimizations and Linux (using glibc 2.24 or newer) if all these conditions are met: * *close_fds* is false; - * *preexec_fn*, *pass_fds*, *cwd*, *stdin*, *stdout*, *stderr* and - *start_new_session* parameters are not set; + * *preexec_fn*, *pass_fds*, *cwd* and *start_new_session* parameters + are not set; * the *executable* path contains a directory. * :func:`shutil.copyfile`, :func:`shutil.copy`, :func:`shutil.copy2`, diff --git a/Lib/subprocess.py b/Lib/subprocess.py index b94575b8401e..2300c7352e0d 100644 --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -1066,6 +1066,34 @@ def wait(self, timeout=None): pass raise # resume the KeyboardInterrupt + def _close_pipe_fds(self, + p2cread, p2cwrite, + c2pread, c2pwrite, + errread, errwrite): + # self._devnull is not always defined. + devnull_fd = getattr(self, '_devnull', None) + + if _mswindows: + if p2cread != -1: + p2cread.Close() + if c2pwrite != -1: + c2pwrite.Close() + if errwrite != -1: + errwrite.Close() + else: + if p2cread != -1 and p2cwrite != -1 and p2cread != devnull_fd: + os.close(p2cread) + if c2pwrite != -1 and c2pread != -1 and c2pwrite != devnull_fd: + os.close(c2pwrite) + if errwrite != -1 and errread != -1 and errwrite != devnull_fd: + os.close(errwrite) + + if devnull_fd is not None: + os.close(devnull_fd) + + # Prevent a double close of these handles/fds from __init__ on error. + self._closed_child_pipe_fds = True + if _mswindows: # @@ -1244,17 +1272,9 @@ def _execute_child(self, args, executable, preexec_fn, close_fds, # output pipe are maintained in this process or else the # pipe will not close when the child process exits and the # ReadFile will hang. - if p2cread != -1: - p2cread.Close() - if c2pwrite != -1: - c2pwrite.Close() - if errwrite != -1: - errwrite.Close() - if hasattr(self, '_devnull'): - os.close(self._devnull) - # Prevent a double close of these handles/fds from __init__ - # on error. 
- self._closed_child_pipe_fds = True + self._close_pipe_fds(p2cread, p2cwrite, + c2pread, c2pwrite, + errread, errwrite) # Retain the process handle, but close the thread handle self._child_created = True @@ -1441,7 +1461,10 @@ def _get_handles(self, stdin, stdout, stderr): errread, errwrite) - def _posix_spawn(self, args, executable, env, restore_signals): + def _posix_spawn(self, args, executable, env, restore_signals, + p2cread, p2cwrite, + c2pread, c2pwrite, + errread, errwrite): """Execute program using os.posix_spawn().""" if env is None: env = os.environ @@ -1456,7 +1479,26 @@ def _posix_spawn(self, args, executable, env, restore_signals): sigset.append(signum) kwargs['setsigdef'] = sigset + file_actions = [] + for fd in (p2cwrite, c2pread, errread): + if fd != -1: + file_actions.append((os.POSIX_SPAWN_CLOSE, fd)) + for fd, fd2 in ( + (p2cread, 0), + (c2pwrite, 1), + (errwrite, 2), + ): + if fd != -1: + file_actions.append((os.POSIX_SPAWN_DUP2, fd, fd2)) + if file_actions: + kwargs['file_actions'] = file_actions + self.pid = os.posix_spawn(executable, args, env, **kwargs) + self._child_created = True + + self._close_pipe_fds(p2cread, p2cwrite, + c2pread, c2pwrite, + errread, errwrite) def _execute_child(self, args, executable, preexec_fn, close_fds, pass_fds, cwd, env, @@ -1489,11 +1531,14 @@ def _execute_child(self, args, executable, preexec_fn, close_fds, and not close_fds and not pass_fds and cwd is None - and p2cread == p2cwrite == -1 - and c2pread == c2pwrite == -1 - and errread == errwrite == -1 + and (p2cread == -1 or p2cread > 2) + and (c2pwrite == -1 or c2pwrite > 2) + and (errwrite == -1 or errwrite > 2) and not start_new_session): - self._posix_spawn(args, executable, env, restore_signals) + self._posix_spawn(args, executable, env, restore_signals, + p2cread, p2cwrite, + c2pread, c2pwrite, + errread, errwrite) return orig_executable = executable @@ -1548,18 +1593,9 @@ def _execute_child(self, args, executable, preexec_fn, close_fds, # be sure the FD is closed no matter what os.close(errpipe_write) - # self._devnull is not always defined. - devnull_fd = getattr(self, '_devnull', None) - if p2cread != -1 and p2cwrite != -1 and p2cread != devnull_fd: - os.close(p2cread) - if c2pwrite != -1 and c2pread != -1 and c2pwrite != devnull_fd: - os.close(c2pwrite) - if errwrite != -1 and errread != -1 and errwrite != devnull_fd: - os.close(errwrite) - if devnull_fd is not None: - os.close(devnull_fd) - # Prevent a double close of these fds from __init__ on error. 
- self._closed_child_pipe_fds = True + self._close_pipe_fds(p2cread, p2cwrite, + c2pread, c2pwrite, + errread, errwrite) # Wait for exec to fail or succeed; possibly raising an # exception (limited in size) From webhook-mailer at python.org Wed Jan 23 15:57:37 2019 From: webhook-mailer at python.org (=?utf-8?q?=C5=81ukasz?= Langa) Date: Wed, 23 Jan 2019 20:57:37 -0000 Subject: [Python-checkins] bpo-35767: Fix unittest.loader to allow partials as test_functions (#11600) Message-ID: https://github.com/python/cpython/commit/fd628cf5adaeee73eab579393cdff71c8f70cdf2 commit: fd628cf5adaeee73eab579393cdff71c8f70cdf2 branch: master author: Jason Fried committer: ?ukasz Langa date: 2019-01-23T21:57:25+01:00 summary: bpo-35767: Fix unittest.loader to allow partials as test_functions (#11600) files: M Lib/unittest/loader.py M Lib/unittest/test/test_loader.py diff --git a/Lib/unittest/loader.py b/Lib/unittest/loader.py index d936a96e73fb..ba7105e1ad60 100644 --- a/Lib/unittest/loader.py +++ b/Lib/unittest/loader.py @@ -229,7 +229,9 @@ def shouldIncludeMethod(attrname): testFunc = getattr(testCaseClass, attrname) if not callable(testFunc): return False - fullName = '%s.%s' % (testCaseClass.__module__, testFunc.__qualname__) + fullName = f'%s.%s.%s' % ( + testCaseClass.__module__, testCaseClass.__qualname__, attrname + ) return self.testNamePatterns is None or \ any(fnmatchcase(fullName, pattern) for pattern in self.testNamePatterns) testFnNames = list(filter(shouldIncludeMethod, dir(testCaseClass))) diff --git a/Lib/unittest/test/test_loader.py b/Lib/unittest/test/test_loader.py index bfd722940b56..bc54bf055352 100644 --- a/Lib/unittest/test/test_loader.py +++ b/Lib/unittest/test/test_loader.py @@ -1,3 +1,4 @@ +import functools import sys import types import warnings @@ -1575,5 +1576,20 @@ def test_suiteClass__default_value(self): self.assertIs(loader.suiteClass, unittest.TestSuite) + def test_partial_functions(self): + def noop(arg): + pass + + class Foo(unittest.TestCase): + pass + + setattr(Foo, 'test_partial', functools.partial(noop, None)) + + loader = unittest.TestLoader() + + test_names = ['test_partial'] + self.assertEqual(loader.getTestCaseNames(Foo), test_names) + + if __name__ == "__main__": unittest.main() From solipsis at pitrou.net Thu Jan 24 04:14:39 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 24 Jan 2019 09:14:39 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=-1 Message-ID: <20190124091439.1.2AB7C09422064A39@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [0, 0, -7] memory blocks, sum=-7 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [0, 2, 0] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogcv5cAV', '--timeout', '7200'] From webhook-mailer at python.org Thu Jan 24 12:29:55 2019 From: webhook-mailer at python.org (=?utf-8?q?=C5=81ukasz?= Langa) Date: Thu, 24 Jan 2019 17:29:55 -0000 Subject: [Python-checkins] bpo-35520: Fix build with dtrace support on certain systems. (#11194) Message-ID: https://github.com/python/cpython/commit/5c8f537669d3379fc50bb0a96accac756e43e281 commit: 5c8f537669d3379fc50bb0a96accac756e43e281 branch: master author: Jakub Kul?k committer: ?ukasz Langa date: 2019-01-24T18:29:48+01:00 summary: bpo-35520: Fix build with dtrace support on certain systems. 
(#11194) files: M Makefile.pre.in M configure M configure.ac diff --git a/Makefile.pre.in b/Makefile.pre.in index 3c77a0e9fe32..f8216971958a 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -451,8 +451,7 @@ LIBRARY_OBJS= \ # On some systems, object files that reference DTrace probes need to be modified # in-place by dtrace(1). DTRACE_DEPS = \ - Python/ceval.o -# XXX: should gcmodule, etc. be here, too? + Python/ceval.o Python/import.o Modules/gcmodule.o ######################################################################### # Rules @@ -628,7 +627,7 @@ $(LIBRARY): $(LIBRARY_OBJS) -rm -f $@ $(AR) $(ARFLAGS) $@ $(LIBRARY_OBJS) -libpython$(LDVERSION).so: $(LIBRARY_OBJS) +libpython$(LDVERSION).so: $(LIBRARY_OBJS) $(DTRACE_OBJS) if test $(INSTSONAME) != $(LDLIBRARY); then \ $(BLDSHARED) -Wl,-h$(INSTSONAME) -o $(INSTSONAME) $(LIBRARY_OBJS) $(MODLIBS) $(SHLIBS) $(LIBC) $(LIBM); \ $(LN) -f $(INSTSONAME) $@; \ @@ -640,7 +639,7 @@ libpython3.so: libpython$(LDVERSION).so $(BLDSHARED) $(NO_AS_NEEDED) -o $@ -Wl,-h$@ $^ libpython$(LDVERSION).dylib: $(LIBRARY_OBJS) - $(CC) -dynamiclib -Wl,-single_module $(PY_CORE_LDFLAGS) -undefined dynamic_lookup -Wl,-install_name,$(prefix)/lib/libpython$(LDVERSION).dylib -Wl,-compatibility_version,$(VERSION) -Wl,-current_version,$(VERSION) -o $@ $(LIBRARY_OBJS) $(SHLIBS) $(LIBC) $(LIBM); \ + $(CC) -dynamiclib -Wl,-single_module $(PY_CORE_LDFLAGS) -undefined dynamic_lookup -Wl,-install_name,$(prefix)/lib/libpython$(LDVERSION).dylib -Wl,-compatibility_version,$(VERSION) -Wl,-current_version,$(VERSION) -o $@ $(LIBRARY_OBJS) $(DTRACE_OBJS) $(SHLIBS) $(LIBC) $(LIBM); \ libpython$(VERSION).sl: $(LIBRARY_OBJS) @@ -752,6 +751,7 @@ Modules/getbuildinfo.o: $(PARSER_OBJS) \ $(PYTHON_OBJS) \ $(MODULE_OBJS) \ $(MODOBJS) \ + $(DTRACE_OBJS) \ $(srcdir)/Modules/getbuildinfo.c $(CC) -c $(PY_CORE_CFLAGS) \ -DGITVERSION="\"`LC_ALL=C $(GITVERSION)`\"" \ @@ -954,6 +954,10 @@ Include/pydtrace_probes.h: $(srcdir)/Include/pydtrace.d sed 's/PYTHON_/PyDTrace_/' $@ > $@.tmp mv $@.tmp $@ +Python/ceval.o: Include/pydtrace.h +Python/import.o: Include/pydtrace.h +Modules/gcmodule.o: Include/pydtrace.h + Python/pydtrace.o: $(srcdir)/Include/pydtrace.d $(DTRACE_DEPS) $(DTRACE) $(DFLAGS) -o $@ -G -s $< $(DTRACE_DEPS) diff --git a/configure b/configure index b32481dca03f..ebd9f904b09a 100755 --- a/configure +++ b/configure @@ -11352,7 +11352,7 @@ if ${ac_cv_dtrace_link+:} false; then : $as_echo_n "(cached) " >&6 else ac_cv_dtrace_link=no - echo 'BEGIN' > conftest.d + echo 'BEGIN{}' > conftest.d "$DTRACE" -G -s conftest.d -o conftest.o > /dev/null 2>&1 && \ ac_cv_dtrace_link=yes diff --git a/configure.ac b/configure.ac index 262c72668a95..721edb015ea3 100644 --- a/configure.ac +++ b/configure.ac @@ -3429,7 +3429,7 @@ then AC_CACHE_CHECK([whether DTrace probes require linking], [ac_cv_dtrace_link], [dnl ac_cv_dtrace_link=no - echo 'BEGIN' > conftest.d + echo 'BEGIN{}' > conftest.d "$DTRACE" -G -s conftest.d -o conftest.o > /dev/null 2>&1 && \ ac_cv_dtrace_link=yes ]) From webhook-mailer at python.org Thu Jan 24 12:31:04 2019 From: webhook-mailer at python.org (=?utf-8?q?=C5=81ukasz?= Langa) Date: Thu, 24 Jan 2019 17:31:04 -0000 Subject: [Python-checkins] bpo-35767: Fix unittest.loader to allow partials as test_functions (GH-11600) (#11662) Message-ID: https://github.com/python/cpython/commit/841387dd43e67b1800d10e4d7ce1f8cedc9f3706 commit: 841387dd43e67b1800d10e4d7ce1f8cedc9f3706 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: ?ukasz Langa 
date: 2019-01-24T18:30:59+01:00 summary: bpo-35767: Fix unittest.loader to allow partials as test_functions (GH-11600) (#11662) (cherry picked from commit fd628cf5adaeee73eab579393cdff71c8f70cdf2) Co-authored-by: Jason Fried files: M Lib/unittest/loader.py M Lib/unittest/test/test_loader.py diff --git a/Lib/unittest/loader.py b/Lib/unittest/loader.py index d936a96e73fb..ba7105e1ad60 100644 --- a/Lib/unittest/loader.py +++ b/Lib/unittest/loader.py @@ -229,7 +229,9 @@ def shouldIncludeMethod(attrname): testFunc = getattr(testCaseClass, attrname) if not callable(testFunc): return False - fullName = '%s.%s' % (testCaseClass.__module__, testFunc.__qualname__) + fullName = f'%s.%s.%s' % ( + testCaseClass.__module__, testCaseClass.__qualname__, attrname + ) return self.testNamePatterns is None or \ any(fnmatchcase(fullName, pattern) for pattern in self.testNamePatterns) testFnNames = list(filter(shouldIncludeMethod, dir(testCaseClass))) diff --git a/Lib/unittest/test/test_loader.py b/Lib/unittest/test/test_loader.py index bfd722940b56..bc54bf055352 100644 --- a/Lib/unittest/test/test_loader.py +++ b/Lib/unittest/test/test_loader.py @@ -1,3 +1,4 @@ +import functools import sys import types import warnings @@ -1575,5 +1576,20 @@ def test_suiteClass__default_value(self): self.assertIs(loader.suiteClass, unittest.TestSuite) + def test_partial_functions(self): + def noop(arg): + pass + + class Foo(unittest.TestCase): + pass + + setattr(Foo, 'test_partial', functools.partial(noop, None)) + + loader = unittest.TestLoader() + + test_names = ['test_partial'] + self.assertEqual(loader.getTestCaseNames(Foo), test_names) + + if __name__ == "__main__": unittest.main() From webhook-mailer at python.org Thu Jan 24 14:43:18 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 24 Jan 2019 19:43:18 -0000 Subject: [Python-checkins] bpo-35717: Fix KeyError exception raised when using enums and compile (GH-11523) Message-ID: https://github.com/python/cpython/commit/1fd06f1eca80dcbf3a916133919482a8327f3da4 commit: 1fd06f1eca80dcbf3a916133919482a8327f3da4 branch: master author: R?mi Lapeyre committer: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> date: 2019-01-24T11:43:13-08:00 summary: bpo-35717: Fix KeyError exception raised when using enums and compile (GH-11523) https://bugs.python.org/issue17467 files: A Misc/NEWS.d/next/Library/2019-01-11-17-56-15.bpo-35717.6TDTB_.rst M Lib/enum.py M Lib/test/test_enum.py M Misc/ACKS diff --git a/Lib/enum.py b/Lib/enum.py index f7452f0cc0aa..a958ed8748af 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -419,7 +419,7 @@ def _create_(cls, class_name, names, *, module=None, qualname=None, type=None, s if module is None: try: module = sys._getframe(2).f_globals['__name__'] - except (AttributeError, ValueError) as exc: + except (AttributeError, ValueError, KeyError) as exc: pass if module is None: _make_class_unpicklable(enum_class) diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 572e8733f45b..99fc85074b70 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -1858,6 +1858,15 @@ class Decision2(MyEnum): REVERT_ALL = "REVERT_ALL" RETRY = "RETRY" + def test_empty_globals(self): + # bpo-35717: sys._getframe(2).f_globals['__name__'] fails with KeyError + # when using compile and exec because f_globals is empty + code = "from enum import Enum; Enum('Animal', 'ANT BEE CAT DOG')" + code = compile(code, "", "exec") + global_ns = {} + local_ls = {} + exec(code, global_ns, local_ls) + class TestOrder(unittest.TestCase): 
diff --git a/Misc/ACKS b/Misc/ACKS index 81b51f751991..c9fa08bd6141 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -906,6 +906,7 @@ Glenn Langford Andrew Langmead Wolfgang Langner Detlef Lannert +R?mi Lapeyre Soren Larsen Amos Latteier Piers Lauder diff --git a/Misc/NEWS.d/next/Library/2019-01-11-17-56-15.bpo-35717.6TDTB_.rst b/Misc/NEWS.d/next/Library/2019-01-11-17-56-15.bpo-35717.6TDTB_.rst new file mode 100644 index 000000000000..7cae1d1c82c7 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-11-17-56-15.bpo-35717.6TDTB_.rst @@ -0,0 +1,2 @@ +Fix KeyError exception raised when using enums and compile. Patch +contributed by R?mi Lapeyre. From webhook-mailer at python.org Thu Jan 24 18:50:01 2019 From: webhook-mailer at python.org (Emily Morehouse) Date: Thu, 24 Jan 2019 23:50:01 -0000 Subject: [Python-checkins] bpo-35224: PEP 572 Implementation (#10497) Message-ID: https://github.com/python/cpython/commit/8f59ee01be3d83d5513a9a3f654a237d77d80d9a commit: 8f59ee01be3d83d5513a9a3f654a237d77d80d9a branch: master author: Emily Morehouse committer: GitHub date: 2019-01-24T16:49:56-07:00 summary: bpo-35224: PEP 572 Implementation (#10497) * Add tokenization of := - Add token to Include/token.h. Add token to documentation in Doc/library/token.rst. - Run `./python Lib/token.py` to regenerate Lib/token.py. - Update Parser/tokenizer.c: add case to handle `:=`. * Add initial usage of := in grammar. * Update Python.asdl to match the grammar updates. Regenerated Include/Python-ast.h and Python/Python-ast.c * Update AST and compiler files in Python/ast.c and Python/compile.c. Basic functionality, this isn't scoped properly * Regenerate Lib/symbol.py using `./python Lib/symbol.py` * Tests - Fix failing tests in test_parser.py due to changes in token numbers for internal representation * Tests - Add simple test for := token * Tests - Add simple tests for named expressions using expr and suite * Tests - Update number of levels for nested expressions to prevent stack overflow * Update symbol table to handle NamedExpr * Update Grammar to allow assignment expressions in if statements. Regenerate Python/graminit.c accordingly using `make regen-grammar` * Tests - Add additional tests for named expressions in RoundtripLegalSyntaxTestCase, based on examples and information directly from PEP 572 Note: failing tests are currently commented out (4 out of 24 tests currently fail) * Tests - Add temporary syntax test failure tests in test_parser.py Note: There is an outstanding TODO for this -- syntax tests need to be moved to a different file (presumably test_syntax.py), but this is covering what needs to be tested at the moment, and it's more convenient to run a single test for the time being * Add support for allowing assignment expressions as function argument annotations. Uncomment tests for these cases because they all pass now! * Tests - Move existing syntax tests out of test_parser.py and into test_named_expressions.py. 
Refactor syntax tests to use unittest * Add TargetScopeError exception to extend SyntaxError Note: This simply creates the TargetScopeError exception, it is not yet used anywhere * Tests - Update tests per PEP 572 Continue refactoring test suite: The named expression test suite now checks for any invalid cases that throw exceptions (no longer limited to SyntaxErrors), assignment tests to ensure that variables are properly assigned, and scope tests to ensure that variable availability and values are correct Note: - There are still tests that are marked to skip, as they are not yet implemented - There are approximately 300 lines of the PEP that have not yet been addressed, though these may be deferred * Documentation - Small updates to XXX/todo comments - Remove XXX from child description in ast.c - Add comment with number of previously supported nested expressions for 3.7.X in test_parser.py * Fix assert in seq_for_testlist() * Cleanup - Denote "Not implemented -- No keyword args" on failing test case. Fix PEP8 error for blank lines at beginning of test classes in test_parser.py * Tests - Wrap all file opens in `with...as` to ensure files are closed * WIP: handle f(a := 1) * Tests and Cleanup - No longer skips keyword arg test. Keyword arg test now uses a simpler test case and does not rely on an external file. Remove print statements from ast.c * Tests - Refactor last remaining test case that relied on on external file to use a simpler test case without the dependency * Tests - Add better description of remaning skipped tests. Add test checking scope when using assignment expression in a function argument * Tests - Add test for nested comprehension, testing value and scope. Fix variable name in skipped comprehension scope test * Handle restriction of LHS for named expressions - can only assign to LHS of type NAME. Specifically, restrict assignment to tuples This adds an alternative set_context specifically for named expressions, set_namedexpr_context. Thus, context is now set differently for standard assignment versus assignment for named expressions in order to handle restrictions. * Tests - Update negative test case for assigning to lambda to match new error message. Add negative test case for assigning to tuple * Tests - Reorder test cases to group invalid syntax cases and named assignment target errors * Tests - Update test case for named expression in function argument - check that result and variable are set correctly * Todo - Add todo for TargetScopeError based on Guido's comment (https://github.com/python/cpython/commit/2b3acd37bdfc2d35e5094228c6684050d2aa8b0a#r30472562) * Tests - Add named expression tests for assignment operator in function arguments Note: One of two tests are skipped, as function arguments are currently treating an assignment expression inside of parenthesis as one child, which does not properly catch the named expression, nor does it count arguments properly * Add NamedStore to expr_context. Regenerate related code with `make regen-ast` * Add usage of NamedStore to ast_for_named_expr in ast.c. Update occurances of checking for Store to also handle NamedStore where appropriate * Add ste_comprehension to _symtable_entry to track if the namespace is a comprehension. Initialize ste_comprehension to 0. Set set_comprehension to 1 in symtable_handle_comprehension * s/symtable_add_def/symtable_add_def_helper. Add symtable_add_def to handle grabbing st->st_cur and passing it to symtable_add_def_helper. 
This now allows us to call the original code from symtable_add_def by instead calling symtable_add_def_helper with a different ste. * Refactor symtable_record_directive to take lineno and col_offset as arguments instead of stmt_ty. This allows symtable_record_directive to be used for stmt_ty and expr_ty * Handle elevating scope for named expressions in comprehensions. * Handle error for usage of named expression inside a class block * Tests - No longer skip scope tests. Add additional scope tests * Cleanup - Update error message for named expression within a comprehension within a class. Update comments. Add assert for symtable_extend_namedexpr_scope to validate that we always find at least a ModuleScope if we don't find a Class or FunctionScope * Cleanup - Add missing case for NamedStore in expr_context_name. Remove unused var in set_namedexpr_content * Refactor - Consolidate set_context and set_namedexpr_context to reduce duplicated code. Special cases for named expressions are handled by checking if ctx is NamedStore * Cleanup - Add additional use cases for ast_for_namedexpr in usage comment. Fix multiple blank lines in test_named_expressions * Tests - Remove unnecessary test case. Renumber test case function names * Remove TargetScopeError for now. Will add back if needed * Cleanup - Small comment nit for consistency * Handle positional argument check with named expression * Add TargetScopeError exception definition. Add documentation for TargetScopeError in c-api docs. Throw TargetScopeError instead of SyntaxError when using a named expression in a comprehension within a class scope * Increase stack size for parser by 200. This is a minimal change (approx. 5kb) and should not have an impact on any systems. Update parser test to allow 99 nested levels again * Add TargetScopeError to exception_hierarchy.txt for test_baseexception.py_ * Tests - Major update for named expression tests, both in test_named_expressions and test_parser - Add test for TargetScopeError - Add tests for named expressions in comprehension scope and edge cases - Add tests for named expressions in function arguments (declarations and call sites) - Reorganize tests to group them more logically * Cleanup - Remove unnecessary comment * Cleanup - Comment nitpicks * Explicitly disallow assignment expressions to a name inside parentheses, e.g.: ((x) := 0) - Add check for LHS types to detect a parenthesis then a name (see note) - Add test for this scenario - Update tests for changed error message for named assignment to a tuple (also, see note) Note: This caused issues with the previous error handling for named assignment to a LHS that contained an expression, such as a tuple. Thus, the check for the LHS of a named expression must be changed to be more specific if we wish to maintain the previous error messages * Cleanup - Wrap lines more strictly in test file * Revert "Explicitly disallow assignment expressions to a name inside parentheses, e.g.: ((x) := 0)" This reverts commit f1531400ca7d7a2d148830c8ac703f041740896d. * Add NEWS.d entry * Tests - Fix error in test_pickle.test_exceptions by adding TargetScopeError to list of exceptions * Tests - Update error message tests to reflect improved messaging convention (s/can't/cannot) * Remove cases that cannot be reached in compile.c. Small linting update. * Update Grammar/Tokens to add COLONEQUAL. 
Regenerate all files * Update TargetScopeError PRE_INIT and POST_INIT, as this was purposefully left out when fixing rebase conflicts * Add NamedStore back and regenerate files * Pass along line number and end col info for named expression * Simplify News entry * Fix compiler warning and explicity mark fallthrough files: A Lib/test/test_named_expressions.py A Misc/NEWS.d/next/Core and Builtins/2018-11-13-14-26-54.bpo-35224.F0B6UQ.rst M Doc/c-api/exceptions.rst M Doc/library/token-list.inc M Grammar/Grammar M Grammar/Tokens M Include/Python-ast.h M Include/graminit.h M Include/pyerrors.h M Include/symtable.h M Include/token.h M Lib/_compat_pickle.py M Lib/symbol.py M Lib/test/exception_hierarchy.txt M Lib/test/test_parser.py M Lib/test/test_tokenize.py M Lib/token.py M Objects/exceptions.c M PC/python3.def M Parser/Python.asdl M Parser/parser.h M Parser/token.c M Python/Python-ast.c M Python/ast.c M Python/compile.c M Python/graminit.c M Python/symtable.c diff --git a/Doc/c-api/exceptions.rst b/Doc/c-api/exceptions.rst index dd1e026cb071..6dd0c02c6d7b 100644 --- a/Doc/c-api/exceptions.rst +++ b/Doc/c-api/exceptions.rst @@ -800,6 +800,7 @@ the variables: single: PyExc_SystemError single: PyExc_SystemExit single: PyExc_TabError + single: PyExc_TargetScopeError single: PyExc_TimeoutError single: PyExc_TypeError single: PyExc_UnboundLocalError @@ -901,6 +902,8 @@ the variables: +-----------------------------------------+---------------------------------+----------+ | :c:data:`PyExc_TabError` | :exc:`TabError` | | +-----------------------------------------+---------------------------------+----------+ +| :c:data:`PyExc_TargetScopeError` | :exc:`TargetScopeError` | | ++-----------------------------------------+---------------------------------+----------+ | :c:data:`PyExc_TimeoutError` | :exc:`TimeoutError` | | +-----------------------------------------+---------------------------------+----------+ | :c:data:`PyExc_TypeError` | :exc:`TypeError` | | diff --git a/Doc/library/token-list.inc b/Doc/library/token-list.inc index cd6e0f26968e..3ea9439be859 100644 --- a/Doc/library/token-list.inc +++ b/Doc/library/token-list.inc @@ -197,6 +197,10 @@ Token value for ``"..."``. +.. data:: COLONEQUAL + + Token value for ``":="``. + .. data:: OP .. data:: ERRORTOKEN diff --git a/Grammar/Grammar b/Grammar/Grammar index e232df979e2d..f21fa1136432 100644 --- a/Grammar/Grammar +++ b/Grammar/Grammar @@ -69,7 +69,7 @@ assert_stmt: 'assert' test [',' test] compound_stmt: if_stmt | while_stmt | for_stmt | try_stmt | with_stmt | funcdef | classdef | decorated | async_stmt async_stmt: 'async' (funcdef | with_stmt | for_stmt) -if_stmt: 'if' test ':' suite ('elif' test ':' suite)* ['else' ':' suite] +if_stmt: 'if' namedexpr_test ':' suite ('elif' namedexpr_test ':' suite)* ['else' ':' suite] while_stmt: 'while' test ':' suite ['else' ':' suite] for_stmt: 'for' exprlist 'in' testlist ':' suite ['else' ':' suite] try_stmt: ('try' ':' suite @@ -83,6 +83,7 @@ with_item: test ['as' expr] except_clause: 'except' [test ['as' NAME]] suite: simple_stmt | NEWLINE INDENT stmt+ DEDENT +namedexpr_test: test [':=' test] test: or_test ['if' or_test 'else' test] | lambdef test_nocond: or_test | lambdef_nocond lambdef: 'lambda' [varargslist] ':' test @@ -108,7 +109,7 @@ atom: ('(' [yield_expr|testlist_comp] ')' | '[' [testlist_comp] ']' | '{' [dictorsetmaker] '}' | NAME | NUMBER | STRING+ | '...' 
| 'None' | 'True' | 'False') -testlist_comp: (test|star_expr) ( comp_for | (',' (test|star_expr))* [','] ) +testlist_comp: (namedexpr_test|star_expr) ( comp_for | (',' (namedexpr_test|star_expr))* [','] ) trailer: '(' [arglist] ')' | '[' subscriptlist ']' | '.' NAME subscriptlist: subscript (',' subscript)* [','] subscript: test | [test] ':' [test] [sliceop] @@ -134,6 +135,7 @@ arglist: argument (',' argument)* [','] # multiple (test comp_for) arguments are blocked; keyword unpackings # that precede iterable unpackings are blocked; etc. argument: ( test [comp_for] | + test ':=' test | test '=' test | '**' test | '*' test ) diff --git a/Grammar/Tokens b/Grammar/Tokens index 9595673a5af7..f6f303bd5292 100644 --- a/Grammar/Tokens +++ b/Grammar/Tokens @@ -52,6 +52,7 @@ AT '@' ATEQUAL '@=' RARROW '->' ELLIPSIS '...' +COLONEQUAL ':=' OP ERRORTOKEN diff --git a/Include/Python-ast.h b/Include/Python-ast.h index f8394e6c26ad..3527ae8949e7 100644 --- a/Include/Python-ast.h +++ b/Include/Python-ast.h @@ -17,7 +17,7 @@ typedef struct _stmt *stmt_ty; typedef struct _expr *expr_ty; typedef enum _expr_context { Load=1, Store=2, Del=3, AugLoad=4, AugStore=5, - Param=6 } expr_context_ty; + Param=6, NamedStore=7 } expr_context_ty; typedef struct _slice *slice_ty; @@ -214,14 +214,14 @@ struct _stmt { int end_col_offset; }; -enum _expr_kind {BoolOp_kind=1, BinOp_kind=2, UnaryOp_kind=3, Lambda_kind=4, - IfExp_kind=5, Dict_kind=6, Set_kind=7, ListComp_kind=8, - SetComp_kind=9, DictComp_kind=10, GeneratorExp_kind=11, - Await_kind=12, Yield_kind=13, YieldFrom_kind=14, - Compare_kind=15, Call_kind=16, FormattedValue_kind=17, - JoinedStr_kind=18, Constant_kind=19, Attribute_kind=20, - Subscript_kind=21, Starred_kind=22, Name_kind=23, - List_kind=24, Tuple_kind=25}; +enum _expr_kind {BoolOp_kind=1, NamedExpr_kind=2, BinOp_kind=3, UnaryOp_kind=4, + Lambda_kind=5, IfExp_kind=6, Dict_kind=7, Set_kind=8, + ListComp_kind=9, SetComp_kind=10, DictComp_kind=11, + GeneratorExp_kind=12, Await_kind=13, Yield_kind=14, + YieldFrom_kind=15, Compare_kind=16, Call_kind=17, + FormattedValue_kind=18, JoinedStr_kind=19, Constant_kind=20, + Attribute_kind=21, Subscript_kind=22, Starred_kind=23, + Name_kind=24, List_kind=25, Tuple_kind=26}; struct _expr { enum _expr_kind kind; union { @@ -230,6 +230,11 @@ struct _expr { asdl_seq *values; } BoolOp; + struct { + expr_ty target; + expr_ty value; + } NamedExpr; + struct { expr_ty left; operator_ty op; @@ -541,6 +546,10 @@ stmt_ty _Py_Continue(int lineno, int col_offset, int end_lineno, int #define BoolOp(a0, a1, a2, a3, a4, a5, a6) _Py_BoolOp(a0, a1, a2, a3, a4, a5, a6) expr_ty _Py_BoolOp(boolop_ty op, asdl_seq * values, int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena *arena); +#define NamedExpr(a0, a1, a2, a3, a4, a5, a6) _Py_NamedExpr(a0, a1, a2, a3, a4, a5, a6) +expr_ty _Py_NamedExpr(expr_ty target, expr_ty value, int lineno, int + col_offset, int end_lineno, int end_col_offset, PyArena + *arena); #define BinOp(a0, a1, a2, a3, a4, a5, a6, a7) _Py_BinOp(a0, a1, a2, a3, a4, a5, a6, a7) expr_ty _Py_BinOp(expr_ty left, operator_ty op, expr_ty right, int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena diff --git a/Include/graminit.h b/Include/graminit.h index bdfe821ad716..e3acff8a1e83 100644 --- a/Include/graminit.h +++ b/Include/graminit.h @@ -49,41 +49,42 @@ #define with_item 302 #define except_clause 303 #define suite 304 -#define test 305 -#define test_nocond 306 -#define lambdef 307 -#define lambdef_nocond 308 -#define or_test 309 -#define 
and_test 310 -#define not_test 311 -#define comparison 312 -#define comp_op 313 -#define star_expr 314 -#define expr 315 -#define xor_expr 316 -#define and_expr 317 -#define shift_expr 318 -#define arith_expr 319 -#define term 320 -#define factor 321 -#define power 322 -#define atom_expr 323 -#define atom 324 -#define testlist_comp 325 -#define trailer 326 -#define subscriptlist 327 -#define subscript 328 -#define sliceop 329 -#define exprlist 330 -#define testlist 331 -#define dictorsetmaker 332 -#define classdef 333 -#define arglist 334 -#define argument 335 -#define comp_iter 336 -#define sync_comp_for 337 -#define comp_for 338 -#define comp_if 339 -#define encoding_decl 340 -#define yield_expr 341 -#define yield_arg 342 +#define namedexpr_test 305 +#define test 306 +#define test_nocond 307 +#define lambdef 308 +#define lambdef_nocond 309 +#define or_test 310 +#define and_test 311 +#define not_test 312 +#define comparison 313 +#define comp_op 314 +#define star_expr 315 +#define expr 316 +#define xor_expr 317 +#define and_expr 318 +#define shift_expr 319 +#define arith_expr 320 +#define term 321 +#define factor 322 +#define power 323 +#define atom_expr 324 +#define atom 325 +#define testlist_comp 326 +#define trailer 327 +#define subscriptlist 328 +#define subscript 329 +#define sliceop 330 +#define exprlist 331 +#define testlist 332 +#define dictorsetmaker 333 +#define classdef 334 +#define arglist 335 +#define argument 336 +#define comp_iter 337 +#define sync_comp_for 338 +#define comp_for 339 +#define comp_if 340 +#define encoding_decl 341 +#define yield_expr 342 +#define yield_arg 343 diff --git a/Include/pyerrors.h b/Include/pyerrors.h index efe1c49d2d08..5c6751868df4 100644 --- a/Include/pyerrors.h +++ b/Include/pyerrors.h @@ -108,6 +108,7 @@ PyAPI_DATA(PyObject *) PyExc_NotImplementedError; PyAPI_DATA(PyObject *) PyExc_SyntaxError; PyAPI_DATA(PyObject *) PyExc_IndentationError; PyAPI_DATA(PyObject *) PyExc_TabError; +PyAPI_DATA(PyObject *) PyExc_TargetScopeError; PyAPI_DATA(PyObject *) PyExc_ReferenceError; PyAPI_DATA(PyObject *) PyExc_SystemError; PyAPI_DATA(PyObject *) PyExc_SystemExit; diff --git a/Include/symtable.h b/Include/symtable.h index 949022bd6630..9392e6438779 100644 --- a/Include/symtable.h +++ b/Include/symtable.h @@ -50,6 +50,7 @@ typedef struct _symtable_entry { including free refs to globals */ unsigned ste_generator : 1; /* true if namespace is a generator */ unsigned ste_coroutine : 1; /* true if namespace is a coroutine */ + unsigned ste_comprehension : 1; /* true if namespace is a list comprehension */ unsigned ste_varargs : 1; /* true if block has varargs */ unsigned ste_varkeywords : 1; /* true if block has varkeywords */ unsigned ste_returns_value : 1; /* true if namespace uses return with diff --git a/Include/token.h b/Include/token.h index 2d491e6927d1..b87b84cd966d 100644 --- a/Include/token.h +++ b/Include/token.h @@ -63,9 +63,10 @@ extern "C" { #define ATEQUAL 50 #define RARROW 51 #define ELLIPSIS 52 -#define OP 53 -#define ERRORTOKEN 54 -#define N_TOKENS 58 +#define COLONEQUAL 53 +#define OP 54 +#define ERRORTOKEN 55 +#define N_TOKENS 59 #define NT_OFFSET 256 /* Special definitions for cooperation with parser */ diff --git a/Lib/_compat_pickle.py b/Lib/_compat_pickle.py index f68496ae639f..8bb1cf80afa5 100644 --- a/Lib/_compat_pickle.py +++ b/Lib/_compat_pickle.py @@ -128,6 +128,7 @@ "SystemError", "SystemExit", "TabError", + "TargetScopeError", "TypeError", "UnboundLocalError", "UnicodeDecodeError", diff --git a/Lib/symbol.py b/Lib/symbol.py index 
40d0ed1e0355..b3fa08984d87 100644 --- a/Lib/symbol.py +++ b/Lib/symbol.py @@ -61,44 +61,45 @@ with_item = 302 except_clause = 303 suite = 304 -test = 305 -test_nocond = 306 -lambdef = 307 -lambdef_nocond = 308 -or_test = 309 -and_test = 310 -not_test = 311 -comparison = 312 -comp_op = 313 -star_expr = 314 -expr = 315 -xor_expr = 316 -and_expr = 317 -shift_expr = 318 -arith_expr = 319 -term = 320 -factor = 321 -power = 322 -atom_expr = 323 -atom = 324 -testlist_comp = 325 -trailer = 326 -subscriptlist = 327 -subscript = 328 -sliceop = 329 -exprlist = 330 -testlist = 331 -dictorsetmaker = 332 -classdef = 333 -arglist = 334 -argument = 335 -comp_iter = 336 -sync_comp_for = 337 -comp_for = 338 -comp_if = 339 -encoding_decl = 340 -yield_expr = 341 -yield_arg = 342 +namedexpr_test = 305 +test = 306 +test_nocond = 307 +lambdef = 308 +lambdef_nocond = 309 +or_test = 310 +and_test = 311 +not_test = 312 +comparison = 313 +comp_op = 314 +star_expr = 315 +expr = 316 +xor_expr = 317 +and_expr = 318 +shift_expr = 319 +arith_expr = 320 +term = 321 +factor = 322 +power = 323 +atom_expr = 324 +atom = 325 +testlist_comp = 326 +trailer = 327 +subscriptlist = 328 +subscript = 329 +sliceop = 330 +exprlist = 331 +testlist = 332 +dictorsetmaker = 333 +classdef = 334 +arglist = 335 +argument = 336 +comp_iter = 337 +sync_comp_for = 338 +comp_for = 339 +comp_if = 340 +encoding_decl = 341 +yield_expr = 342 +yield_arg = 343 #--end constants-- sym_name = {} diff --git a/Lib/test/exception_hierarchy.txt b/Lib/test/exception_hierarchy.txt index 763a6c899b48..15f4491cf237 100644 --- a/Lib/test/exception_hierarchy.txt +++ b/Lib/test/exception_hierarchy.txt @@ -42,6 +42,7 @@ BaseException | +-- NotImplementedError | +-- RecursionError +-- SyntaxError + | +-- TargetScopeError | +-- IndentationError | +-- TabError +-- SystemError diff --git a/Lib/test/test_named_expressions.py b/Lib/test/test_named_expressions.py new file mode 100644 index 000000000000..e49fd7de20de --- /dev/null +++ b/Lib/test/test_named_expressions.py @@ -0,0 +1,415 @@ +import os +import unittest + + +class NamedExpressionInvalidTest(unittest.TestCase): + + def test_named_expression_invalid_01(self): + code = """x := 0""" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_02(self): + code = """x = y := 0""" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_03(self): + code = """y := f(x)""" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_04(self): + code = """y0 = y1 := f(x)""" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_06(self): + code = """((a, b) := (1, 2))""" + + with self.assertRaisesRegex(SyntaxError, "cannot use named assignment with tuple"): + exec(code, {}, {}) + + def test_named_expression_invalid_07(self): + code = """def spam(a = b := 42): pass""" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_08(self): + code = """def spam(a: b := 42 = 5): pass""" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_09(self): + code = """spam(a=b := 'c')""" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_10(self): + code = """spam(x = y := f(x))""" 
+ + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_11(self): + code = """spam(a=1, b := 2)""" + + with self.assertRaisesRegex(SyntaxError, + "positional argument follows keyword argument"): + exec(code, {}, {}) + + def test_named_expression_invalid_12(self): + code = """spam(a=1, (b := 2))""" + + with self.assertRaisesRegex(SyntaxError, + "positional argument follows keyword argument"): + exec(code, {}, {}) + + def test_named_expression_invalid_13(self): + code = """spam(a=1, (b := 2))""" + + with self.assertRaisesRegex(SyntaxError, + "positional argument follows keyword argument"): + exec(code, {}, {}) + + def test_named_expression_invalid_14(self): + code = """(x := lambda: y := 1)""" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_15(self): + code = """(lambda: x := 1)""" + + with self.assertRaisesRegex(SyntaxError, + "cannot use named assignment with lambda"): + exec(code, {}, {}) + + def test_named_expression_invalid_16(self): + code = "[i + 1 for i in i := [1,2]]" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_17(self): + code = "[i := 0, j := 1 for i, j in [(1, 2), (3, 4)]]" + + with self.assertRaisesRegex(SyntaxError, "invalid syntax"): + exec(code, {}, {}) + + def test_named_expression_invalid_18(self): + code = """class Foo(): + [(42, 1 + ((( j := i )))) for i in range(5)] + """ + + with self.assertRaisesRegex(TargetScopeError, + "named expression within a comprehension cannot be used in a class body"): + exec(code, {}, {}) + + +class NamedExpressionAssignmentTest(unittest.TestCase): + + def test_named_expression_assignment_01(self): + (a := 10) + + self.assertEqual(a, 10) + + def test_named_expression_assignment_02(self): + a = 20 + (a := a) + + self.assertEqual(a, 20) + + def test_named_expression_assignment_03(self): + (total := 1 + 2) + + self.assertEqual(total, 3) + + def test_named_expression_assignment_04(self): + (info := (1, 2, 3)) + + self.assertEqual(info, (1, 2, 3)) + + def test_named_expression_assignment_05(self): + (x := 1, 2) + + self.assertEqual(x, 1) + + def test_named_expression_assignment_06(self): + (z := (y := (x := 0))) + + self.assertEqual(x, 0) + self.assertEqual(y, 0) + self.assertEqual(z, 0) + + def test_named_expression_assignment_07(self): + (loc := (1, 2)) + + self.assertEqual(loc, (1, 2)) + + def test_named_expression_assignment_08(self): + if spam := "eggs": + self.assertEqual(spam, "eggs") + else: self.fail("variable was not assigned using named expression") + + def test_named_expression_assignment_09(self): + if True and (spam := True): + self.assertTrue(spam) + else: self.fail("variable was not assigned using named expression") + + def test_named_expression_assignment_10(self): + if (match := 10) is 10: + pass + else: self.fail("variable was not assigned using named expression") + + def test_named_expression_assignment_11(self): + def spam(a): + return a + input_data = [1, 2, 3] + res = [(x, y, x/y) for x in input_data if (y := spam(x)) > 0] + + self.assertEqual(res, [(1, 1, 1.0), (2, 2, 1.0), (3, 3, 1.0)]) + + def test_named_expression_assignment_12(self): + def spam(a): + return a + res = [[y := spam(x), x/y] for x in range(1, 5)] + + self.assertEqual(res, [[1, 1.0], [2, 1.0], [3, 1.0], [4, 1.0]]) + + def test_named_expression_assignment_13(self): + length = len(lines := [1, 2]) + + self.assertEqual(length, 2) + 
self.assertEqual(lines, [1,2]) + + def test_named_expression_assignment_14(self): + """ + Where all variables are positive integers, and a is at least as large + as the n'th root of x, this algorithm returns the floor of the n'th + root of x (and roughly doubling the number of accurate bits per + iteration):: + """ + a = 9 + n = 2 + x = 3 + + while a > (d := x // a**(n-1)): + a = ((n-1)*a + d) // n + + self.assertEqual(a, 1) + + +class NamedExpressionScopeTest(unittest.TestCase): + + def test_named_expression_scope_01(self): + code = """def spam(): + (a := 5) +print(a)""" + + with self.assertRaisesRegex(NameError, "name 'a' is not defined"): + exec(code, {}, {}) + + def test_named_expression_scope_02(self): + total = 0 + partial_sums = [total := total + v for v in range(5)] + + self.assertEqual(partial_sums, [0, 1, 3, 6, 10]) + self.assertEqual(total, 10) + + def test_named_expression_scope_03(self): + containsOne = any((lastNum := num) == 1 for num in [1, 2, 3]) + + self.assertTrue(containsOne) + self.assertEqual(lastNum, 1) + + def test_named_expression_scope_04(self): + def spam(a): + return a + res = [[y := spam(x), x/y] for x in range(1, 5)] + + self.assertEqual(y, 4) + + def test_named_expression_scope_05(self): + def spam(a): + return a + input_data = [1, 2, 3] + res = [(x, y, x/y) for x in input_data if (y := spam(x)) > 0] + + self.assertEqual(res, [(1, 1, 1.0), (2, 2, 1.0), (3, 3, 1.0)]) + self.assertEqual(y, 3) + + def test_named_expression_scope_06(self): + res = [[spam := i for i in range(3)] for j in range(2)] + + self.assertEqual(res, [[0, 1, 2], [0, 1, 2]]) + self.assertEqual(spam, 2) + + def test_named_expression_scope_07(self): + len(lines := [1, 2]) + + self.assertEqual(lines, [1, 2]) + + def test_named_expression_scope_08(self): + def spam(a): + return a + + def eggs(b): + return b * 2 + + res = [spam(a := eggs(b := h)) for h in range(2)] + + self.assertEqual(res, [0, 2]) + self.assertEqual(a, 2) + self.assertEqual(b, 1) + + def test_named_expression_scope_09(self): + def spam(a): + return a + + def eggs(b): + return b * 2 + + res = [spam(a := eggs(a := h)) for h in range(2)] + + self.assertEqual(res, [0, 2]) + self.assertEqual(a, 2) + + def test_named_expression_scope_10(self): + res = [b := [a := 1 for i in range(2)] for j in range(2)] + + self.assertEqual(res, [[1, 1], [1, 1]]) + self.assertEqual(a, 1) + self.assertEqual(b, [1, 1]) + + def test_named_expression_scope_11(self): + res = [j := i for i in range(5)] + + self.assertEqual(res, [0, 1, 2, 3, 4]) + self.assertEqual(j, 4) + + def test_named_expression_scope_12(self): + res = [i := i for i in range(5)] + + self.assertEqual(res, [0, 1, 2, 3, 4]) + self.assertEqual(i, 4) + + def test_named_expression_scope_13(self): + res = [i := 0 for i, j in [(1, 2), (3, 4)]] + + self.assertEqual(res, [0, 0]) + self.assertEqual(i, 0) + + def test_named_expression_scope_14(self): + res = [(i := 0, j := 1) for i, j in [(1, 2), (3, 4)]] + + self.assertEqual(res, [(0, 1), (0, 1)]) + self.assertEqual(i, 0) + self.assertEqual(j, 1) + + def test_named_expression_scope_15(self): + res = [(i := i, j := j) for i, j in [(1, 2), (3, 4)]] + + self.assertEqual(res, [(1, 2), (3, 4)]) + self.assertEqual(i, 3) + self.assertEqual(j, 4) + + def test_named_expression_scope_16(self): + res = [(i := j, j := i) for i, j in [(1, 2), (3, 4)]] + + self.assertEqual(res, [(2, 2), (4, 4)]) + self.assertEqual(i, 4) + self.assertEqual(j, 4) + + def test_named_expression_scope_17(self): + b = 0 + res = [b := i + b for i in range(5)] + + self.assertEqual(res, 
[0, 1, 3, 6, 10]) + self.assertEqual(b, 10) + + def test_named_expression_scope_18(self): + def spam(a): + return a + + res = spam(b := 2) + + self.assertEqual(res, 2) + self.assertEqual(b, 2) + + def test_named_expression_scope_19(self): + def spam(a): + return a + + res = spam((b := 2)) + + self.assertEqual(res, 2) + self.assertEqual(b, 2) + + def test_named_expression_scope_20(self): + def spam(a): + return a + + res = spam(a=(b := 2)) + + self.assertEqual(res, 2) + self.assertEqual(b, 2) + + def test_named_expression_scope_21(self): + def spam(a, b): + return a + b + + res = spam(c := 2, b=1) + + self.assertEqual(res, 3) + self.assertEqual(c, 2) + + def test_named_expression_scope_22(self): + def spam(a, b): + return a + b + + res = spam((c := 2), b=1) + + self.assertEqual(res, 3) + self.assertEqual(c, 2) + + def test_named_expression_scope_23(self): + def spam(a, b): + return a + b + + res = spam(b=(c := 2), a=1) + + self.assertEqual(res, 3) + self.assertEqual(c, 2) + + def test_named_expression_scope_24(self): + a = 10 + def spam(): + nonlocal a + (a := 20) + spam() + + self.assertEqual(a, 20) + + def test_named_expression_scope_25(self): + ns = {} + code = """a = 10 +def spam(): + global a + (a := 20) +spam()""" + + exec(code, ns, {}) + + self.assertEqual(ns["a"], 20) + + +if __name__ == "__main__": + unittest.main() diff --git a/Lib/test/test_parser.py b/Lib/test/test_parser.py index 9b58bb93c81c..ac3899baedb6 100644 --- a/Lib/test/test_parser.py +++ b/Lib/test/test_parser.py @@ -115,6 +115,7 @@ def test_expressions(self): self.check_expr("foo * bar") self.check_expr("foo / bar") self.check_expr("foo // bar") + self.check_expr("(foo := 1)") self.check_expr("lambda: 0") self.check_expr("lambda x: 0") self.check_expr("lambda *y: 0") @@ -421,6 +422,40 @@ def test_dict_comprehensions(self): self.check_expr('{x**2:x[3] for x in seq if condition(x)}') self.check_expr('{x:x for x in seq1 for y in seq2 if condition(x, y)}') + def test_named_expressions(self): + self.check_suite("(a := 1)") + self.check_suite("(a := a)") + self.check_suite("if (match := pattern.search(data)) is None: pass") + self.check_suite("[y := f(x), y**2, y**3]") + self.check_suite("filtered_data = [y for x in data if (y := f(x)) is None]") + self.check_suite("(y := f(x))") + self.check_suite("y0 = (y1 := f(x))") + self.check_suite("foo(x=(y := f(x)))") + self.check_suite("def foo(answer=(p := 42)): pass") + self.check_suite("def foo(answer: (p := 42) = 5): pass") + self.check_suite("lambda: (x := 1)") + self.check_suite("(x := lambda: 1)") + self.check_suite("(x := lambda: (y := 1))") # not in PEP + self.check_suite("lambda line: (m := re.match(pattern, line)) and m.group(1)") + self.check_suite("x = (y := 0)") + self.check_suite("(z:=(y:=(x:=0)))") + self.check_suite("(info := (name, phone, *rest))") + self.check_suite("(x:=1,2)") + self.check_suite("(total := total + tax)") + self.check_suite("len(lines := f.readlines())") + self.check_suite("foo(x := 3, cat='vector')") + self.check_suite("foo(cat=(category := 'vector'))") + self.check_suite("if any(len(longline := l) >= 100 for l in lines): print(longline)") + self.check_suite( + "if env_base := os.environ.get('PYTHONUSERBASE', None): return env_base" + ) + self.check_suite( + "if self._is_special and (ans := self._check_nans(context=context)): return ans" + ) + self.check_suite("foo(b := 2, a=1)") + self.check_suite("foo(b := 2, a=1)") + self.check_suite("foo((b := 2), a=1)") + self.check_suite("foo(c=(b := 2), a=1)") # # Second, we take *invalid* trees and make 
sure we get ParserError @@ -694,16 +729,16 @@ def test_missing_import_source(self): def test_illegal_encoding(self): # Illegal encoding declaration tree = \ - (340, + (341, (257, (0, ''))) self.check_bad_tree(tree, "missed encoding") tree = \ - (340, + (341, (257, (0, '')), b'iso-8859-1') self.check_bad_tree(tree, "non-string encoding") tree = \ - (340, + (341, (257, (0, '')), '\udcff') with self.assertRaises(UnicodeEncodeError): @@ -776,8 +811,9 @@ def _nested_expression(self, level): return "["*level+"]"*level def test_deeply_nested_list(self): - # XXX used to be 99 levels in 2.x - e = self._nested_expression(93) + # This has fluctuated between 99 levels in 2.x, down to 93 levels in + # 3.7.X and back up to 99 in 3.8.X. Related to MAXSTACK size in Parser.h + e = self._nested_expression(99) st = parser.expr(e) st.compile() diff --git a/Lib/test/test_tokenize.py b/Lib/test/test_tokenize.py index 04a12542c6ae..4c90092893a2 100644 --- a/Lib/test/test_tokenize.py +++ b/Lib/test/test_tokenize.py @@ -1429,6 +1429,7 @@ def test_exact_type(self): self.assertExactTypeEqual('**=', token.DOUBLESTAREQUAL) self.assertExactTypeEqual('//', token.DOUBLESLASH) self.assertExactTypeEqual('//=', token.DOUBLESLASHEQUAL) + self.assertExactTypeEqual(':=', token.COLONEQUAL) self.assertExactTypeEqual('...', token.ELLIPSIS) self.assertExactTypeEqual('->', token.RARROW) self.assertExactTypeEqual('@', token.AT) diff --git a/Lib/token.py b/Lib/token.py index 5af7e6b91eac..7224eca32fe0 100644 --- a/Lib/token.py +++ b/Lib/token.py @@ -56,13 +56,14 @@ ATEQUAL = 50 RARROW = 51 ELLIPSIS = 52 -OP = 53 +COLONEQUAL = 53 +OP = 54 # These aren't used by the C tokenizer but are needed for tokenize.py -ERRORTOKEN = 54 -COMMENT = 55 -NL = 56 -ENCODING = 57 -N_TOKENS = 58 +ERRORTOKEN = 55 +COMMENT = 56 +NL = 57 +ENCODING = 58 +N_TOKENS = 59 # Special definitions for cooperation with parser NT_OFFSET = 256 @@ -96,6 +97,7 @@ '//=': DOUBLESLASHEQUAL, '/=': SLASHEQUAL, ':': COLON, + ':=': COLONEQUAL, ';': SEMI, '<': LESS, '<<': LEFTSHIFT, diff --git a/Misc/NEWS.d/next/Core and Builtins/2018-11-13-14-26-54.bpo-35224.F0B6UQ.rst b/Misc/NEWS.d/next/Core and Builtins/2018-11-13-14-26-54.bpo-35224.F0B6UQ.rst new file mode 100644 index 000000000000..fe54f362aef6 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2018-11-13-14-26-54.bpo-35224.F0B6UQ.rst @@ -0,0 +1 @@ +Implement :pep:`572` (assignment expressions). Patch by Emily Morehouse. 
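
For context, a minimal, hypothetical sketch of what the assignment-expression (``:=``) syntax introduced by this patch enables; it is illustrative only and not part of the diff, and names such as ``f``, ``data`` and ``line`` are made up for the example::

    import re

    # Bind the match object and test it in a single expression.
    line = "user=alice"
    if (m := re.match(r"user=(\w+)", line)) is not None:
        print(m.group(1))      # prints: alice

    # Call f(x) once per element and reuse the result in both
    # the filter clause and the output expression.
    def f(x):
        return x * 2

    data = [1, 2, 3, 4]
    results = [y for x in data if (y := f(x)) > 4]
    print(results)             # prints: [6, 8]
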
diff --git a/Objects/exceptions.c b/Objects/exceptions.c index 35e1df3ca1fa..75ede1c9c5de 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -1520,6 +1520,13 @@ MiddlingExtendsException(PyExc_SyntaxError, IndentationError, SyntaxError, "Improper indentation."); +/* + * TargetScopeError extends SyntaxError + */ +MiddlingExtendsException(PyExc_SyntaxError, TargetScopeError, SyntaxError, + "Improper scope target."); + + /* * TabError extends IndentationError */ @@ -2531,6 +2538,7 @@ _PyExc_Init(void) PRE_INIT(AttributeError); PRE_INIT(SyntaxError); PRE_INIT(IndentationError); + PRE_INIT(TargetScopeError); PRE_INIT(TabError); PRE_INIT(LookupError); PRE_INIT(IndexError); @@ -2671,6 +2679,7 @@ _PyBuiltins_AddExceptions(PyObject *bltinmod) POST_INIT(AttributeError); POST_INIT(SyntaxError); POST_INIT(IndentationError); + POST_INIT(TargetScopeError); POST_INIT(TabError); POST_INIT(LookupError); POST_INIT(IndexError); diff --git a/PC/python3.def b/PC/python3.def index 5d93c18af87e..e317864d0cd8 100644 --- a/PC/python3.def +++ b/PC/python3.def @@ -235,6 +235,7 @@ EXPORTS PyExc_SystemError=python38.PyExc_SystemError DATA PyExc_SystemExit=python38.PyExc_SystemExit DATA PyExc_TabError=python38.PyExc_TabError DATA + PyExc_TargetScopeError=python38.PyExc_TargetScopeError DATA PyExc_TimeoutError=python38.PyExc_TimeoutError DATA PyExc_TypeError=python38.PyExc_TypeError DATA PyExc_UnboundLocalError=python38.PyExc_UnboundLocalError DATA diff --git a/Parser/Python.asdl b/Parser/Python.asdl index cedf37a2d9f9..7b2a8737ab80 100644 --- a/Parser/Python.asdl +++ b/Parser/Python.asdl @@ -54,6 +54,7 @@ module Python -- BoolOp() can use left & right? expr = BoolOp(boolop op, expr* values) + | NamedExpr(expr target, expr value) | BinOp(expr left, operator op, expr right) | UnaryOp(unaryop op, expr operand) | Lambda(arguments args, expr body) @@ -87,7 +88,7 @@ module Python -- col_offset is the byte offset in the utf8 string the parser uses attributes (int lineno, int col_offset, int? end_lineno, int? end_col_offset) - expr_context = Load | Store | Del | AugLoad | AugStore | Param + expr_context = Load | Store | Del | AugLoad | AugStore | Param | NamedStore slice = Slice(expr? lower, expr? upper, expr? 
step) | ExtSlice(slice* dims) diff --git a/Parser/parser.h b/Parser/parser.h index 95cd39d209dd..aee1c86cb044 100644 --- a/Parser/parser.h +++ b/Parser/parser.h @@ -7,7 +7,7 @@ extern "C" { /* Parser interface */ -#define MAXSTACK 1500 +#define MAXSTACK 1700 typedef struct { int s_state; /* State in current DFA */ diff --git a/Parser/token.c b/Parser/token.c index 35519aa4b611..d27f98a34d55 100644 --- a/Parser/token.c +++ b/Parser/token.c @@ -59,6 +59,7 @@ const char * const _PyParser_TokenNames[] = { "ATEQUAL", "RARROW", "ELLIPSIS", + "COLONEQUAL", "OP", "", "", @@ -142,6 +143,11 @@ PyToken_TwoChars(int c1, int c2) case '=': return SLASHEQUAL; } break; + case ':': + switch (c2) { + case '=': return COLONEQUAL; + } + break; case '<': switch (c2) { case '<': return LEFTSHIFT; diff --git a/Python/Python-ast.c b/Python/Python-ast.c index e6c5bfe9b29e..a333ff95b110 100644 --- a/Python/Python-ast.c +++ b/Python/Python-ast.c @@ -203,6 +203,11 @@ static char *BoolOp_fields[]={ "op", "values", }; +static PyTypeObject *NamedExpr_type; +static char *NamedExpr_fields[]={ + "target", + "value", +}; static PyTypeObject *BinOp_type; _Py_IDENTIFIER(left); _Py_IDENTIFIER(right); @@ -344,7 +349,8 @@ static char *Tuple_fields[]={ }; static PyTypeObject *expr_context_type; static PyObject *Load_singleton, *Store_singleton, *Del_singleton, -*AugLoad_singleton, *AugStore_singleton, *Param_singleton; +*AugLoad_singleton, *AugStore_singleton, *Param_singleton, +*NamedStore_singleton; static PyObject* ast2obj_expr_context(expr_context_ty); static PyTypeObject *Load_type; static PyTypeObject *Store_type; @@ -352,6 +358,7 @@ static PyTypeObject *Del_type; static PyTypeObject *AugLoad_type; static PyTypeObject *AugStore_type; static PyTypeObject *Param_type; +static PyTypeObject *NamedStore_type; static PyTypeObject *slice_type; static PyObject* ast2obj_slice(void*); static PyTypeObject *Slice_type; @@ -872,6 +879,8 @@ static int init_types(void) if (!add_attributes(expr_type, expr_attributes, 4)) return 0; BoolOp_type = make_type("BoolOp", expr_type, BoolOp_fields, 2); if (!BoolOp_type) return 0; + NamedExpr_type = make_type("NamedExpr", expr_type, NamedExpr_fields, 2); + if (!NamedExpr_type) return 0; BinOp_type = make_type("BinOp", expr_type, BinOp_fields, 3); if (!BinOp_type) return 0; UnaryOp_type = make_type("UnaryOp", expr_type, UnaryOp_fields, 2); @@ -949,6 +958,10 @@ static int init_types(void) if (!Param_type) return 0; Param_singleton = PyType_GenericNew(Param_type, NULL, NULL); if (!Param_singleton) return 0; + NamedStore_type = make_type("NamedStore", expr_context_type, NULL, 0); + if (!NamedStore_type) return 0; + NamedStore_singleton = PyType_GenericNew(NamedStore_type, NULL, NULL); + if (!NamedStore_singleton) return 0; slice_type = make_type("slice", &AST_type, NULL, 0); if (!slice_type) return 0; if (!add_attributes(slice_type, NULL, 0)) return 0; @@ -1772,6 +1785,34 @@ BoolOp(boolop_ty op, asdl_seq * values, int lineno, int col_offset, int return p; } +expr_ty +NamedExpr(expr_ty target, expr_ty value, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena) +{ + expr_ty p; + if (!target) { + PyErr_SetString(PyExc_ValueError, + "field target is required for NamedExpr"); + return NULL; + } + if (!value) { + PyErr_SetString(PyExc_ValueError, + "field value is required for NamedExpr"); + return NULL; + } + p = (expr_ty)PyArena_Malloc(arena, sizeof(*p)); + if (!p) + return NULL; + p->kind = NamedExpr_kind; + p->v.NamedExpr.target = target; + p->v.NamedExpr.value = value; + 
p->lineno = lineno; + p->col_offset = col_offset; + p->end_lineno = end_lineno; + p->end_col_offset = end_col_offset; + return p; +} + expr_ty BinOp(expr_ty left, operator_ty op, expr_ty right, int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena *arena) @@ -3062,6 +3103,20 @@ ast2obj_expr(void* _o) goto failed; Py_DECREF(value); break; + case NamedExpr_kind: + result = PyType_GenericNew(NamedExpr_type, NULL, NULL); + if (!result) goto failed; + value = ast2obj_expr(o->v.NamedExpr.target); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_target, value) == -1) + goto failed; + Py_DECREF(value); + value = ast2obj_expr(o->v.NamedExpr.value); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_value, value) == -1) + goto failed; + Py_DECREF(value); + break; case BinOp_kind: result = PyType_GenericNew(BinOp_type, NULL, NULL); if (!result) goto failed; @@ -3464,6 +3519,9 @@ PyObject* ast2obj_expr_context(expr_context_ty o) case Param: Py_INCREF(Param_singleton); return Param_singleton; + case NamedStore: + Py_INCREF(NamedStore_singleton); + return NamedStore_singleton; default: /* should never happen, but just in case ... */ PyErr_Format(PyExc_SystemError, "unknown expr_context found"); @@ -5895,6 +5953,45 @@ obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) if (*out == NULL) goto failed; return 0; } + isinstance = PyObject_IsInstance(obj, (PyObject*)NamedExpr_type); + if (isinstance == -1) { + return 1; + } + if (isinstance) { + expr_ty target; + expr_ty value; + + if (_PyObject_LookupAttrId(obj, &PyId_target, &tmp) < 0) { + return 1; + } + if (tmp == NULL) { + PyErr_SetString(PyExc_TypeError, "required field \"target\" missing from NamedExpr"); + return 1; + } + else { + int res; + res = obj2ast_expr(tmp, &target, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + if (_PyObject_LookupAttrId(obj, &PyId_value, &tmp) < 0) { + return 1; + } + if (tmp == NULL) { + PyErr_SetString(PyExc_TypeError, "required field \"value\" missing from NamedExpr"); + return 1; + } + else { + int res; + res = obj2ast_expr(tmp, &value, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = NamedExpr(target, value, lineno, col_offset, end_lineno, + end_col_offset, arena); + if (*out == NULL) goto failed; + return 0; + } isinstance = PyObject_IsInstance(obj, (PyObject*)BinOp_type); if (isinstance == -1) { return 1; @@ -7156,6 +7253,14 @@ obj2ast_expr_context(PyObject* obj, expr_context_ty* out, PyArena* arena) *out = Param; return 0; } + isinstance = PyObject_IsInstance(obj, (PyObject *)NamedStore_type); + if (isinstance == -1) { + return 1; + } + if (isinstance) { + *out = NamedStore; + return 0; + } PyErr_Format(PyExc_TypeError, "expected some sort of expr_context, but got %R", obj); return 1; @@ -8251,6 +8356,8 @@ PyInit__ast(void) if (PyDict_SetItemString(d, "expr", (PyObject*)expr_type) < 0) return NULL; if (PyDict_SetItemString(d, "BoolOp", (PyObject*)BoolOp_type) < 0) return NULL; + if (PyDict_SetItemString(d, "NamedExpr", (PyObject*)NamedExpr_type) < 0) + return NULL; if (PyDict_SetItemString(d, "BinOp", (PyObject*)BinOp_type) < 0) return NULL; if (PyDict_SetItemString(d, "UnaryOp", (PyObject*)UnaryOp_type) < 0) return @@ -8306,6 +8413,8 @@ PyInit__ast(void) return NULL; if (PyDict_SetItemString(d, "Param", (PyObject*)Param_type) < 0) return NULL; + if (PyDict_SetItemString(d, "NamedStore", (PyObject*)NamedStore_type) < 0) + return NULL; if (PyDict_SetItemString(d, "slice", (PyObject*)slice_type) < 0) return NULL; if 
(PyDict_SetItemString(d, "Slice", (PyObject*)Slice_type) < 0) return diff --git a/Python/ast.c b/Python/ast.c index 855acca29e7c..6560026109c8 100644 --- a/Python/ast.c +++ b/Python/ast.c @@ -94,6 +94,8 @@ expr_context_name(expr_context_ty ctx) return "Load"; case Store: return "Store"; + case NamedStore: + return "NamedStore"; case Del: return "Del"; case AugLoad: @@ -975,14 +977,29 @@ set_context(struct compiling *c, expr_ty e, expr_context_ty ctx, const node *n) switch (e->kind) { case Attribute_kind: + if (ctx == NamedStore) { + expr_name = "attribute"; + break; + } + e->v.Attribute.ctx = ctx; if (ctx == Store && forbidden_name(c, e->v.Attribute.attr, n, 1)) return 0; break; case Subscript_kind: + if (ctx == NamedStore) { + expr_name = "subscript"; + break; + } + e->v.Subscript.ctx = ctx; break; case Starred_kind: + if (ctx == NamedStore) { + expr_name = "starred"; + break; + } + e->v.Starred.ctx = ctx; if (!set_context(c, e->v.Starred.value, ctx, n)) return 0; @@ -995,10 +1012,20 @@ set_context(struct compiling *c, expr_ty e, expr_context_ty ctx, const node *n) e->v.Name.ctx = ctx; break; case List_kind: + if (ctx == NamedStore) { + expr_name = "list"; + break; + } + e->v.List.ctx = ctx; s = e->v.List.elts; break; case Tuple_kind: + if (ctx == NamedStore) { + expr_name = "tuple"; + break; + } + e->v.Tuple.ctx = ctx; s = e->v.Tuple.elts; break; @@ -1060,17 +1087,27 @@ set_context(struct compiling *c, expr_ty e, expr_context_ty ctx, const node *n) case IfExp_kind: expr_name = "conditional expression"; break; + case NamedExpr_kind: + expr_name = "named expression"; + break; default: PyErr_Format(PyExc_SystemError, - "unexpected expression in assignment %d (line %d)", + "unexpected expression in %sassignment %d (line %d)", + ctx == NamedStore ? "named ": "", e->kind, e->lineno); return 0; } /* Check for error string set by switch */ if (expr_name) { - return ast_error(c, n, "cannot %s %s", + if (ctx == NamedStore) { + return ast_error(c, n, "cannot use named assignment with %s", + expr_name); + } + else { + return ast_error(c, n, "cannot %s %s", ctx == Store ? 
"assign to" : "delete", expr_name); + } } /* If the LHS is a list or tuple, we need to set the assignment @@ -1198,7 +1235,7 @@ seq_for_testlist(struct compiling *c, const node *n) for (i = 0; i < NCH(n); i += 2) { const node *ch = CHILD(n, i); - assert(TYPE(ch) == test || TYPE(ch) == test_nocond || TYPE(ch) == star_expr); + assert(TYPE(ch) == test || TYPE(ch) == test_nocond || TYPE(ch) == star_expr || TYPE(ch) == namedexpr_test); expression = ast_for_expr(c, ch); if (!expression) @@ -1691,6 +1728,35 @@ ast_for_decorated(struct compiling *c, const node *n) return thing; } +static expr_ty +ast_for_namedexpr(struct compiling *c, const node *n) +{ + /* if_stmt: 'if' namedexpr_test ':' suite ('elif' namedexpr_test ':' suite)* + ['else' ':' suite] + namedexpr_test: test [':=' test] + argument: ( test [comp_for] | + test ':=' test | + test '=' test | + '**' test | + '*' test ) + */ + expr_ty target, value; + + target = ast_for_expr(c, CHILD(n, 0)); + if (!target) + return NULL; + + value = ast_for_expr(c, CHILD(n, 2)); + if (!value) + return NULL; + + if (!set_context(c, target, NamedStore, n)) + return NULL; + + return NamedExpr(target, value, LINENO(n), n->n_col_offset, n->n_end_lineno, + n->n_end_col_offset, c->c_arena); +} + static expr_ty ast_for_lambdef(struct compiling *c, const node *n) { @@ -2568,6 +2634,7 @@ static expr_ty ast_for_expr(struct compiling *c, const node *n) { /* handle the full range of simple expressions + namedexpr_test: test [':=' test] test: or_test ['if' or_test 'else' test] | lambdef test_nocond: or_test | lambdef_nocond or_test: and_test ('or' and_test)* @@ -2591,6 +2658,10 @@ ast_for_expr(struct compiling *c, const node *n) loop: switch (TYPE(n)) { + case namedexpr_test: + if (NCH(n) == 3) + return ast_for_namedexpr(c, n); + /* Fallthrough */ case test: case test_nocond: if (TYPE(CHILD(n, 0)) == lambdef || @@ -2770,6 +2841,9 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func, } else if (TYPE(CHILD(ch, 0)) == STAR) nargs++; + else if (TYPE(CHILD(ch, 1)) == COLONEQUAL) { + nargs++; + } else /* TYPE(CHILD(ch, 0)) == DOUBLESTAR or keyword argument */ nkeywords++; @@ -2850,6 +2924,26 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func, return NULL; asdl_seq_SET(args, nargs++, e); } + else if (TYPE(CHILD(ch, 1)) == COLONEQUAL) { + /* treat colon equal as positional argument */ + if (nkeywords) { + if (ndoublestars) { + ast_error(c, chch, + "positional argument follows " + "keyword argument unpacking"); + } + else { + ast_error(c, chch, + "positional argument follows " + "keyword argument"); + } + return NULL; + } + e = ast_for_namedexpr(c, ch); + if (!e) + return NULL; + asdl_seq_SET(args, nargs++, e); + } else { /* a keyword argument */ keyword_ty kw; diff --git a/Python/compile.c b/Python/compile.c index 9713bfc9e9b7..eb2c3028b633 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -3428,7 +3428,10 @@ compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx) case Load: op = (c->u->u_ste->ste_type == ClassBlock) ? 
LOAD_CLASSDEREF : LOAD_DEREF; break; - case Store: op = STORE_DEREF; break; + case Store: + case NamedStore: + op = STORE_DEREF; + break; case AugLoad: case AugStore: break; @@ -3443,7 +3446,10 @@ compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx) case OP_FAST: switch (ctx) { case Load: op = LOAD_FAST; break; - case Store: op = STORE_FAST; break; + case Store: + case NamedStore: + op = STORE_FAST; + break; case Del: op = DELETE_FAST; break; case AugLoad: case AugStore: @@ -3459,7 +3465,10 @@ compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx) case OP_GLOBAL: switch (ctx) { case Load: op = LOAD_GLOBAL; break; - case Store: op = STORE_GLOBAL; break; + case Store: + case NamedStore: + op = STORE_GLOBAL; + break; case Del: op = DELETE_GLOBAL; break; case AugLoad: case AugStore: @@ -3474,7 +3483,10 @@ compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx) case OP_NAME: switch (ctx) { case Load: op = LOAD_NAME; break; - case Store: op = STORE_NAME; break; + case Store: + case NamedStore: + op = STORE_NAME; + break; case Del: op = DELETE_NAME; break; case AugLoad: case AugStore: @@ -3592,7 +3604,7 @@ static int compiler_list(struct compiler *c, expr_ty e) { asdl_seq *elts = e->v.List.elts; - if (e->v.List.ctx == Store) { + if (e->v.List.ctx == Store || e->v.List.ctx == NamedStore) { return assignment_helper(c, elts); } else if (e->v.List.ctx == Load) { @@ -3608,7 +3620,7 @@ static int compiler_tuple(struct compiler *c, expr_ty e) { asdl_seq *elts = e->v.Tuple.elts; - if (e->v.Tuple.ctx == Store) { + if (e->v.Tuple.ctx == Store || e->v.Tuple.ctx == NamedStore) { return assignment_helper(c, elts); } else if (e->v.Tuple.ctx == Load) { @@ -4569,6 +4581,11 @@ static int compiler_visit_expr1(struct compiler *c, expr_ty e) { switch (e->kind) { + case NamedExpr_kind: + VISIT(c, expr, e->v.NamedExpr.value); + ADDOP(c, DUP_TOP); + VISIT(c, expr, e->v.NamedExpr.target); + break; case BoolOp_kind: return compiler_boolop(c, e); case BinOp_kind: @@ -5003,6 +5020,7 @@ compiler_handle_subscr(struct compiler *c, const char *kind, case AugStore:/* fall through to Store */ case Store: op = STORE_SUBSCR; break; case Del: op = DELETE_SUBSCR; break; + case NamedStore: case Param: PyErr_Format(PyExc_SystemError, "invalid %s kind %d in subscript\n", diff --git a/Python/graminit.c b/Python/graminit.c index 0a681f7b797f..91092f1e0b9e 100644 --- a/Python/graminit.c +++ b/Python/graminit.c @@ -891,7 +891,7 @@ static arc arcs_41_0[1] = { {97, 1}, }; static arc arcs_41_1[1] = { - {26, 2}, + {98, 2}, }; static arc arcs_41_2[1] = { {27, 3}, @@ -900,8 +900,8 @@ static arc arcs_41_3[1] = { {28, 4}, }; static arc arcs_41_4[3] = { - {98, 1}, - {99, 5}, + {99, 1}, + {100, 5}, {0, 4}, }; static arc arcs_41_5[1] = { @@ -924,7 +924,7 @@ static state states_41[8] = { {1, arcs_41_7}, }; static arc arcs_42_0[1] = { - {100, 1}, + {101, 1}, }; static arc arcs_42_1[1] = { {26, 2}, @@ -936,7 +936,7 @@ static arc arcs_42_3[1] = { {28, 4}, }; static arc arcs_42_4[2] = { - {99, 5}, + {100, 5}, {0, 4}, }; static arc arcs_42_5[1] = { @@ -959,13 +959,13 @@ static state states_42[8] = { {1, arcs_42_7}, }; static arc arcs_43_0[1] = { - {101, 1}, + {102, 1}, }; static arc arcs_43_1[1] = { {66, 2}, }; static arc arcs_43_2[1] = { - {102, 3}, + {103, 3}, }; static arc arcs_43_3[1] = { {9, 4}, @@ -977,7 +977,7 @@ static arc arcs_43_5[1] = { {28, 6}, }; static arc arcs_43_6[2] = { - {99, 7}, + {100, 7}, {0, 6}, }; static arc arcs_43_7[1] = { @@ -1002,7 +1002,7 @@ static state states_43[10] = { 
{1, arcs_43_9}, }; static arc arcs_44_0[1] = { - {103, 1}, + {104, 1}, }; static arc arcs_44_1[1] = { {27, 2}, @@ -1011,8 +1011,8 @@ static arc arcs_44_2[1] = { {28, 3}, }; static arc arcs_44_3[2] = { - {104, 4}, - {105, 5}, + {105, 4}, + {106, 5}, }; static arc arcs_44_4[1] = { {27, 6}, @@ -1027,9 +1027,9 @@ static arc arcs_44_7[1] = { {28, 9}, }; static arc arcs_44_8[4] = { - {104, 4}, - {99, 10}, - {105, 5}, + {105, 4}, + {100, 10}, + {106, 5}, {0, 8}, }; static arc arcs_44_9[1] = { @@ -1042,7 +1042,7 @@ static arc arcs_44_11[1] = { {28, 12}, }; static arc arcs_44_12[2] = { - {105, 5}, + {106, 5}, {0, 12}, }; static state states_44[13] = { @@ -1061,10 +1061,10 @@ static state states_44[13] = { {2, arcs_44_12}, }; static arc arcs_45_0[1] = { - {106, 1}, + {107, 1}, }; static arc arcs_45_1[1] = { - {107, 2}, + {108, 2}, }; static arc arcs_45_2[2] = { {32, 1}, @@ -1091,7 +1091,7 @@ static arc arcs_46_1[2] = { {0, 1}, }; static arc arcs_46_2[1] = { - {108, 3}, + {109, 3}, }; static arc arcs_46_3[1] = { {0, 3}, @@ -1103,7 +1103,7 @@ static state states_46[4] = { {1, arcs_46_3}, }; static arc arcs_47_0[1] = { - {109, 1}, + {110, 1}, }; static arc arcs_47_1[2] = { {26, 2}, @@ -1134,14 +1134,14 @@ static arc arcs_48_1[1] = { {0, 1}, }; static arc arcs_48_2[1] = { - {110, 3}, + {111, 3}, }; static arc arcs_48_3[1] = { {6, 4}, }; static arc arcs_48_4[2] = { {6, 4}, - {111, 1}, + {112, 1}, }; static state states_48[5] = { {2, arcs_48_0}, @@ -1150,70 +1150,66 @@ static state states_48[5] = { {1, arcs_48_3}, {2, arcs_48_4}, }; -static arc arcs_49_0[2] = { - {112, 1}, - {113, 2}, +static arc arcs_49_0[1] = { + {26, 1}, }; static arc arcs_49_1[2] = { - {97, 3}, + {113, 2}, {0, 1}, }; static arc arcs_49_2[1] = { - {0, 2}, + {26, 3}, }; static arc arcs_49_3[1] = { - {112, 4}, -}; -static arc arcs_49_4[1] = { - {99, 5}, -}; -static arc arcs_49_5[1] = { - {26, 2}, + {0, 3}, }; -static state states_49[6] = { - {2, arcs_49_0}, +static state states_49[4] = { + {1, arcs_49_0}, {2, arcs_49_1}, {1, arcs_49_2}, {1, arcs_49_3}, - {1, arcs_49_4}, - {1, arcs_49_5}, }; static arc arcs_50_0[2] = { - {112, 1}, - {115, 1}, + {114, 1}, + {115, 2}, }; -static arc arcs_50_1[1] = { +static arc arcs_50_1[2] = { + {97, 3}, {0, 1}, }; -static state states_50[2] = { - {2, arcs_50_0}, - {1, arcs_50_1}, +static arc arcs_50_2[1] = { + {0, 2}, }; -static arc arcs_51_0[1] = { - {116, 1}, +static arc arcs_50_3[1] = { + {114, 4}, }; -static arc arcs_51_1[2] = { - {35, 2}, - {27, 3}, +static arc arcs_50_4[1] = { + {100, 5}, }; -static arc arcs_51_2[1] = { - {27, 3}, +static arc arcs_50_5[1] = { + {26, 2}, }; -static arc arcs_51_3[1] = { - {26, 4}, +static state states_50[6] = { + {2, arcs_50_0}, + {2, arcs_50_1}, + {1, arcs_50_2}, + {1, arcs_50_3}, + {1, arcs_50_4}, + {1, arcs_50_5}, +}; +static arc arcs_51_0[2] = { + {114, 1}, + {117, 1}, }; -static arc arcs_51_4[1] = { - {0, 4}, +static arc arcs_51_1[1] = { + {0, 1}, }; -static state states_51[5] = { - {1, arcs_51_0}, - {2, arcs_51_1}, - {1, arcs_51_2}, - {1, arcs_51_3}, - {1, arcs_51_4}, +static state states_51[2] = { + {2, arcs_51_0}, + {1, arcs_51_1}, }; static arc arcs_52_0[1] = { - {116, 1}, + {118, 1}, }; static arc arcs_52_1[2] = { {35, 2}, @@ -1223,7 +1219,7 @@ static arc arcs_52_2[1] = { {27, 3}, }; static arc arcs_52_3[1] = { - {114, 4}, + {26, 4}, }; static arc arcs_52_4[1] = { {0, 4}, @@ -1236,15 +1232,27 @@ static state states_52[5] = { {1, arcs_52_4}, }; static arc arcs_53_0[1] = { - {117, 1}, + {118, 1}, }; static arc arcs_53_1[2] = { - {118, 0}, - {0, 1}, + {35, 2}, 
+ {27, 3}, +}; +static arc arcs_53_2[1] = { + {27, 3}, +}; +static arc arcs_53_3[1] = { + {116, 4}, }; -static state states_53[2] = { +static arc arcs_53_4[1] = { + {0, 4}, +}; +static state states_53[5] = { {1, arcs_53_0}, {2, arcs_53_1}, + {1, arcs_53_2}, + {1, arcs_53_3}, + {1, arcs_53_4}, }; static arc arcs_54_0[1] = { {119, 1}, @@ -1257,84 +1265,84 @@ static state states_54[2] = { {1, arcs_54_0}, {2, arcs_54_1}, }; -static arc arcs_55_0[2] = { +static arc arcs_55_0[1] = { {121, 1}, - {122, 2}, }; -static arc arcs_55_1[1] = { - {119, 2}, +static arc arcs_55_1[2] = { + {122, 0}, + {0, 1}, +}; +static state states_55[2] = { + {1, arcs_55_0}, + {2, arcs_55_1}, }; -static arc arcs_55_2[1] = { +static arc arcs_56_0[2] = { + {123, 1}, + {124, 2}, +}; +static arc arcs_56_1[1] = { + {121, 2}, +}; +static arc arcs_56_2[1] = { {0, 2}, }; -static state states_55[3] = { - {2, arcs_55_0}, - {1, arcs_55_1}, - {1, arcs_55_2}, +static state states_56[3] = { + {2, arcs_56_0}, + {1, arcs_56_1}, + {1, arcs_56_2}, }; -static arc arcs_56_0[1] = { - {108, 1}, +static arc arcs_57_0[1] = { + {109, 1}, }; -static arc arcs_56_1[2] = { - {123, 0}, +static arc arcs_57_1[2] = { + {125, 0}, {0, 1}, }; -static state states_56[2] = { - {1, arcs_56_0}, - {2, arcs_56_1}, +static state states_57[2] = { + {1, arcs_57_0}, + {2, arcs_57_1}, }; -static arc arcs_57_0[10] = { - {124, 1}, - {125, 1}, +static arc arcs_58_0[10] = { {126, 1}, {127, 1}, {128, 1}, {129, 1}, {130, 1}, - {102, 1}, - {121, 2}, - {131, 3}, + {131, 1}, + {132, 1}, + {103, 1}, + {123, 2}, + {133, 3}, }; -static arc arcs_57_1[1] = { +static arc arcs_58_1[1] = { {0, 1}, }; -static arc arcs_57_2[1] = { - {102, 1}, +static arc arcs_58_2[1] = { + {103, 1}, }; -static arc arcs_57_3[2] = { - {121, 1}, +static arc arcs_58_3[2] = { + {123, 1}, {0, 3}, }; -static state states_57[4] = { - {10, arcs_57_0}, - {1, arcs_57_1}, - {1, arcs_57_2}, - {2, arcs_57_3}, -}; -static arc arcs_58_0[1] = { - {33, 1}, -}; -static arc arcs_58_1[1] = { - {108, 2}, -}; -static arc arcs_58_2[1] = { - {0, 2}, -}; -static state states_58[3] = { - {1, arcs_58_0}, +static state states_58[4] = { + {10, arcs_58_0}, {1, arcs_58_1}, {1, arcs_58_2}, + {2, arcs_58_3}, }; static arc arcs_59_0[1] = { - {132, 1}, + {33, 1}, }; -static arc arcs_59_1[2] = { - {133, 0}, - {0, 1}, +static arc arcs_59_1[1] = { + {109, 2}, }; -static state states_59[2] = { +static arc arcs_59_2[1] = { + {0, 2}, +}; +static state states_59[3] = { {1, arcs_59_0}, - {2, arcs_59_1}, + {1, arcs_59_1}, + {1, arcs_59_2}, }; static arc arcs_60_0[1] = { {134, 1}, @@ -1361,21 +1369,20 @@ static state states_61[2] = { static arc arcs_62_0[1] = { {138, 1}, }; -static arc arcs_62_1[3] = { +static arc arcs_62_1[2] = { {139, 0}, - {140, 0}, {0, 1}, }; static state states_62[2] = { {1, arcs_62_0}, - {3, arcs_62_1}, + {2, arcs_62_1}, }; static arc arcs_63_0[1] = { - {141, 1}, + {140, 1}, }; static arc arcs_63_1[3] = { + {141, 0}, {142, 0}, - {143, 0}, {0, 1}, }; static state states_63[2] = { @@ -1383,543 +1390,556 @@ static state states_63[2] = { {3, arcs_63_1}, }; static arc arcs_64_0[1] = { - {144, 1}, + {143, 1}, }; -static arc arcs_64_1[6] = { - {33, 0}, - {11, 0}, +static arc arcs_64_1[3] = { + {144, 0}, {145, 0}, - {146, 0}, - {147, 0}, {0, 1}, }; static state states_64[2] = { {1, arcs_64_0}, - {6, arcs_64_1}, + {3, arcs_64_1}, }; -static arc arcs_65_0[4] = { - {142, 1}, - {143, 1}, - {148, 1}, - {149, 2}, +static arc arcs_65_0[1] = { + {146, 1}, }; -static arc arcs_65_1[1] = { - {144, 2}, +static arc arcs_65_1[6] = { + {33, 0}, + 
{11, 0}, + {147, 0}, + {148, 0}, + {149, 0}, + {0, 1}, }; -static arc arcs_65_2[1] = { +static state states_65[2] = { + {1, arcs_65_0}, + {6, arcs_65_1}, +}; +static arc arcs_66_0[4] = { + {144, 1}, + {145, 1}, + {150, 1}, + {151, 2}, +}; +static arc arcs_66_1[1] = { + {146, 2}, +}; +static arc arcs_66_2[1] = { {0, 2}, }; -static state states_65[3] = { - {4, arcs_65_0}, - {1, arcs_65_1}, - {1, arcs_65_2}, +static state states_66[3] = { + {4, arcs_66_0}, + {1, arcs_66_1}, + {1, arcs_66_2}, }; -static arc arcs_66_0[1] = { - {150, 1}, +static arc arcs_67_0[1] = { + {152, 1}, }; -static arc arcs_66_1[2] = { +static arc arcs_67_1[2] = { {34, 2}, {0, 1}, }; -static arc arcs_66_2[1] = { - {144, 3}, +static arc arcs_67_2[1] = { + {146, 3}, }; -static arc arcs_66_3[1] = { +static arc arcs_67_3[1] = { {0, 3}, }; -static state states_66[4] = { - {1, arcs_66_0}, - {2, arcs_66_1}, - {1, arcs_66_2}, - {1, arcs_66_3}, +static state states_67[4] = { + {1, arcs_67_0}, + {2, arcs_67_1}, + {1, arcs_67_2}, + {1, arcs_67_3}, }; -static arc arcs_67_0[2] = { - {151, 1}, - {152, 2}, +static arc arcs_68_0[2] = { + {153, 1}, + {154, 2}, }; -static arc arcs_67_1[1] = { - {152, 2}, +static arc arcs_68_1[1] = { + {154, 2}, }; -static arc arcs_67_2[2] = { - {153, 2}, +static arc arcs_68_2[2] = { + {155, 2}, {0, 2}, }; -static state states_67[3] = { - {2, arcs_67_0}, - {1, arcs_67_1}, - {2, arcs_67_2}, +static state states_68[3] = { + {2, arcs_68_0}, + {1, arcs_68_1}, + {2, arcs_68_2}, }; -static arc arcs_68_0[10] = { +static arc arcs_69_0[10] = { {13, 1}, - {155, 2}, - {157, 3}, + {157, 2}, + {159, 3}, {23, 4}, - {160, 4}, - {161, 5}, - {83, 4}, {162, 4}, - {163, 4}, + {163, 5}, + {83, 4}, {164, 4}, + {165, 4}, + {166, 4}, }; -static arc arcs_68_1[3] = { +static arc arcs_69_1[3] = { {50, 6}, - {154, 6}, + {156, 6}, {15, 4}, }; -static arc arcs_68_2[2] = { - {154, 7}, - {156, 4}, +static arc arcs_69_2[2] = { + {156, 7}, + {158, 4}, }; -static arc arcs_68_3[2] = { - {158, 8}, - {159, 4}, +static arc arcs_69_3[2] = { + {160, 8}, + {161, 4}, }; -static arc arcs_68_4[1] = { +static arc arcs_69_4[1] = { {0, 4}, }; -static arc arcs_68_5[2] = { - {161, 5}, +static arc arcs_69_5[2] = { + {163, 5}, {0, 5}, }; -static arc arcs_68_6[1] = { +static arc arcs_69_6[1] = { {15, 4}, }; -static arc arcs_68_7[1] = { - {156, 4}, +static arc arcs_69_7[1] = { + {158, 4}, }; -static arc arcs_68_8[1] = { - {159, 4}, +static arc arcs_69_8[1] = { + {161, 4}, }; -static state states_68[9] = { - {10, arcs_68_0}, - {3, arcs_68_1}, - {2, arcs_68_2}, - {2, arcs_68_3}, - {1, arcs_68_4}, - {2, arcs_68_5}, - {1, arcs_68_6}, - {1, arcs_68_7}, - {1, arcs_68_8}, -}; -static arc arcs_69_0[2] = { - {26, 1}, +static state states_69[9] = { + {10, arcs_69_0}, + {3, arcs_69_1}, + {2, arcs_69_2}, + {2, arcs_69_3}, + {1, arcs_69_4}, + {2, arcs_69_5}, + {1, arcs_69_6}, + {1, arcs_69_7}, + {1, arcs_69_8}, +}; +static arc arcs_70_0[2] = { + {98, 1}, {51, 1}, }; -static arc arcs_69_1[3] = { - {165, 2}, +static arc arcs_70_1[3] = { + {167, 2}, {32, 3}, {0, 1}, }; -static arc arcs_69_2[1] = { +static arc arcs_70_2[1] = { {0, 2}, }; -static arc arcs_69_3[3] = { - {26, 4}, +static arc arcs_70_3[3] = { + {98, 4}, {51, 4}, {0, 3}, }; -static arc arcs_69_4[2] = { +static arc arcs_70_4[2] = { {32, 3}, {0, 4}, }; -static state states_69[5] = { - {2, arcs_69_0}, - {3, arcs_69_1}, - {1, arcs_69_2}, - {3, arcs_69_3}, - {2, arcs_69_4}, +static state states_70[5] = { + {2, arcs_70_0}, + {3, arcs_70_1}, + {1, arcs_70_2}, + {3, arcs_70_3}, + {2, arcs_70_4}, }; -static arc 
arcs_70_0[3] = { +static arc arcs_71_0[3] = { {13, 1}, - {155, 2}, + {157, 2}, {82, 3}, }; -static arc arcs_70_1[2] = { +static arc arcs_71_1[2] = { {14, 4}, {15, 5}, }; -static arc arcs_70_2[1] = { - {166, 6}, +static arc arcs_71_2[1] = { + {168, 6}, }; -static arc arcs_70_3[1] = { +static arc arcs_71_3[1] = { {23, 5}, }; -static arc arcs_70_4[1] = { +static arc arcs_71_4[1] = { {15, 5}, }; -static arc arcs_70_5[1] = { +static arc arcs_71_5[1] = { {0, 5}, }; -static arc arcs_70_6[1] = { - {156, 5}, +static arc arcs_71_6[1] = { + {158, 5}, }; -static state states_70[7] = { - {3, arcs_70_0}, - {2, arcs_70_1}, - {1, arcs_70_2}, - {1, arcs_70_3}, - {1, arcs_70_4}, - {1, arcs_70_5}, - {1, arcs_70_6}, +static state states_71[7] = { + {3, arcs_71_0}, + {2, arcs_71_1}, + {1, arcs_71_2}, + {1, arcs_71_3}, + {1, arcs_71_4}, + {1, arcs_71_5}, + {1, arcs_71_6}, }; -static arc arcs_71_0[1] = { - {167, 1}, +static arc arcs_72_0[1] = { + {169, 1}, }; -static arc arcs_71_1[2] = { +static arc arcs_72_1[2] = { {32, 2}, {0, 1}, }; -static arc arcs_71_2[2] = { - {167, 1}, +static arc arcs_72_2[2] = { + {169, 1}, {0, 2}, }; -static state states_71[3] = { - {1, arcs_71_0}, - {2, arcs_71_1}, - {2, arcs_71_2}, +static state states_72[3] = { + {1, arcs_72_0}, + {2, arcs_72_1}, + {2, arcs_72_2}, }; -static arc arcs_72_0[2] = { +static arc arcs_73_0[2] = { {26, 1}, {27, 2}, }; -static arc arcs_72_1[2] = { +static arc arcs_73_1[2] = { {27, 2}, {0, 1}, }; -static arc arcs_72_2[3] = { +static arc arcs_73_2[3] = { {26, 3}, - {168, 4}, + {170, 4}, {0, 2}, }; -static arc arcs_72_3[2] = { - {168, 4}, +static arc arcs_73_3[2] = { + {170, 4}, {0, 3}, }; -static arc arcs_72_4[1] = { +static arc arcs_73_4[1] = { {0, 4}, }; -static state states_72[5] = { - {2, arcs_72_0}, - {2, arcs_72_1}, - {3, arcs_72_2}, - {2, arcs_72_3}, - {1, arcs_72_4}, +static state states_73[5] = { + {2, arcs_73_0}, + {2, arcs_73_1}, + {3, arcs_73_2}, + {2, arcs_73_3}, + {1, arcs_73_4}, }; -static arc arcs_73_0[1] = { +static arc arcs_74_0[1] = { {27, 1}, }; -static arc arcs_73_1[2] = { +static arc arcs_74_1[2] = { {26, 2}, {0, 1}, }; -static arc arcs_73_2[1] = { +static arc arcs_74_2[1] = { {0, 2}, }; -static state states_73[3] = { - {1, arcs_73_0}, - {2, arcs_73_1}, - {1, arcs_73_2}, +static state states_74[3] = { + {1, arcs_74_0}, + {2, arcs_74_1}, + {1, arcs_74_2}, }; -static arc arcs_74_0[2] = { - {108, 1}, +static arc arcs_75_0[2] = { + {109, 1}, {51, 1}, }; -static arc arcs_74_1[2] = { +static arc arcs_75_1[2] = { {32, 2}, {0, 1}, }; -static arc arcs_74_2[3] = { - {108, 1}, +static arc arcs_75_2[3] = { + {109, 1}, {51, 1}, {0, 2}, }; -static state states_74[3] = { - {2, arcs_74_0}, - {2, arcs_74_1}, - {3, arcs_74_2}, +static state states_75[3] = { + {2, arcs_75_0}, + {2, arcs_75_1}, + {3, arcs_75_2}, }; -static arc arcs_75_0[1] = { +static arc arcs_76_0[1] = { {26, 1}, }; -static arc arcs_75_1[2] = { +static arc arcs_76_1[2] = { {32, 2}, {0, 1}, }; -static arc arcs_75_2[2] = { +static arc arcs_76_2[2] = { {26, 1}, {0, 2}, }; -static state states_75[3] = { - {1, arcs_75_0}, - {2, arcs_75_1}, - {2, arcs_75_2}, +static state states_76[3] = { + {1, arcs_76_0}, + {2, arcs_76_1}, + {2, arcs_76_2}, }; -static arc arcs_76_0[3] = { +static arc arcs_77_0[3] = { {26, 1}, {34, 2}, {51, 3}, }; -static arc arcs_76_1[4] = { +static arc arcs_77_1[4] = { {27, 4}, - {165, 5}, + {167, 5}, {32, 6}, {0, 1}, }; -static arc arcs_76_2[1] = { - {108, 7}, +static arc arcs_77_2[1] = { + {109, 7}, }; -static arc arcs_76_3[3] = { - {165, 5}, +static arc arcs_77_3[3] = { + 
{167, 5}, {32, 6}, {0, 3}, }; -static arc arcs_76_4[1] = { +static arc arcs_77_4[1] = { {26, 7}, }; -static arc arcs_76_5[1] = { +static arc arcs_77_5[1] = { {0, 5}, }; -static arc arcs_76_6[3] = { +static arc arcs_77_6[3] = { {26, 8}, {51, 8}, {0, 6}, }; -static arc arcs_76_7[3] = { - {165, 5}, +static arc arcs_77_7[3] = { + {167, 5}, {32, 9}, {0, 7}, }; -static arc arcs_76_8[2] = { +static arc arcs_77_8[2] = { {32, 6}, {0, 8}, }; -static arc arcs_76_9[3] = { +static arc arcs_77_9[3] = { {26, 10}, {34, 11}, {0, 9}, }; -static arc arcs_76_10[1] = { +static arc arcs_77_10[1] = { {27, 12}, }; -static arc arcs_76_11[1] = { - {108, 13}, +static arc arcs_77_11[1] = { + {109, 13}, }; -static arc arcs_76_12[1] = { +static arc arcs_77_12[1] = { {26, 13}, }; -static arc arcs_76_13[2] = { +static arc arcs_77_13[2] = { {32, 9}, {0, 13}, }; -static state states_76[14] = { - {3, arcs_76_0}, - {4, arcs_76_1}, - {1, arcs_76_2}, - {3, arcs_76_3}, - {1, arcs_76_4}, - {1, arcs_76_5}, - {3, arcs_76_6}, - {3, arcs_76_7}, - {2, arcs_76_8}, - {3, arcs_76_9}, - {1, arcs_76_10}, - {1, arcs_76_11}, - {1, arcs_76_12}, - {2, arcs_76_13}, -}; -static arc arcs_77_0[1] = { - {169, 1}, +static state states_77[14] = { + {3, arcs_77_0}, + {4, arcs_77_1}, + {1, arcs_77_2}, + {3, arcs_77_3}, + {1, arcs_77_4}, + {1, arcs_77_5}, + {3, arcs_77_6}, + {3, arcs_77_7}, + {2, arcs_77_8}, + {3, arcs_77_9}, + {1, arcs_77_10}, + {1, arcs_77_11}, + {1, arcs_77_12}, + {2, arcs_77_13}, }; -static arc arcs_77_1[1] = { +static arc arcs_78_0[1] = { + {171, 1}, +}; +static arc arcs_78_1[1] = { {23, 2}, }; -static arc arcs_77_2[2] = { +static arc arcs_78_2[2] = { {13, 3}, {27, 4}, }; -static arc arcs_77_3[2] = { +static arc arcs_78_3[2] = { {14, 5}, {15, 6}, }; -static arc arcs_77_4[1] = { +static arc arcs_78_4[1] = { {28, 7}, }; -static arc arcs_77_5[1] = { +static arc arcs_78_5[1] = { {15, 6}, }; -static arc arcs_77_6[1] = { +static arc arcs_78_6[1] = { {27, 4}, }; -static arc arcs_77_7[1] = { +static arc arcs_78_7[1] = { {0, 7}, }; -static state states_77[8] = { - {1, arcs_77_0}, - {1, arcs_77_1}, - {2, arcs_77_2}, - {2, arcs_77_3}, - {1, arcs_77_4}, - {1, arcs_77_5}, - {1, arcs_77_6}, - {1, arcs_77_7}, +static state states_78[8] = { + {1, arcs_78_0}, + {1, arcs_78_1}, + {2, arcs_78_2}, + {2, arcs_78_3}, + {1, arcs_78_4}, + {1, arcs_78_5}, + {1, arcs_78_6}, + {1, arcs_78_7}, }; -static arc arcs_78_0[1] = { - {170, 1}, +static arc arcs_79_0[1] = { + {172, 1}, }; -static arc arcs_78_1[2] = { +static arc arcs_79_1[2] = { {32, 2}, {0, 1}, }; -static arc arcs_78_2[2] = { - {170, 1}, +static arc arcs_79_2[2] = { + {172, 1}, {0, 2}, }; -static state states_78[3] = { - {1, arcs_78_0}, - {2, arcs_78_1}, - {2, arcs_78_2}, +static state states_79[3] = { + {1, arcs_79_0}, + {2, arcs_79_1}, + {2, arcs_79_2}, }; -static arc arcs_79_0[3] = { +static arc arcs_80_0[3] = { {26, 1}, {34, 2}, {33, 2}, }; -static arc arcs_79_1[3] = { - {165, 3}, +static arc arcs_80_1[4] = { + {167, 3}, + {113, 2}, {31, 2}, {0, 1}, }; -static arc arcs_79_2[1] = { +static arc arcs_80_2[1] = { {26, 3}, }; -static arc arcs_79_3[1] = { +static arc arcs_80_3[1] = { {0, 3}, }; -static state states_79[4] = { - {3, arcs_79_0}, - {3, arcs_79_1}, - {1, arcs_79_2}, - {1, arcs_79_3}, +static state states_80[4] = { + {3, arcs_80_0}, + {4, arcs_80_1}, + {1, arcs_80_2}, + {1, arcs_80_3}, }; -static arc arcs_80_0[2] = { - {165, 1}, - {172, 1}, +static arc arcs_81_0[2] = { + {167, 1}, + {174, 1}, }; -static arc arcs_80_1[1] = { +static arc arcs_81_1[1] = { {0, 1}, }; -static state states_80[2] 
= { - {2, arcs_80_0}, - {1, arcs_80_1}, +static state states_81[2] = { + {2, arcs_81_0}, + {1, arcs_81_1}, }; -static arc arcs_81_0[1] = { - {101, 1}, +static arc arcs_82_0[1] = { + {102, 1}, }; -static arc arcs_81_1[1] = { +static arc arcs_82_1[1] = { {66, 2}, }; -static arc arcs_81_2[1] = { - {102, 3}, +static arc arcs_82_2[1] = { + {103, 3}, }; -static arc arcs_81_3[1] = { - {112, 4}, +static arc arcs_82_3[1] = { + {114, 4}, }; -static arc arcs_81_4[2] = { - {171, 5}, +static arc arcs_82_4[2] = { + {173, 5}, {0, 4}, }; -static arc arcs_81_5[1] = { +static arc arcs_82_5[1] = { {0, 5}, }; -static state states_81[6] = { - {1, arcs_81_0}, - {1, arcs_81_1}, - {1, arcs_81_2}, - {1, arcs_81_3}, - {2, arcs_81_4}, - {1, arcs_81_5}, +static state states_82[6] = { + {1, arcs_82_0}, + {1, arcs_82_1}, + {1, arcs_82_2}, + {1, arcs_82_3}, + {2, arcs_82_4}, + {1, arcs_82_5}, }; -static arc arcs_82_0[2] = { +static arc arcs_83_0[2] = { {21, 1}, - {173, 2}, + {175, 2}, }; -static arc arcs_82_1[1] = { - {173, 2}, +static arc arcs_83_1[1] = { + {175, 2}, }; -static arc arcs_82_2[1] = { +static arc arcs_83_2[1] = { {0, 2}, }; -static state states_82[3] = { - {2, arcs_82_0}, - {1, arcs_82_1}, - {1, arcs_82_2}, +static state states_83[3] = { + {2, arcs_83_0}, + {1, arcs_83_1}, + {1, arcs_83_2}, }; -static arc arcs_83_0[1] = { +static arc arcs_84_0[1] = { {97, 1}, }; -static arc arcs_83_1[1] = { - {114, 2}, +static arc arcs_84_1[1] = { + {116, 2}, }; -static arc arcs_83_2[2] = { - {171, 3}, +static arc arcs_84_2[2] = { + {173, 3}, {0, 2}, }; -static arc arcs_83_3[1] = { +static arc arcs_84_3[1] = { {0, 3}, }; -static state states_83[4] = { - {1, arcs_83_0}, - {1, arcs_83_1}, - {2, arcs_83_2}, - {1, arcs_83_3}, +static state states_84[4] = { + {1, arcs_84_0}, + {1, arcs_84_1}, + {2, arcs_84_2}, + {1, arcs_84_3}, }; -static arc arcs_84_0[1] = { +static arc arcs_85_0[1] = { {23, 1}, }; -static arc arcs_84_1[1] = { +static arc arcs_85_1[1] = { {0, 1}, }; -static state states_84[2] = { - {1, arcs_84_0}, - {1, arcs_84_1}, +static state states_85[2] = { + {1, arcs_85_0}, + {1, arcs_85_1}, }; -static arc arcs_85_0[1] = { - {175, 1}, +static arc arcs_86_0[1] = { + {177, 1}, }; -static arc arcs_85_1[2] = { - {176, 2}, +static arc arcs_86_1[2] = { + {178, 2}, {0, 1}, }; -static arc arcs_85_2[1] = { +static arc arcs_86_2[1] = { {0, 2}, }; -static state states_85[3] = { - {1, arcs_85_0}, - {2, arcs_85_1}, - {1, arcs_85_2}, +static state states_86[3] = { + {1, arcs_86_0}, + {2, arcs_86_1}, + {1, arcs_86_2}, }; -static arc arcs_86_0[2] = { +static arc arcs_87_0[2] = { {77, 1}, {47, 2}, }; -static arc arcs_86_1[1] = { +static arc arcs_87_1[1] = { {26, 2}, }; -static arc arcs_86_2[1] = { +static arc arcs_87_2[1] = { {0, 2}, }; -static state states_86[3] = { - {2, arcs_86_0}, - {1, arcs_86_1}, - {1, arcs_86_2}, +static state states_87[3] = { + {2, arcs_87_0}, + {1, arcs_87_1}, + {1, arcs_87_2}, }; -static dfa dfas[87] = { +static dfa dfas[88] = { {256, "single_input", 0, 3, states_0, - "\004\050\340\000\002\000\000\000\012\076\011\007\262\004\020\002\000\300\220\050\037\202\000"}, + "\004\050\340\000\002\000\000\000\012\076\011\007\142\011\100\010\000\000\103\242\174\010\002"}, {257, "file_input", 0, 2, states_1, - "\204\050\340\000\002\000\000\000\012\076\011\007\262\004\020\002\000\300\220\050\037\202\000"}, + "\204\050\340\000\002\000\000\000\012\076\011\007\142\011\100\010\000\000\103\242\174\010\002"}, {258, "eval_input", 0, 3, states_2, - 
"\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, {259, "decorator", 0, 7, states_3, "\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {260, "decorators", 0, 2, states_4, @@ -1941,17 +1961,17 @@ static dfa dfas[87] = { {268, "vfpdef", 0, 2, states_12, "\000\000\200\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {269, "stmt", 0, 2, states_13, - "\000\050\340\000\002\000\000\000\012\076\011\007\262\004\020\002\000\300\220\050\037\202\000"}, + "\000\050\340\000\002\000\000\000\012\076\011\007\142\011\100\010\000\000\103\242\174\010\002"}, {270, "simple_stmt", 0, 4, states_14, - "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\020\002\000\300\220\050\037\200\000"}, + "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\100\010\000\000\103\242\174\000\002"}, {271, "small_stmt", 0, 2, states_15, - "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\020\002\000\300\220\050\037\200\000"}, + "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\100\010\000\000\103\242\174\000\002"}, {272, "expr_stmt", 0, 6, states_16, - "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, + "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, {273, "annassign", 0, 5, states_17, "\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {274, "testlist_star_expr", 0, 3, states_18, - "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, + "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, {275, "augassign", 0, 2, states_19, "\000\000\000\000\000\000\360\377\001\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {276, "del_stmt", 0, 3, states_20, @@ -1959,7 +1979,7 @@ static dfa dfas[87] = { {277, "pass_stmt", 0, 2, states_21, "\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {278, "flow_stmt", 0, 2, states_22, - "\000\000\000\000\000\000\000\000\000\036\000\000\000\000\000\000\000\000\000\000\000\200\000"}, + "\000\000\000\000\000\000\000\000\000\036\000\000\000\000\000\000\000\000\000\000\000\000\002"}, {279, "break_stmt", 0, 2, states_23, "\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {280, "continue_stmt", 0, 2, states_24, @@ -1967,7 +1987,7 @@ static dfa dfas[87] = { {281, "return_stmt", 0, 3, states_25, "\000\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {282, "yield_stmt", 0, 2, states_26, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\200\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\002"}, {283, "raise_stmt", 0, 5, states_27, "\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {284, "import_stmt", 0, 2, states_28, @@ -1993,103 +2013,105 @@ static dfa dfas[87] = { {294, "assert_stmt", 0, 5, states_38, "\000\000\000\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000\000\000"}, {295, "compound_stmt", 0, 2, states_39, - "\000\010\140\000\000\000\000\000\000\000\000\000\262\004\000\000\000\000\000\000\000\002\000"}, + 
"\000\010\140\000\000\000\000\000\000\000\000\000\142\011\000\000\000\000\000\000\000\010\000"}, {296, "async_stmt", 0, 3, states_40, "\000\000\040\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {297, "if_stmt", 0, 8, states_41, "\000\000\000\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000"}, {298, "while_stmt", 0, 8, states_42, - "\000\000\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\000\000"}, - {299, "for_stmt", 0, 10, states_43, "\000\000\000\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000\000"}, + {299, "for_stmt", 0, 10, states_43, + "\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000"}, {300, "try_stmt", 0, 13, states_44, - "\000\000\000\000\000\000\000\000\000\000\000\000\200\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000\000"}, {301, "with_stmt", 0, 5, states_45, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000"}, {302, "with_item", 0, 4, states_46, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, {303, "except_clause", 0, 5, states_47, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000"}, {304, "suite", 0, 5, states_48, - "\004\040\200\000\002\000\000\000\012\076\011\007\000\000\020\002\000\300\220\050\037\200\000"}, - {305, "test", 0, 6, states_49, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, - {306, "test_nocond", 0, 2, states_50, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, - {307, "lambdef", 0, 5, states_51, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000"}, - {308, "lambdef_nocond", 0, 5, states_52, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000"}, - {309, "or_test", 0, 2, states_53, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\002\000\300\220\050\037\000\000"}, - {310, "and_test", 0, 2, states_54, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\002\000\300\220\050\037\000\000"}, - {311, "not_test", 0, 3, states_55, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\002\000\300\220\050\037\000\000"}, - {312, "comparison", 0, 2, states_56, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"}, - {313, "comp_op", 0, 4, states_57, - "\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\362\017\000\000\000\000\000\000"}, - {314, "star_expr", 0, 3, states_58, + "\004\040\200\000\002\000\000\000\012\076\011\007\000\000\100\010\000\000\103\242\174\000\002"}, + {305, "namedexpr_test", 0, 4, states_49, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {306, "test", 0, 6, states_50, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {307, "test_nocond", 0, 2, states_51, + 
"\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {308, "lambdef", 0, 5, states_52, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000"}, + {309, "lambdef_nocond", 0, 5, states_53, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000"}, + {310, "or_test", 0, 2, states_54, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\010\000\000\103\242\174\000\000"}, + {311, "and_test", 0, 2, states_55, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\010\000\000\103\242\174\000\000"}, + {312, "not_test", 0, 3, states_56, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\010\000\000\103\242\174\000\000"}, + {313, "comparison", 0, 2, states_57, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + {314, "comp_op", 0, 4, states_58, + "\000\000\000\000\000\000\000\000\000\000\000\000\200\000\000\310\077\000\000\000\000\000\000"}, + {315, "star_expr", 0, 3, states_59, "\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, - {315, "expr", 0, 2, states_59, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"}, - {316, "xor_expr", 0, 2, states_60, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"}, - {317, "and_expr", 0, 2, states_61, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"}, - {318, "shift_expr", 0, 2, states_62, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"}, - {319, "arith_expr", 0, 2, states_63, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"}, - {320, "term", 0, 2, states_64, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"}, - {321, "factor", 0, 3, states_65, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"}, - {322, "power", 0, 4, states_66, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\200\050\037\000\000"}, - {323, "atom_expr", 0, 3, states_67, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\200\050\037\000\000"}, - {324, "atom", 0, 9, states_68, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\050\037\000\000"}, - {325, "testlist_comp", 0, 5, states_69, - "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, - {326, "trailer", 0, 7, states_70, - "\000\040\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\010\000\000\000"}, - {327, "subscriptlist", 0, 3, states_71, - "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, - {328, "subscript", 0, 5, states_72, - "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, - {329, "sliceop", 0, 3, states_73, + {316, "expr", 0, 2, states_60, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + {317, "xor_expr", 0, 2, states_61, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + {318, "and_expr", 0, 2, states_62, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + {319, "shift_expr", 
0, 2, states_63, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + {320, "arith_expr", 0, 2, states_64, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + {321, "term", 0, 2, states_65, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + {322, "factor", 0, 3, states_66, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + {323, "power", 0, 4, states_67, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\242\174\000\000"}, + {324, "atom_expr", 0, 3, states_68, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\242\174\000\000"}, + {325, "atom", 0, 9, states_69, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\240\174\000\000"}, + {326, "testlist_comp", 0, 5, states_70, + "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {327, "trailer", 0, 7, states_71, + "\000\040\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\040\000\000\000"}, + {328, "subscriptlist", 0, 3, states_72, + "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {329, "subscript", 0, 5, states_73, + "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {330, "sliceop", 0, 3, states_74, "\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, - {330, "exprlist", 0, 3, states_74, - "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\000\000\000\300\220\050\037\000\000"}, - {331, "testlist", 0, 3, states_75, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, - {332, "dictorsetmaker", 0, 14, states_76, - "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, - {333, "classdef", 0, 8, states_77, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\002\000"}, - {334, "arglist", 0, 3, states_78, - "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, - {335, "argument", 0, 4, states_79, - "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\020\002\000\300\220\050\037\000\000"}, - {336, "comp_iter", 0, 2, states_80, - "\000\000\040\000\000\000\000\000\000\000\000\000\042\000\000\000\000\000\000\000\000\000\000"}, - {337, "sync_comp_for", 0, 6, states_81, - "\000\000\000\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000\000"}, - {338, "comp_for", 0, 3, states_82, - "\000\000\040\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000\000"}, - {339, "comp_if", 0, 4, states_83, + {331, "exprlist", 0, 3, states_75, + "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + {332, "testlist", 0, 3, states_76, + "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {333, "dictorsetmaker", 0, 14, states_77, + "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {334, "classdef", 0, 8, states_78, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\010\000"}, + {335, "arglist", 0, 3, states_79, + 
"\000\040\200\000\006\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {336, "argument", 0, 4, states_80, + "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + {337, "comp_iter", 0, 2, states_81, + "\000\000\040\000\000\000\000\000\000\000\000\000\102\000\000\000\000\000\000\000\000\000\000"}, + {338, "sync_comp_for", 0, 6, states_82, + "\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000"}, + {339, "comp_for", 0, 3, states_83, + "\000\000\040\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000"}, + {340, "comp_if", 0, 4, states_84, "\000\000\000\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000"}, - {340, "encoding_decl", 0, 2, states_84, + {341, "encoding_decl", 0, 2, states_85, "\000\000\200\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, - {341, "yield_expr", 0, 3, states_85, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\200\000"}, - {342, "yield_arg", 0, 3, states_86, - "\000\040\200\000\002\000\000\000\000\040\010\000\000\000\020\002\000\300\220\050\037\000\000"}, + {342, "yield_expr", 0, 3, states_86, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\002"}, + {343, "yield_arg", 0, 3, states_87, + "\000\040\200\000\002\000\000\000\000\040\010\000\000\000\100\010\000\000\103\242\174\000\000"}, }; -static label labels[177] = { +static label labels[179] = { {0, "EMPTY"}, {256, 0}, {4, 0}, @@ -2099,16 +2121,16 @@ static label labels[177] = { {269, 0}, {0, 0}, {258, 0}, - {331, 0}, + {332, 0}, {259, 0}, {49, 0}, {291, 0}, {7, 0}, - {334, 0}, + {335, 0}, {8, 0}, {260, 0}, {261, 0}, - {333, 0}, + {334, 0}, {263, 0}, {262, 0}, {1, "async"}, @@ -2116,7 +2138,7 @@ static label labels[177] = { {1, 0}, {264, 0}, {51, 0}, - {305, 0}, + {306, 0}, {11, 0}, {304, 0}, {265, 0}, @@ -2140,8 +2162,8 @@ static label labels[177] = { {274, 0}, {273, 0}, {275, 0}, - {341, 0}, - {314, 0}, + {342, 0}, + {315, 0}, {36, 0}, {37, 0}, {38, 0}, @@ -2156,7 +2178,7 @@ static label labels[177] = { {46, 0}, {48, 0}, {1, "del"}, - {330, 0}, + {331, 0}, {1, "pass"}, {279, 0}, {280, 0}, @@ -2188,6 +2210,7 @@ static label labels[177] = { {301, 0}, {296, 0}, {1, "if"}, + {305, 0}, {1, "elif"}, {1, "else"}, {1, "while"}, @@ -2198,22 +2221,23 @@ static label labels[177] = { {1, "finally"}, {1, "with"}, {302, 0}, - {315, 0}, + {316, 0}, {1, "except"}, {5, 0}, {6, 0}, - {309, 0}, - {307, 0}, - {306, 0}, + {53, 0}, + {310, 0}, {308, 0}, + {307, 0}, + {309, 0}, {1, "lambda"}, - {310, 0}, - {1, "or"}, {311, 0}, + {1, "or"}, + {312, 0}, {1, "and"}, {1, "not"}, - {312, 0}, {313, 0}, + {314, 0}, {20, 0}, {21, 0}, {27, 0}, @@ -2222,55 +2246,55 @@ static label labels[177] = { {28, 0}, {28, 0}, {1, "is"}, - {316, 0}, - {18, 0}, {317, 0}, - {32, 0}, + {18, 0}, {318, 0}, - {19, 0}, + {32, 0}, {319, 0}, + {19, 0}, + {320, 0}, {33, 0}, {34, 0}, - {320, 0}, + {321, 0}, {14, 0}, {15, 0}, - {321, 0}, + {322, 0}, {17, 0}, {24, 0}, {47, 0}, {31, 0}, - {322, 0}, {323, 0}, - {1, "await"}, {324, 0}, - {326, 0}, + {1, "await"}, {325, 0}, + {327, 0}, + {326, 0}, {9, 0}, {10, 0}, {25, 0}, - {332, 0}, + {333, 0}, {26, 0}, {2, 0}, {3, 0}, {1, "None"}, {1, "True"}, {1, "False"}, - {338, 0}, - {327, 0}, + {339, 0}, {328, 0}, {329, 0}, + {330, 0}, {1, "class"}, - {335, 0}, {336, 0}, - {339, 0}, {337, 0}, {340, 0}, + {338, 0}, + {341, 0}, {1, "yield"}, - {342, 0}, + 
{343, 0}, }; grammar _PyParser_Grammar = { - 87, + 88, dfas, - {177, labels}, + {179, labels}, 256 }; diff --git a/Python/symtable.c b/Python/symtable.c index 677b6043438e..879e19ab79e0 100644 --- a/Python/symtable.c +++ b/Python/symtable.c @@ -31,6 +31,9 @@ #define IMPORT_STAR_WARNING "import * only allowed at module level" +#define NAMED_EXPR_COMP_IN_CLASS \ +"named expression within a comprehension cannot be used in a class body" + static PySTEntryObject * ste_new(struct symtable *st, identifier name, _Py_block_ty block, void *key, int lineno, int col_offset) @@ -75,6 +78,7 @@ ste_new(struct symtable *st, identifier name, _Py_block_ty block, ste->ste_child_free = 0; ste->ste_generator = 0; ste->ste_coroutine = 0; + ste->ste_comprehension = 0; ste->ste_returns_value = 0; ste->ste_needs_class_closure = 0; @@ -972,7 +976,7 @@ symtable_lookup(struct symtable *st, PyObject *name) } static int -symtable_add_def(struct symtable *st, PyObject *name, int flag) +symtable_add_def_helper(struct symtable *st, PyObject *name, int flag, struct _symtable_entry *ste) { PyObject *o; PyObject *dict; @@ -982,15 +986,15 @@ symtable_add_def(struct symtable *st, PyObject *name, int flag) if (!mangled) return 0; - dict = st->st_cur->ste_symbols; + dict = ste->ste_symbols; if ((o = PyDict_GetItem(dict, mangled))) { val = PyLong_AS_LONG(o); if ((flag & DEF_PARAM) && (val & DEF_PARAM)) { /* Is it better to use 'mangled' or 'name' here? */ PyErr_Format(PyExc_SyntaxError, DUPLICATE_ARGUMENT, name); PyErr_SyntaxLocationObject(st->st_filename, - st->st_cur->ste_lineno, - st->st_cur->ste_col_offset + 1); + ste->ste_lineno, + ste->ste_col_offset + 1); goto error; } val |= flag; @@ -1006,7 +1010,7 @@ symtable_add_def(struct symtable *st, PyObject *name, int flag) Py_DECREF(o); if (flag & DEF_PARAM) { - if (PyList_Append(st->st_cur->ste_varnames, mangled) < 0) + if (PyList_Append(ste->ste_varnames, mangled) < 0) goto error; } else if (flag & DEF_GLOBAL) { /* XXX need to update DEF_GLOBAL for other flags too; @@ -1032,6 +1036,11 @@ symtable_add_def(struct symtable *st, PyObject *name, int flag) return 0; } +static int +symtable_add_def(struct symtable *st, PyObject *name, int flag) { + return symtable_add_def_helper(st, name, flag, st->st_cur); +} + /* VISIT, VISIT_SEQ and VIST_SEQ_TAIL take an ASDL type as their second argument. They use the ASDL name to synthesize the name of the C type and the visit function. 
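As a rough illustration (not part of the patch) of the PEP 572 scoping rules that these symbol-table changes implement for named expressions inside comprehensions:

    # Illustrative sketch only, assuming an interpreter built with this
    # patch series applied.
    total = [last := x for x in range(5)]
    print(last)        # 4 -- the ":=" target leaks into the module scope

    def f():
        [y := n for n in range(3)]
        return y       # 2 -- bound as a local of f(), not of the comprehension

    # In a class body the same pattern is rejected at compile time
    # ("named expression within a comprehension cannot be used in a
    # class body", raised as TargetScopeError in this patch series):
    #
    #     class C:
    #         values = [item := n for n in range(3)]
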
@@ -1082,7 +1091,7 @@ symtable_add_def(struct symtable *st, PyObject *name, int flag) } static int -symtable_record_directive(struct symtable *st, identifier name, stmt_ty s) +symtable_record_directive(struct symtable *st, identifier name, int lineno, int col_offset) { PyObject *data, *mangled; int res; @@ -1094,7 +1103,7 @@ symtable_record_directive(struct symtable *st, identifier name, stmt_ty s) mangled = _Py_Mangle(st->st_private, name); if (!mangled) return 0; - data = Py_BuildValue("(Nii)", mangled, s->lineno, s->col_offset); + data = Py_BuildValue("(Nii)", mangled, lineno, col_offset); if (!data) return 0; res = PyList_Append(st->st_cur->ste_directives, data); @@ -1280,7 +1289,7 @@ symtable_visit_stmt(struct symtable *st, stmt_ty s) } if (!symtable_add_def(st, name, DEF_GLOBAL)) VISIT_QUIT(st, 0); - if (!symtable_record_directive(st, name, s)) + if (!symtable_record_directive(st, name, s->lineno, s->col_offset)) VISIT_QUIT(st, 0); } break; @@ -1312,7 +1321,7 @@ symtable_visit_stmt(struct symtable *st, stmt_ty s) } if (!symtable_add_def(st, name, DEF_NONLOCAL)) VISIT_QUIT(st, 0); - if (!symtable_record_directive(st, name, s)) + if (!symtable_record_directive(st, name, s->lineno, s->col_offset)) VISIT_QUIT(st, 0); } break; @@ -1367,6 +1376,60 @@ symtable_visit_stmt(struct symtable *st, stmt_ty s) VISIT_QUIT(st, 1); } +static int +symtable_extend_namedexpr_scope(struct symtable *st, expr_ty e) +{ + assert(st->st_stack); + + Py_ssize_t i, size; + struct _symtable_entry *ste; + size = PyList_GET_SIZE(st->st_stack); + assert(size); + + /* Iterate over the stack in reverse and add to the nearest adequate scope */ + for (i = size - 1; i >= 0; i--) { + ste = (struct _symtable_entry *) PyList_GET_ITEM(st->st_stack, i); + + /* If our current entry is a comprehension, skip it */ + if (ste->ste_comprehension) { + continue; + } + + /* If we find a FunctionBlock entry, add as NONLOCAL/LOCAL */ + if (ste->ste_type == FunctionBlock) { + if (!symtable_add_def(st, e->v.Name.id, DEF_NONLOCAL)) + VISIT_QUIT(st, 0); + if (!symtable_record_directive(st, e->v.Name.id, e->lineno, e->col_offset)) + VISIT_QUIT(st, 0); + + return symtable_add_def_helper(st, e->v.Name.id, DEF_LOCAL, ste); + } + /* If we find a ModuleBlock entry, add as GLOBAL */ + if (ste->ste_type == ModuleBlock) { + if (!symtable_add_def(st, e->v.Name.id, DEF_GLOBAL)) + VISIT_QUIT(st, 0); + if (!symtable_record_directive(st, e->v.Name.id, e->lineno, e->col_offset)) + VISIT_QUIT(st, 0); + + return symtable_add_def_helper(st, e->v.Name.id, DEF_GLOBAL, ste); + } + /* Disallow usage in ClassBlock */ + if (ste->ste_type == ClassBlock) { + PyErr_Format(PyExc_TargetScopeError, NAMED_EXPR_COMP_IN_CLASS, e->v.Name.id); + PyErr_SyntaxLocationObject(st->st_filename, + e->lineno, + e->col_offset); + VISIT_QUIT(st, 0); + } + } + + /* We should always find either a FunctionBlock, ModuleBlock or ClassBlock + and should never fall to this case + */ + assert(0); + return 0; +} + static int symtable_visit_expr(struct symtable *st, expr_ty e) { @@ -1376,6 +1439,10 @@ symtable_visit_expr(struct symtable *st, expr_ty e) VISIT_QUIT(st, 0); } switch (e->kind) { + case NamedExpr_kind: + VISIT(st, expr, e->v.NamedExpr.value); + VISIT(st, expr, e->v.NamedExpr.target); + break; case BoolOp_kind: VISIT_SEQ(st, expr, e->v.BoolOp.values); break; @@ -1476,6 +1543,11 @@ symtable_visit_expr(struct symtable *st, expr_ty e) VISIT(st, expr, e->v.Starred.value); break; case Name_kind: + /* Special-case: named expr */ + if (e->v.Name.ctx == NamedStore && 
st->st_cur->ste_comprehension) { + if(!symtable_extend_namedexpr_scope(st, e)) + VISIT_QUIT(st, 0); + } if (!symtable_add_def(st, e->v.Name.id, e->v.Name.ctx == Load ? USE : DEF_LOCAL)) VISIT_QUIT(st, 0); @@ -1713,6 +1785,8 @@ symtable_handle_comprehension(struct symtable *st, expr_ty e, if (outermost->is_async) { st->st_cur->ste_coroutine = 1; } + st->st_cur->ste_comprehension = 1; + /* Outermost iter is received as an argument */ if (!symtable_implicit_arg(st, 0)) { symtable_exit_block(st, (void *)e); From webhook-mailer at python.org Thu Jan 24 19:50:10 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Fri, 25 Jan 2019 00:50:10 -0000 Subject: [Python-checkins] bpo-35224: Add support for NamedExpr to unparse.py (GH-11670) Message-ID: https://github.com/python/cpython/commit/1396d8fab4d0ae830d45f4937322bbb43ce0c30e commit: 1396d8fab4d0ae830d45f4937322bbb43ce0c30e branch: master author: Victor Stinner committer: GitHub date: 2019-01-25T01:49:53+01:00 summary: bpo-35224: Add support for NamedExpr to unparse.py (GH-11670) files: M Tools/parser/unparse.py diff --git a/Tools/parser/unparse.py b/Tools/parser/unparse.py index 82c3c7768072..70b47a174053 100644 --- a/Tools/parser/unparse.py +++ b/Tools/parser/unparse.py @@ -79,6 +79,13 @@ def _Expr(self, tree): self.fill() self.dispatch(tree.value) + def _NamedExpr(self, tree): + self.write("(") + self.dispatch(tree.target) + self.write(" := ") + self.dispatch(tree.value) + self.write(")") + def _Import(self, t): self.fill("import ") interleave(lambda: self.write(", "), self.dispatch, t.names) From webhook-mailer at python.org Thu Jan 24 20:39:25 2019 From: webhook-mailer at python.org (Ivan Levkivskyi) Date: Fri, 25 Jan 2019 01:39:25 -0000 Subject: [Python-checkins] bpo-35814: Allow same r.h.s. in annotated assignments as in normal ones (GH-11667) Message-ID: https://github.com/python/cpython/commit/62c35a8a8ff5854ed470b1c16a7a14f3bb80368c commit: 62c35a8a8ff5854ed470b1c16a7a14f3bb80368c branch: master author: Ivan Levkivskyi committer: GitHub date: 2019-01-25T01:39:19Z summary: bpo-35814: Allow same r.h.s. 
in annotated assignments as in normal ones (GH-11667) files: A Misc/NEWS.d/next/Core and Builtins/2019-01-24-13-25-21.bpo-35814.r_MjA6.rst M Grammar/Grammar M Lib/test/test_grammar.py M Lib/test/test_parser.py M Python/ast.c M Python/graminit.c diff --git a/Grammar/Grammar b/Grammar/Grammar index f21fa1136432..8455c1259259 100644 --- a/Grammar/Grammar +++ b/Grammar/Grammar @@ -40,7 +40,7 @@ small_stmt: (expr_stmt | del_stmt | pass_stmt | flow_stmt | import_stmt | global_stmt | nonlocal_stmt | assert_stmt) expr_stmt: testlist_star_expr (annassign | augassign (yield_expr|testlist) | ('=' (yield_expr|testlist_star_expr))*) -annassign: ':' test ['=' test] +annassign: ':' test ['=' (yield_expr|testlist)] testlist_star_expr: (test|star_expr) (',' (test|star_expr))* [','] augassign: ('+=' | '-=' | '*=' | '@=' | '/=' | '%=' | '&=' | '|=' | '^=' | '<<=' | '>>=' | '**=' | '//=') diff --git a/Lib/test/test_grammar.py b/Lib/test/test_grammar.py index 3ed19ff1cb04..74590eb86f08 100644 --- a/Lib/test/test_grammar.py +++ b/Lib/test/test_grammar.py @@ -445,6 +445,15 @@ def __getitem__(self, item): exec('X: str', {}, CNS2()) self.assertEqual(nonloc_ns['__annotations__']['x'], str) + def test_var_annot_rhs(self): + ns = {} + exec('x: tuple = 1, 2', ns) + self.assertEqual(ns['x'], (1, 2)) + stmt = ('def f():\n' + ' x: int = yield') + exec(stmt, ns) + self.assertEqual(list(ns['f']()), [None]) + def test_funcdef(self): ### [decorators] 'def' NAME parameters ['->' test] ':' suite ### decorator: '@' dotted_name [ '(' [arglist] ')' ] NEWLINE diff --git a/Lib/test/test_parser.py b/Lib/test/test_parser.py index ac3899baedb6..19f178203642 100644 --- a/Lib/test/test_parser.py +++ b/Lib/test/test_parser.py @@ -166,7 +166,7 @@ def test_var_annot(self): with self.assertRaises(SyntaxError): exec("x, *y, z: int = range(5)", {}, {}) with self.assertRaises(SyntaxError): - exec("t: tuple = 1, 2", {}, {}) + exec("x: int = 1, y = 2", {}, {}) with self.assertRaises(SyntaxError): exec("u = v: int", {}, {}) with self.assertRaises(SyntaxError): diff --git a/Misc/NEWS.d/next/Core and Builtins/2019-01-24-13-25-21.bpo-35814.r_MjA6.rst b/Misc/NEWS.d/next/Core and Builtins/2019-01-24-13-25-21.bpo-35814.r_MjA6.rst new file mode 100644 index 000000000000..5d216b273e95 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2019-01-24-13-25-21.bpo-35814.r_MjA6.rst @@ -0,0 +1,2 @@ +Allow same right hand side expressions in annotated assignments as in normal ones. +In particular, ``x: Tuple[int, int] = 1, 2`` (without parentheses on the right) is now allowed. 
\ No newline at end of file diff --git a/Python/ast.c b/Python/ast.c index 6560026109c8..e10e63f539c3 100644 --- a/Python/ast.c +++ b/Python/ast.c @@ -3163,7 +3163,12 @@ ast_for_expr_stmt(struct compiling *c, const node *n) } else { ch = CHILD(ann, 3); - expr3 = ast_for_expr(c, ch); + if (TYPE(ch) == testlist) { + expr3 = ast_for_testlist(c, ch); + } + else { + expr3 = ast_for_expr(c, ch); + } if (!expr3) { return NULL; } diff --git a/Python/graminit.c b/Python/graminit.c index 91092f1e0b9e..225d32793906 100644 --- a/Python/graminit.c +++ b/Python/graminit.c @@ -498,8 +498,9 @@ static arc arcs_17_2[2] = { {31, 3}, {0, 2}, }; -static arc arcs_17_3[1] = { - {26, 4}, +static arc arcs_17_3[2] = { + {50, 4}, + {9, 4}, }; static arc arcs_17_4[1] = { {0, 4}, @@ -508,7 +509,7 @@ static state states_17[5] = { {1, arcs_17_0}, {1, arcs_17_1}, {2, arcs_17_2}, - {1, arcs_17_3}, + {2, arcs_17_3}, {1, arcs_17_4}, }; static arc arcs_18_0[2] = { From solipsis at pitrou.net Fri Jan 25 04:08:25 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 25 Jan 2019 09:08:25 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=-1 Message-ID: <20190125090825.1.57F1409C97ABF447@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [-7, 1, 0] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [-1, 2, 0] memory blocks, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogP0d3tE', '--timeout', '7200'] From webhook-mailer at python.org Fri Jan 25 07:01:46 2019 From: webhook-mailer at python.org (Antoine Pitrou) Date: Fri, 25 Jan 2019 12:01:46 -0000 Subject: [Python-checkins] bpo-34134: Advise to use imap or imap_unordered when handling long iterables. (gh-8324) Message-ID: https://github.com/python/cpython/commit/3bab40db96efda2e127ef84e6501fda0cdc4f5b8 commit: 3bab40db96efda2e127ef84e6501fda0cdc4f5b8 branch: master author: Windson yang committer: Antoine Pitrou date: 2019-01-25T13:01:41+01:00 summary: bpo-34134: Advise to use imap or imap_unordered when handling long iterables. (gh-8324) files: M Doc/library/multiprocessing.rst diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index a77815918d2b..987a0d508768 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -2157,6 +2157,10 @@ with the :class:`Pool` class. the process pool as separate tasks. The (approximate) size of these chunks can be specified by setting *chunksize* to a positive integer. + Note that it may cause high memory usage for very long iterables. Consider + using :meth:`imap` or :meth:`imap_unordered` with explicit *chunksize* + option for better efficiency. + .. method:: map_async(func, iterable[, chunksize[, callback[, error_callback]]]) A variant of the :meth:`.map` method which returns a result object. @@ -2175,7 +2179,7 @@ with the :class:`Pool` class. .. method:: imap(func, iterable[, chunksize]) - A lazier version of :meth:`map`. + A lazier version of :meth:`.map`. The *chunksize* argument is the same as the one used by the :meth:`.map` method. For very long iterables using a large value for *chunksize* can From webhook-mailer at python.org Fri Jan 25 07:08:18 2019 From: webhook-mailer at python.org (Antoine Pitrou) Date: Fri, 25 Jan 2019 12:08:18 -0000 Subject: [Python-checkins] bpo-34134: Advise to use imap or imap_unordered when handling long iterables. 
(gh-8324) (gh-11673) Message-ID: https://github.com/python/cpython/commit/c2674bf11036af1e06c1be739f0eebcc72dfbf7a commit: c2674bf11036af1e06c1be739f0eebcc72dfbf7a branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Antoine Pitrou date: 2019-01-25T13:08:14+01:00 summary: bpo-34134: Advise to use imap or imap_unordered when handling long iterables. (gh-8324) (gh-11673) (cherry picked from commit 3bab40db96efda2e127ef84e6501fda0cdc4f5b8) Co-authored-by: Windson yang files: M Doc/library/multiprocessing.rst diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index c50625dda320..6ed8f211f361 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -2148,6 +2148,10 @@ with the :class:`Pool` class. the process pool as separate tasks. The (approximate) size of these chunks can be specified by setting *chunksize* to a positive integer. + Note that it may cause high memory usage for very long iterables. Consider + using :meth:`imap` or :meth:`imap_unordered` with explicit *chunksize* + option for better efficiency. + .. method:: map_async(func, iterable[, chunksize[, callback[, error_callback]]]) A variant of the :meth:`.map` method which returns a result object. @@ -2166,7 +2170,7 @@ with the :class:`Pool` class. .. method:: imap(func, iterable[, chunksize]) - A lazier version of :meth:`map`. + A lazier version of :meth:`.map`. The *chunksize* argument is the same as the one used by the :meth:`.map` method. For very long iterables using a large value for *chunksize* can From webhook-mailer at python.org Fri Jan 25 17:59:18 2019 From: webhook-mailer at python.org (Steve Dower) Date: Fri, 25 Jan 2019 22:59:18 -0000 Subject: [Python-checkins] bpo-35797: Fix default executable used by the multiprocessing module (GH-11676) Message-ID: https://github.com/python/cpython/commit/4e02f8f8b4baab63f927cfd87b401200ba2969e9 commit: 4e02f8f8b4baab63f927cfd87b401200ba2969e9 branch: master author: Steve Dower committer: GitHub date: 2019-01-25T14:59:12-08:00 summary: bpo-35797: Fix default executable used by the multiprocessing module (GH-11676) files: A Misc/NEWS.d/next/Windows/2019-01-25-12-29-14.bpo-35797.MzyOK9.rst M Lib/multiprocessing/spawn.py M Lib/test/test_venv.py diff --git a/Lib/multiprocessing/spawn.py b/Lib/multiprocessing/spawn.py index 73aa69471f29..860fa4ceb5ce 100644 --- a/Lib/multiprocessing/spawn.py +++ b/Lib/multiprocessing/spawn.py @@ -29,12 +29,19 @@ if sys.platform != 'win32': WINEXE = False WINSERVICE = False + _WINENV = False else: - WINEXE = (sys.platform == 'win32' and getattr(sys, 'frozen', False)) + WINEXE = getattr(sys, 'frozen', False) WINSERVICE = sys.executable.lower().endswith("pythonservice.exe") + _WINENV = '__PYVENV_LAUNCHER__' in os.environ if WINSERVICE: _python_exe = os.path.join(sys.exec_prefix, 'python.exe') +elif _WINENV: + # bpo-35797: When running in a venv, we need to bypass the redirect + # executor and launch our base Python. + import _winapi + _python_exe = _winapi.GetModuleFileName(0) else: _python_exe = sys.executable diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py index 22a3b78852f8..34c2234493bc 100644 --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -306,6 +306,19 @@ def test_unicode_in_batch_file(self): ) self.assertEqual(out.strip(), '0') + def test_multiprocessing(self): + """ + Test that the multiprocessing is able to spawn. 
+ """ + rmtree(self.env_dir) + self.run_with_capture(venv.create, self.env_dir) + envpy = os.path.join(os.path.realpath(self.env_dir), + self.bindir, self.exe) + out, err = check_output([envpy, '-c', + 'from multiprocessing import Pool; ' + + 'print(Pool(1).apply_async("Python".lower).get(3))']) + self.assertEqual(out.strip(), "python".encode()) + @skipInVenv class EnsurePipTest(BaseTest): """Test venv module installation of pip.""" diff --git a/Misc/NEWS.d/next/Windows/2019-01-25-12-29-14.bpo-35797.MzyOK9.rst b/Misc/NEWS.d/next/Windows/2019-01-25-12-29-14.bpo-35797.MzyOK9.rst new file mode 100644 index 000000000000..a0745f500b13 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2019-01-25-12-29-14.bpo-35797.MzyOK9.rst @@ -0,0 +1 @@ +Fix default executable used by the multiprocessing module From webhook-mailer at python.org Fri Jan 25 18:00:01 2019 From: webhook-mailer at python.org (Steve Dower) Date: Fri, 25 Jan 2019 23:00:01 -0000 Subject: [Python-checkins] bpo-35811: Avoid propagating venv settings when launching via py.exe (GH-11677) Message-ID: https://github.com/python/cpython/commit/adad9e68013aac166c84ffe4e23f3a5464f41840 commit: adad9e68013aac166c84ffe4e23f3a5464f41840 branch: master author: Steve Dower committer: GitHub date: 2019-01-25T14:59:58-08:00 summary: bpo-35811: Avoid propagating venv settings when launching via py.exe (GH-11677) files: A Misc/NEWS.d/next/Windows/2019-01-25-12-46-36.bpo-35811.2hU-mm.rst M PC/launcher.c diff --git a/Misc/NEWS.d/next/Windows/2019-01-25-12-46-36.bpo-35811.2hU-mm.rst b/Misc/NEWS.d/next/Windows/2019-01-25-12-46-36.bpo-35811.2hU-mm.rst new file mode 100644 index 000000000000..3207c955bff4 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2019-01-25-12-46-36.bpo-35811.2hU-mm.rst @@ -0,0 +1 @@ +Avoid propagating venv settings when launching via py.exe diff --git a/PC/launcher.c b/PC/launcher.c index 4c620dab7c09..a78620a8924f 100644 --- a/PC/launcher.c +++ b/PC/launcher.c @@ -1707,6 +1707,17 @@ process(int argc, wchar_t ** argv) command = skip_me(GetCommandLineW()); debug(L"Called with command line: %ls\n", command); +#if !defined(VENV_REDIRECT) + /* bpo-35811: The __PYVENV_LAUNCHER__ variable is used to + * override sys.executable and locate the original prefix path. + * However, if it is silently inherited by a non-venv Python + * process, that process will believe it is running in the venv + * still. This is the only place where *we* can clear it (that is, + * when py.exe is being used to launch Python), so we do. + */ + SetEnvironmentVariableW(L"__PYVENV_LAUNCHER__", NULL); +#endif + #if defined(SCRIPT_WRAPPER) /* The launcher is being used in "script wrapper" mode. 
* There should therefore be a Python script named -script.py in From webhook-mailer at python.org Fri Jan 25 18:14:46 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 25 Jan 2019 23:14:46 -0000 Subject: [Python-checkins] bpo-35797: Fix default executable used by the multiprocessing module (GH-11676) Message-ID: https://github.com/python/cpython/commit/6a9c0fca3f2f93681468b51929472f4433753f25 commit: 6a9c0fca3f2f93681468b51929472f4433753f25 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-25T15:14:41-08:00 summary: bpo-35797: Fix default executable used by the multiprocessing module (GH-11676) (cherry picked from commit 4e02f8f8b4baab63f927cfd87b401200ba2969e9) Co-authored-by: Steve Dower files: A Misc/NEWS.d/next/Windows/2019-01-25-12-29-14.bpo-35797.MzyOK9.rst M Lib/multiprocessing/spawn.py M Lib/test/test_venv.py diff --git a/Lib/multiprocessing/spawn.py b/Lib/multiprocessing/spawn.py index 1f4f3f496f51..693f2fb9d475 100644 --- a/Lib/multiprocessing/spawn.py +++ b/Lib/multiprocessing/spawn.py @@ -29,12 +29,19 @@ if sys.platform != 'win32': WINEXE = False WINSERVICE = False + _WINENV = False else: - WINEXE = (sys.platform == 'win32' and getattr(sys, 'frozen', False)) + WINEXE = getattr(sys, 'frozen', False) WINSERVICE = sys.executable.lower().endswith("pythonservice.exe") + _WINENV = '__PYVENV_LAUNCHER__' in os.environ if WINSERVICE: _python_exe = os.path.join(sys.exec_prefix, 'python.exe') +elif _WINENV: + # bpo-35797: When running in a venv, we need to bypass the redirect + # executor and launch our base Python. + import _winapi + _python_exe = _winapi.GetModuleFileName(0) else: _python_exe = sys.executable diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py index 22a3b78852f8..34c2234493bc 100644 --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -306,6 +306,19 @@ def test_unicode_in_batch_file(self): ) self.assertEqual(out.strip(), '0') + def test_multiprocessing(self): + """ + Test that the multiprocessing is able to spawn. 
+ """ + rmtree(self.env_dir) + self.run_with_capture(venv.create, self.env_dir) + envpy = os.path.join(os.path.realpath(self.env_dir), + self.bindir, self.exe) + out, err = check_output([envpy, '-c', + 'from multiprocessing import Pool; ' + + 'print(Pool(1).apply_async("Python".lower).get(3))']) + self.assertEqual(out.strip(), "python".encode()) + @skipInVenv class EnsurePipTest(BaseTest): """Test venv module installation of pip.""" diff --git a/Misc/NEWS.d/next/Windows/2019-01-25-12-29-14.bpo-35797.MzyOK9.rst b/Misc/NEWS.d/next/Windows/2019-01-25-12-29-14.bpo-35797.MzyOK9.rst new file mode 100644 index 000000000000..a0745f500b13 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2019-01-25-12-29-14.bpo-35797.MzyOK9.rst @@ -0,0 +1 @@ +Fix default executable used by the multiprocessing module From webhook-mailer at python.org Fri Jan 25 18:31:24 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Fri, 25 Jan 2019 23:31:24 -0000 Subject: [Python-checkins] bpo-35811: Avoid propagating venv settings when launching via py.exe (GH-11677) Message-ID: https://github.com/python/cpython/commit/a6a8524bb1c78c7425346ec20ecffc02d1d02a79 commit: a6a8524bb1c78c7425346ec20ecffc02d1d02a79 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-25T15:31:18-08:00 summary: bpo-35811: Avoid propagating venv settings when launching via py.exe (GH-11677) (cherry picked from commit adad9e68013aac166c84ffe4e23f3a5464f41840) Co-authored-by: Steve Dower files: A Misc/NEWS.d/next/Windows/2019-01-25-12-46-36.bpo-35811.2hU-mm.rst M PC/launcher.c diff --git a/Misc/NEWS.d/next/Windows/2019-01-25-12-46-36.bpo-35811.2hU-mm.rst b/Misc/NEWS.d/next/Windows/2019-01-25-12-46-36.bpo-35811.2hU-mm.rst new file mode 100644 index 000000000000..3207c955bff4 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2019-01-25-12-46-36.bpo-35811.2hU-mm.rst @@ -0,0 +1 @@ +Avoid propagating venv settings when launching via py.exe diff --git a/PC/launcher.c b/PC/launcher.c index 4c620dab7c09..a78620a8924f 100644 --- a/PC/launcher.c +++ b/PC/launcher.c @@ -1707,6 +1707,17 @@ process(int argc, wchar_t ** argv) command = skip_me(GetCommandLineW()); debug(L"Called with command line: %ls\n", command); +#if !defined(VENV_REDIRECT) + /* bpo-35811: The __PYVENV_LAUNCHER__ variable is used to + * override sys.executable and locate the original prefix path. + * However, if it is silently inherited by a non-venv Python + * process, that process will believe it is running in the venv + * still. This is the only place where *we* can clear it (that is, + * when py.exe is being used to launch Python), so we do. + */ + SetEnvironmentVariableW(L"__PYVENV_LAUNCHER__", NULL); +#endif + #if defined(SCRIPT_WRAPPER) /* The launcher is being used in "script wrapper" mode. 
* There should therefore be a Python script named -script.py in From webhook-mailer at python.org Sat Jan 26 03:02:06 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Sat, 26 Jan 2019 08:02:06 -0000 Subject: [Python-checkins] bpo-35780: Fix errors in lru_cache() C code (GH-11623) Message-ID: https://github.com/python/cpython/commit/d8080c01195cc9a19af752bfa04d98824dd9fb15 commit: d8080c01195cc9a19af752bfa04d98824dd9fb15 branch: master author: Raymond Hettinger committer: GitHub date: 2019-01-26T03:02:00-05:00 summary: bpo-35780: Fix errors in lru_cache() C code (GH-11623) files: A Misc/NEWS.d/next/Library/2019-01-19-17-01-43.bpo-35780.CLf7fT.rst M Lib/functools.py M Lib/test/test_functools.py M Modules/_functoolsmodule.c diff --git a/Lib/functools.py b/Lib/functools.py index ab7d71e126bd..6233c30c203e 100644 --- a/Lib/functools.py +++ b/Lib/functools.py @@ -454,7 +454,7 @@ def __hash__(self): def _make_key(args, kwds, typed, kwd_mark = (object(),), - fasttypes = {int, str, frozenset, type(None)}, + fasttypes = {int, str}, tuple=tuple, type=type, len=len): """Make a cache key from optionally typed positional and keyword arguments @@ -510,8 +510,11 @@ def lru_cache(maxsize=128, typed=False): # Early detection of an erroneous call to @lru_cache without any arguments # resulting in the inner function being passed to maxsize instead of an - # integer or None. - if maxsize is not None and not isinstance(maxsize, int): + # integer or None. Negative maxsize is treated as 0. + if isinstance(maxsize, int): + if maxsize < 0: + maxsize = 0 + elif maxsize is not None: raise TypeError('Expected maxsize to be an integer or None') def decorating_function(user_function): @@ -578,6 +581,7 @@ def wrapper(*args, **kwds): link[NEXT] = root hits += 1 return result + misses += 1 result = user_function(*args, **kwds) with lock: if key in cache: @@ -615,7 +619,6 @@ def wrapper(*args, **kwds): # Use the cache_len bound method instead of the len() function # which could potentially be wrapped in an lru_cache itself. full = (cache_len() >= maxsize) - misses += 1 return result def cache_info(): diff --git a/Lib/test/test_functools.py b/Lib/test/test_functools.py index ffbd0fcf2d80..63a9ade54806 100644 --- a/Lib/test/test_functools.py +++ b/Lib/test/test_functools.py @@ -1233,6 +1233,33 @@ def f(x): self.assertEqual(misses, 4) self.assertEqual(currsize, 2) + def test_lru_bug_35780(self): + # C version of the lru_cache was not checking to see if + # the user function call has already modified the cache + # (this arises in recursive calls and in multi-threading). + # This cause the cache to have orphan links not referenced + # by the cache dictionary. + + once = True # Modified by f(x) below + + @self.module.lru_cache(maxsize=10) + def f(x): + nonlocal once + rv = f'.{x}.' 
+ if x == 20 and once: + once = False + rv = f(x) + return rv + + # Fill the cache + for x in range(15): + self.assertEqual(f(x), f'.{x}.') + self.assertEqual(f.cache_info().currsize, 10) + + # Make a recursive call and make sure the cache remains full + self.assertEqual(f(20), '.20.') + self.assertEqual(f.cache_info().currsize, 10) + def test_lru_hash_only_once(self): # To protect against weird reentrancy bugs and to improve # efficiency when faced with slow __hash__ methods, the @@ -1329,7 +1356,7 @@ def eq(n): for i in (0, 1): self.assertEqual([eq(n) for n in range(150)], list(range(150))) self.assertEqual(eq.cache_info(), - self.module._CacheInfo(hits=0, misses=300, maxsize=-10, currsize=1)) + self.module._CacheInfo(hits=0, misses=300, maxsize=0, currsize=0)) def test_lru_with_exceptions(self): # Verify that user_function exceptions get passed through without diff --git a/Misc/NEWS.d/next/Library/2019-01-19-17-01-43.bpo-35780.CLf7fT.rst b/Misc/NEWS.d/next/Library/2019-01-19-17-01-43.bpo-35780.CLf7fT.rst new file mode 100644 index 000000000000..d44488272170 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-19-17-01-43.bpo-35780.CLf7fT.rst @@ -0,0 +1,11 @@ +Fix lru_cache() errors arising in recursive, reentrant, or +multi-threaded code. These errors could result in orphan links and in +the cache being trapped in a state with fewer than the specified maximum +number of links. Fix handling of negative maxsize which should have +been treated as zero. Fix errors in toggling the "full" status flag. +Fix misordering of links when errors are encountered. Sync-up the C +code and pure Python code for the space saving path in functions with a +single positional argument. In this common case, the space overhead of +an lru cache entry is reduced by almost half. Fix counting of cache +misses. In error cases, the miss count was out of sync with the actual +number of times the underlying user function was called. 
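A minimal sketch of the user-visible side of the maxsize handling fix, assuming the patched functools above; before this change the same call reported maxsize=-10 and currsize=1:

    from functools import lru_cache

    @lru_cache(maxsize=-10)          # negative maxsize is now treated as 0
    def square(n):
        return n * n

    for n in range(5):
        square(n)

    print(square.cache_info())
    # CacheInfo(hits=0, misses=5, maxsize=0, currsize=0)
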
diff --git a/Modules/_functoolsmodule.c b/Modules/_functoolsmodule.c index 0fb4847af9c3..141210204ca5 100644 --- a/Modules/_functoolsmodule.c +++ b/Modules/_functoolsmodule.c @@ -711,16 +711,15 @@ typedef PyObject *(*lru_cache_ternaryfunc)(struct lru_cache_object *, PyObject * typedef struct lru_cache_object { lru_list_elem root; /* includes PyObject_HEAD */ - Py_ssize_t maxsize; - PyObject *maxsize_O; - PyObject *func; lru_cache_ternaryfunc wrapper; + int typed; PyObject *cache; + Py_ssize_t hits; + PyObject *func; + Py_ssize_t maxsize; + Py_ssize_t misses; PyObject *cache_info_type; - Py_ssize_t misses, hits; - int typed; PyObject *dict; - int full; } lru_cache_object; static PyTypeObject lru_cache_type; @@ -733,6 +732,15 @@ lru_cache_make_key(PyObject *args, PyObject *kwds, int typed) /* short path, key will match args anyway, which is a tuple */ if (!typed && !kwds) { + if (PyTuple_GET_SIZE(args) == 1) { + key = PyTuple_GET_ITEM(args, 0); + if (PyUnicode_CheckExact(key) || PyLong_CheckExact(key)) { + /* For common scalar keys, save space by + dropping the enclosing args tuple */ + Py_INCREF(key); + return key; + } + } Py_INCREF(args); return args; } @@ -835,10 +843,12 @@ infinite_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwd } static void -lru_cache_extricate_link(lru_list_elem *link) +lru_cache_extract_link(lru_list_elem *link) { - link->prev->next = link->next; - link->next->prev = link->prev; + lru_list_elem *link_prev = link->prev; + lru_list_elem *link_next = link->next; + link_prev->next = link->next; + link_next->prev = link->prev; } static void @@ -851,11 +861,52 @@ lru_cache_append_link(lru_cache_object *self, lru_list_elem *link) link->next = root; } +static void +lru_cache_prepend_link(lru_cache_object *self, lru_list_elem *link) +{ + lru_list_elem *root = &self->root; + lru_list_elem *first = root->next; + first->prev = root->next = link; + link->prev = root; + link->next = first; +} + +/* General note on reentrancy: + + There are four dictionary calls in the bounded_lru_cache_wrapper(): + 1) The initial check for a cache match. 2) The post user-function + check for a cache match. 3) The deletion of the oldest entry. + 4) The addition of the newest entry. + + In all four calls, we have a known hash which lets use avoid a call + to __hash__(). That leaves only __eq__ as a possible source of a + reentrant call. + + The __eq__ method call is always made for a cache hit (dict access #1). + Accordingly, we have make sure not modify the cache state prior to + this call. + + The __eq__ method call is never made for the deletion (dict access #3) + because it is an identity match. + + For the other two accesses (#2 and #4), calls to __eq__ only occur + when some other entry happens to have an exactly matching hash (all + 64-bits). Though rare, this can happen, so we have to make sure to + either call it at the top of its code path before any cache + state modifications (dict access #2) or be prepared to restore + invariants at the end of the code path (dict access #4). + + Another possible source of reentrancy is a decref which can trigger + arbitrary code execution. To make the code easier to reason about, + the decrefs are deferred to the end of the each possible code path + so that we know the cache is a consistent state. 
+ */ + static PyObject * bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds) { lru_list_elem *link; - PyObject *key, *result; + PyObject *key, *result, *testresult; Py_hash_t hash; key = lru_cache_make_key(args, kwds, self->typed); @@ -867,11 +918,11 @@ bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds return NULL; } link = (lru_list_elem *)_PyDict_GetItem_KnownHash(self->cache, key, hash); - if (link) { - lru_cache_extricate_link(link); + if (link != NULL) { + lru_cache_extract_link(link); lru_cache_append_link(self, link); - self->hits++; result = link->result; + self->hits++; Py_INCREF(result); Py_DECREF(key); return result; @@ -880,65 +931,38 @@ bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds Py_DECREF(key); return NULL; } + self->misses++; result = PyObject_Call(self->func, args, kwds); if (!result) { Py_DECREF(key); return NULL; } - if (self->full && self->root.next != &self->root) { - /* Use the oldest item to store the new key and result. */ - PyObject *oldkey, *oldresult, *popresult; - /* Extricate the oldest item. */ - link = self->root.next; - lru_cache_extricate_link(link); - /* Remove it from the cache. - The cache dict holds one reference to the link, - and the linked list holds yet one reference to it. */ - popresult = _PyDict_Pop_KnownHash(self->cache, - link->key, link->hash, - Py_None); - if (popresult == Py_None) { - /* Getting here means that this same key was added to the - cache while the lock was released. Since the link - update is already done, we need only return the - computed result and update the count of misses. */ - Py_DECREF(popresult); - Py_DECREF(link); - Py_DECREF(key); - } - else if (popresult == NULL) { - lru_cache_append_link(self, link); - Py_DECREF(key); - Py_DECREF(result); - return NULL; - } - else { - Py_DECREF(popresult); - /* Keep a reference to the old key and old result to - prevent their ref counts from going to zero during the - update. That will prevent potentially arbitrary object - clean-up code (i.e. __del__) from running while we're - still adjusting the links. */ - oldkey = link->key; - oldresult = link->result; - - link->hash = hash; - link->key = key; - link->result = result; - if (_PyDict_SetItem_KnownHash(self->cache, key, (PyObject *)link, - hash) < 0) { - Py_DECREF(link); - Py_DECREF(oldkey); - Py_DECREF(oldresult); - return NULL; - } - lru_cache_append_link(self, link); - Py_INCREF(result); /* for return */ - Py_DECREF(oldkey); - Py_DECREF(oldresult); - } - } else { - /* Put result in a new link at the front of the queue. */ + testresult = _PyDict_GetItem_KnownHash(self->cache, key, hash); + if (testresult != NULL) { + /* Getting here means that this same key was added to the cache + during the PyObject_Call(). Since the link update is already + done, we need only return the computed result. */ + Py_DECREF(key); + return result; + } + if (PyErr_Occurred()) { + /* This is an unusual case since this same lookup + did not previously trigger an error during lookup. + Treat it the same as an error in user function + and return with the error set. */ + Py_DECREF(key); + Py_DECREF(result); + return NULL; + } + /* This is the normal case. The new key wasn't found before + user function call and it is still not there. So we + proceed normally and update the cache with the new result. 
*/ + + assert(self->maxsize > 0); + if (PyDict_GET_SIZE(self->cache) < self->maxsize || + self->root.next == &self->root) + { + /* Cache is not full, so put the result in a new link */ link = (lru_list_elem *)PyObject_New(lru_list_elem, &lru_list_elem_type); if (link == NULL) { @@ -950,6 +974,11 @@ bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds link->hash = hash; link->key = key; link->result = result; + /* What is really needed here is a SetItem variant with a "no clobber" + option. If the __eq__ call triggers a reentrant call that adds + this same key, then this setitem call will update the cache dict + with this new link, leaving the old link as an orphan (i.e. not + having a cache dict entry that refers to it). */ if (_PyDict_SetItem_KnownHash(self->cache, key, (PyObject *)link, hash) < 0) { Py_DECREF(link); @@ -957,9 +986,83 @@ bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds } lru_cache_append_link(self, link); Py_INCREF(result); /* for return */ - self->full = (PyDict_GET_SIZE(self->cache) >= self->maxsize); + return result; } - self->misses++; + /* Since the cache is full, we need to evict an old key and add + a new key. Rather than free the old link and allocate a new + one, we reuse the link for the new key and result and move it + to front of the cache to mark it as recently used. + + We try to assure all code paths (including errors) leave all + of the links in place. Either the link is successfully + updated and moved or it is restored to its old position. + However if an unrecoverable error is found, it doesn't + make sense to reinsert the link, so we leave it out + and the cache will no longer register as full. + */ + PyObject *oldkey, *oldresult, *popresult; + + /* Extract the oldest item. */ + assert(self->root.next != &self->root); + link = self->root.next; + lru_cache_extract_link(link); + /* Remove it from the cache. + The cache dict holds one reference to the link, + and the linked list holds yet one reference to it. */ + popresult = _PyDict_Pop_KnownHash(self->cache, link->key, + link->hash, Py_None); + if (popresult == Py_None) { + /* Getting here means that the user function call or another + thread has already removed the old key from the dictionary. + This link is now an orpan. Since we don't want to leave the + cache in an inconsistent state, we don't restore the link. */ + Py_DECREF(popresult); + Py_DECREF(link); + Py_DECREF(key); + return result; + } + if (popresult == NULL) { + /* An error arose while trying to remove the oldest key (the one + being evicted) from the cache. We restore the link to its + original position as the oldest link. Then we allow the + error propagate upward; treating it the same as an error + arising in the user function. */ + lru_cache_prepend_link(self, link); + Py_DECREF(key); + Py_DECREF(result); + return NULL; + } + /* Keep a reference to the old key and old result to prevent their + ref counts from going to zero during the update. That will + prevent potentially arbitrary object clean-up code (i.e. __del__) + from running while we're still adjusting the links. */ + oldkey = link->key; + oldresult = link->result; + + link->hash = hash; + link->key = key; + link->result = result; + /* Note: The link is being added to the cache dict without the + prev and next fields set to valid values. We have to wait + for successful insertion in the cache dict before adding the + link to the linked list. 
Otherwise, the potentially reentrant + __eq__ call could cause the then ophan link to be visited. */ + if (_PyDict_SetItem_KnownHash(self->cache, key, (PyObject *)link, + hash) < 0) { + /* Somehow the cache dict update failed. We no longer can + restore the old link. Let the error propagate upward and + leave the cache short one link. */ + Py_DECREF(popresult); + Py_DECREF(link); + Py_DECREF(oldkey); + Py_DECREF(oldresult); + return NULL; + } + lru_cache_append_link(self, link); + Py_INCREF(result); /* for return */ + Py_DECREF(popresult); + Py_DECREF(oldkey); + Py_DECREF(oldresult); return result; } @@ -995,6 +1098,9 @@ lru_cache_new(PyTypeObject *type, PyObject *args, PyObject *kw) maxsize = PyNumber_AsSsize_t(maxsize_O, PyExc_OverflowError); if (maxsize == -1 && PyErr_Occurred()) return NULL; + if (maxsize < 0) { + maxsize = 0; + } if (maxsize == 0) wrapper = uncached_lru_cache_wrapper; else @@ -1013,20 +1119,17 @@ lru_cache_new(PyTypeObject *type, PyObject *args, PyObject *kw) return NULL; } - obj->cache = cachedict; obj->root.prev = &obj->root; obj->root.next = &obj->root; - obj->maxsize = maxsize; - Py_INCREF(maxsize_O); - obj->maxsize_O = maxsize_O; + obj->wrapper = wrapper; + obj->typed = typed; + obj->cache = cachedict; Py_INCREF(func); obj->func = func; - obj->wrapper = wrapper; obj->misses = obj->hits = 0; - obj->typed = typed; + obj->maxsize = maxsize; Py_INCREF(cache_info_type); obj->cache_info_type = cache_info_type; - return (PyObject *)obj; } @@ -1060,11 +1163,10 @@ lru_cache_dealloc(lru_cache_object *obj) PyObject_GC_UnTrack(obj); list = lru_cache_unlink_list(obj); - Py_XDECREF(obj->maxsize_O); - Py_XDECREF(obj->func); Py_XDECREF(obj->cache); - Py_XDECREF(obj->dict); + Py_XDECREF(obj->func); Py_XDECREF(obj->cache_info_type); + Py_XDECREF(obj->dict); lru_cache_clear_list(list); Py_TYPE(obj)->tp_free(obj); } @@ -1088,8 +1190,13 @@ lru_cache_descr_get(PyObject *self, PyObject *obj, PyObject *type) static PyObject * lru_cache_cache_info(lru_cache_object *self, PyObject *unused) { - return PyObject_CallFunction(self->cache_info_type, "nnOn", - self->hits, self->misses, self->maxsize_O, + if (self->maxsize == -1) { + return PyObject_CallFunction(self->cache_info_type, "nnOn", + self->hits, self->misses, Py_None, + PyDict_GET_SIZE(self->cache)); + } + return PyObject_CallFunction(self->cache_info_type, "nnnn", + self->hits, self->misses, self->maxsize, PyDict_GET_SIZE(self->cache)); } @@ -1098,7 +1205,6 @@ lru_cache_cache_clear(lru_cache_object *self, PyObject *unused) { lru_list_elem *list = lru_cache_unlink_list(self); self->hits = self->misses = 0; - self->full = 0; PyDict_Clear(self->cache); lru_cache_clear_list(list); Py_RETURN_NONE; @@ -1134,7 +1240,6 @@ lru_cache_tp_traverse(lru_cache_object *self, visitproc visit, void *arg) Py_VISIT(link->result); link = next; } - Py_VISIT(self->maxsize_O); Py_VISIT(self->func); Py_VISIT(self->cache); Py_VISIT(self->cache_info_type); @@ -1146,7 +1251,6 @@ static int lru_cache_tp_clear(lru_cache_object *self) { lru_list_elem *list = lru_cache_unlink_list(self); - Py_CLEAR(self->maxsize_O); Py_CLEAR(self->func); Py_CLEAR(self->cache); Py_CLEAR(self->cache_info_type); From webhook-mailer at python.org Sat Jan 26 03:23:45 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Sat, 26 Jan 2019 08:23:45 -0000 Subject: [Python-checkins] bpo-35780: Fix errors in lru_cache() C code (GH-11623) (GH-11682) Message-ID: https://github.com/python/cpython/commit/b2b023c657ba8c3f4a24d0c847d10fe8e2a73d44 commit: 
b2b023c657ba8c3f4a24d0c847d10fe8e2a73d44 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Raymond Hettinger date: 2019-01-26T03:23:40-05:00 summary: bpo-35780: Fix errors in lru_cache() C code (GH-11623) (GH-11682) files: A Misc/NEWS.d/next/Library/2019-01-19-17-01-43.bpo-35780.CLf7fT.rst M Lib/functools.py M Lib/test/test_functools.py M Modules/_functoolsmodule.c diff --git a/Lib/functools.py b/Lib/functools.py index 24b011dc0428..592f156fe426 100644 --- a/Lib/functools.py +++ b/Lib/functools.py @@ -413,7 +413,7 @@ def __hash__(self): def _make_key(args, kwds, typed, kwd_mark = (object(),), - fasttypes = {int, str, frozenset, type(None)}, + fasttypes = {int, str}, tuple=tuple, type=type, len=len): """Make a cache key from optionally typed positional and keyword arguments @@ -469,8 +469,11 @@ def lru_cache(maxsize=128, typed=False): # Early detection of an erroneous call to @lru_cache without any arguments # resulting in the inner function being passed to maxsize instead of an - # integer or None. - if maxsize is not None and not isinstance(maxsize, int): + # integer or None. Negative maxsize is treated as 0. + if isinstance(maxsize, int): + if maxsize < 0: + maxsize = 0 + elif maxsize is not None: raise TypeError('Expected maxsize to be an integer or None') def decorating_function(user_function): @@ -537,6 +540,7 @@ def wrapper(*args, **kwds): link[NEXT] = root hits += 1 return result + misses += 1 result = user_function(*args, **kwds) with lock: if key in cache: @@ -574,7 +578,6 @@ def wrapper(*args, **kwds): # Use the cache_len bound method instead of the len() function # which could potentially be wrapped in an lru_cache itself. full = (cache_len() >= maxsize) - misses += 1 return result def cache_info(): diff --git a/Lib/test/test_functools.py b/Lib/test/test_functools.py index 7a0757f7e17b..a91c6348e709 100644 --- a/Lib/test/test_functools.py +++ b/Lib/test/test_functools.py @@ -1226,6 +1226,33 @@ def f(x): self.assertEqual(misses, 4) self.assertEqual(currsize, 2) + def test_lru_bug_35780(self): + # C version of the lru_cache was not checking to see if + # the user function call has already modified the cache + # (this arises in recursive calls and in multi-threading). + # This caused the cache to have orphan links not referenced + # by the cache dictionary. + + once = True # Modified by f(x) below + + @self.module.lru_cache(maxsize=10) + def f(x): + nonlocal once + rv = f'.{x}.'
+ if x == 20 and once: + once = False + rv = f(x) + return rv + + # Fill the cache + for x in range(15): + self.assertEqual(f(x), f'.{x}.') + self.assertEqual(f.cache_info().currsize, 10) + + # Make a recursive call and make sure the cache remains full + self.assertEqual(f(20), '.20.') + self.assertEqual(f.cache_info().currsize, 10) + def test_lru_hash_only_once(self): # To protect against weird reentrancy bugs and to improve # efficiency when faced with slow __hash__ methods, the @@ -1322,7 +1349,7 @@ def eq(n): for i in (0, 1): self.assertEqual([eq(n) for n in range(150)], list(range(150))) self.assertEqual(eq.cache_info(), - self.module._CacheInfo(hits=0, misses=300, maxsize=-10, currsize=1)) + self.module._CacheInfo(hits=0, misses=300, maxsize=0, currsize=0)) def test_lru_with_exceptions(self): # Verify that user_function exceptions get passed through without diff --git a/Misc/NEWS.d/next/Library/2019-01-19-17-01-43.bpo-35780.CLf7fT.rst b/Misc/NEWS.d/next/Library/2019-01-19-17-01-43.bpo-35780.CLf7fT.rst new file mode 100644 index 000000000000..d44488272170 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-19-17-01-43.bpo-35780.CLf7fT.rst @@ -0,0 +1,11 @@ +Fix lru_cache() errors arising in recursive, reentrant, or +multi-threaded code. These errors could result in orphan links and in +the cache being trapped in a state with fewer than the specified maximum +number of links. Fix handling of negative maxsize which should have +been treated as zero. Fix errors in toggling the "full" status flag. +Fix misordering of links when errors are encountered. Sync-up the C +code and pure Python code for the space saving path in functions with a +single positional argument. In this common case, the space overhead of +an lru cache entry is reduced by almost half. Fix counting of cache +misses. In error cases, the miss count was out of sync with the actual +number of times the underlying user function was called. 
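For readers who want to see the user-visible effect of two of the fixes described in the news entry above (negative maxsize now treated as zero, and the miss counter tracking the number of actual user-function calls), here is a minimal sketch; the function names and values are illustrative only and are not part of the patch:

    from functools import lru_cache

    # Negative maxsize is now clamped to 0 (no caching), so cache_info()
    # reports maxsize=0 and currsize never grows.
    @lru_cache(maxsize=-10)
    def square(x):
        return x * x

    [square(n) for n in range(5)]
    print(square.cache_info())    # CacheInfo(hits=0, misses=5, maxsize=0, currsize=0)

    # A recursive (reentrant) call no longer strands orphan links;
    # the cache stays at its configured maximum size.
    @lru_cache(maxsize=10)
    def countdown(x):
        return countdown(x - 1) + 1 if x else 0

    countdown(15)
    print(countdown.cache_info().currsize)    # 10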
diff --git a/Modules/_functoolsmodule.c b/Modules/_functoolsmodule.c index ff4172d663b6..4aca63e38c8b 100644 --- a/Modules/_functoolsmodule.c +++ b/Modules/_functoolsmodule.c @@ -711,16 +711,15 @@ typedef PyObject *(*lru_cache_ternaryfunc)(struct lru_cache_object *, PyObject * typedef struct lru_cache_object { lru_list_elem root; /* includes PyObject_HEAD */ - Py_ssize_t maxsize; - PyObject *maxsize_O; - PyObject *func; lru_cache_ternaryfunc wrapper; + int typed; PyObject *cache; + Py_ssize_t hits; + PyObject *func; + Py_ssize_t maxsize; + Py_ssize_t misses; PyObject *cache_info_type; - Py_ssize_t misses, hits; - int typed; PyObject *dict; - int full; } lru_cache_object; static PyTypeObject lru_cache_type; @@ -733,6 +732,15 @@ lru_cache_make_key(PyObject *args, PyObject *kwds, int typed) /* short path, key will match args anyway, which is a tuple */ if (!typed && !kwds) { + if (PyTuple_GET_SIZE(args) == 1) { + key = PyTuple_GET_ITEM(args, 0); + if (PyUnicode_CheckExact(key) || PyLong_CheckExact(key)) { + /* For common scalar keys, save space by + dropping the enclosing args tuple */ + Py_INCREF(key); + return key; + } + } Py_INCREF(args); return args; } @@ -835,10 +843,12 @@ infinite_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwd } static void -lru_cache_extricate_link(lru_list_elem *link) +lru_cache_extract_link(lru_list_elem *link) { - link->prev->next = link->next; - link->next->prev = link->prev; + lru_list_elem *link_prev = link->prev; + lru_list_elem *link_next = link->next; + link_prev->next = link->next; + link_next->prev = link->prev; } static void @@ -851,11 +861,52 @@ lru_cache_append_link(lru_cache_object *self, lru_list_elem *link) link->next = root; } +static void +lru_cache_prepend_link(lru_cache_object *self, lru_list_elem *link) +{ + lru_list_elem *root = &self->root; + lru_list_elem *first = root->next; + first->prev = root->next = link; + link->prev = root; + link->next = first; +} + +/* General note on reentrancy: + + There are four dictionary calls in the bounded_lru_cache_wrapper(): + 1) The initial check for a cache match. 2) The post user-function + check for a cache match. 3) The deletion of the oldest entry. + 4) The addition of the newest entry. + + In all four calls, we have a known hash which lets us avoid a call + to __hash__(). That leaves only __eq__ as a possible source of a + reentrant call. + + The __eq__ method call is always made for a cache hit (dict access #1). + Accordingly, we have to make sure not to modify the cache state prior to + this call. + + The __eq__ method call is never made for the deletion (dict access #3) + because it is an identity match. + + For the other two accesses (#2 and #4), calls to __eq__ only occur + when some other entry happens to have an exactly matching hash (all + 64-bits). Though rare, this can happen, so we have to make sure to + either call it at the top of its code path before any cache + state modifications (dict access #2) or be prepared to restore + invariants at the end of the code path (dict access #4). + + Another possible source of reentrancy is a decref which can trigger + arbitrary code execution. To make the code easier to reason about, + the decrefs are deferred to the end of each possible code path + so that we know the cache is in a consistent state.
+ */ + static PyObject * bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds) { lru_list_elem *link; - PyObject *key, *result; + PyObject *key, *result, *testresult; Py_hash_t hash; key = lru_cache_make_key(args, kwds, self->typed); @@ -867,11 +918,11 @@ bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds return NULL; } link = (lru_list_elem *)_PyDict_GetItem_KnownHash(self->cache, key, hash); - if (link) { - lru_cache_extricate_link(link); + if (link != NULL) { + lru_cache_extract_link(link); lru_cache_append_link(self, link); - self->hits++; result = link->result; + self->hits++; Py_INCREF(result); Py_DECREF(key); return result; @@ -880,65 +931,38 @@ bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds Py_DECREF(key); return NULL; } + self->misses++; result = PyObject_Call(self->func, args, kwds); if (!result) { Py_DECREF(key); return NULL; } - if (self->full && self->root.next != &self->root) { - /* Use the oldest item to store the new key and result. */ - PyObject *oldkey, *oldresult, *popresult; - /* Extricate the oldest item. */ - link = self->root.next; - lru_cache_extricate_link(link); - /* Remove it from the cache. - The cache dict holds one reference to the link, - and the linked list holds yet one reference to it. */ - popresult = _PyDict_Pop_KnownHash(self->cache, - link->key, link->hash, - Py_None); - if (popresult == Py_None) { - /* Getting here means that this same key was added to the - cache while the lock was released. Since the link - update is already done, we need only return the - computed result and update the count of misses. */ - Py_DECREF(popresult); - Py_DECREF(link); - Py_DECREF(key); - } - else if (popresult == NULL) { - lru_cache_append_link(self, link); - Py_DECREF(key); - Py_DECREF(result); - return NULL; - } - else { - Py_DECREF(popresult); - /* Keep a reference to the old key and old result to - prevent their ref counts from going to zero during the - update. That will prevent potentially arbitrary object - clean-up code (i.e. __del__) from running while we're - still adjusting the links. */ - oldkey = link->key; - oldresult = link->result; - - link->hash = hash; - link->key = key; - link->result = result; - if (_PyDict_SetItem_KnownHash(self->cache, key, (PyObject *)link, - hash) < 0) { - Py_DECREF(link); - Py_DECREF(oldkey); - Py_DECREF(oldresult); - return NULL; - } - lru_cache_append_link(self, link); - Py_INCREF(result); /* for return */ - Py_DECREF(oldkey); - Py_DECREF(oldresult); - } - } else { - /* Put result in a new link at the front of the queue. */ + testresult = _PyDict_GetItem_KnownHash(self->cache, key, hash); + if (testresult != NULL) { + /* Getting here means that this same key was added to the cache + during the PyObject_Call(). Since the link update is already + done, we need only return the computed result. */ + Py_DECREF(key); + return result; + } + if (PyErr_Occurred()) { + /* This is an unusual case since this same lookup + did not previously trigger an error during lookup. + Treat it the same as an error in user function + and return with the error set. */ + Py_DECREF(key); + Py_DECREF(result); + return NULL; + } + /* This is the normal case. The new key wasn't found before + user function call and it is still not there. So we + proceed normally and update the cache with the new result. 
*/ + + assert(self->maxsize > 0); + if (PyDict_GET_SIZE(self->cache) < self->maxsize || + self->root.next == &self->root) + { + /* Cache is not full, so put the result in a new link */ link = (lru_list_elem *)PyObject_New(lru_list_elem, &lru_list_elem_type); if (link == NULL) { @@ -950,6 +974,11 @@ bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds link->hash = hash; link->key = key; link->result = result; + /* What is really needed here is a SetItem variant with a "no clobber" + option. If the __eq__ call triggers a reentrant call that adds + this same key, then this setitem call will update the cache dict + with this new link, leaving the old link as an orphan (i.e. not + having a cache dict entry that refers to it). */ if (_PyDict_SetItem_KnownHash(self->cache, key, (PyObject *)link, hash) < 0) { Py_DECREF(link); @@ -957,9 +986,83 @@ bounded_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds } lru_cache_append_link(self, link); Py_INCREF(result); /* for return */ - self->full = (PyDict_GET_SIZE(self->cache) >= self->maxsize); + return result; } - self->misses++; + /* Since the cache is full, we need to evict an old key and add + a new key. Rather than free the old link and allocate a new + one, we reuse the link for the new key and result and move it + to the front of the cache to mark it as recently used. + + We try to ensure all code paths (including errors) leave all + of the links in place. Either the link is successfully + updated and moved or it is restored to its old position. + However, if an unrecoverable error is found, it doesn't + make sense to reinsert the link, so we leave it out + and the cache will no longer register as full. + */ + PyObject *oldkey, *oldresult, *popresult; + + /* Extract the oldest item. */ + assert(self->root.next != &self->root); + link = self->root.next; + lru_cache_extract_link(link); + /* Remove it from the cache. + The cache dict holds one reference to the link, + and the linked list holds yet one reference to it. */ + popresult = _PyDict_Pop_KnownHash(self->cache, link->key, + link->hash, Py_None); + if (popresult == Py_None) { + /* Getting here means that the user function call or another + thread has already removed the old key from the dictionary. + This link is now an orphan. Since we don't want to leave the + cache in an inconsistent state, we don't restore the link. */ + Py_DECREF(popresult); + Py_DECREF(link); + Py_DECREF(key); + return result; + } + if (popresult == NULL) { + /* An error arose while trying to remove the oldest key (the one + being evicted) from the cache. We restore the link to its + original position as the oldest link. Then we allow the + error to propagate upward, treating it the same as an error + arising in the user function. */ + lru_cache_prepend_link(self, link); + Py_DECREF(key); + Py_DECREF(result); + return NULL; + } + /* Keep a reference to the old key and old result to prevent their + ref counts from going to zero during the update. That will + prevent potentially arbitrary object clean-up code (i.e. __del__) + from running while we're still adjusting the links. */ + oldkey = link->key; + oldresult = link->result; + + link->hash = hash; + link->key = key; + link->result = result; + /* Note: The link is being added to the cache dict without the + prev and next fields set to valid values. We have to wait + for successful insertion in the cache dict before adding the + link to the linked list.
Otherwise, the potentially reentrant + __eq__ call could cause the then-orphan link to be visited. */ + if (_PyDict_SetItem_KnownHash(self->cache, key, (PyObject *)link, + hash) < 0) { + /* Somehow the cache dict update failed. We no longer can + restore the old link. Let the error propagate upward and + leave the cache short one link. */ + Py_DECREF(popresult); + Py_DECREF(link); + Py_DECREF(oldkey); + Py_DECREF(oldresult); + return NULL; + } + lru_cache_append_link(self, link); + Py_INCREF(result); /* for return */ + Py_DECREF(popresult); + Py_DECREF(oldkey); + Py_DECREF(oldresult); return result; } @@ -995,6 +1098,9 @@ lru_cache_new(PyTypeObject *type, PyObject *args, PyObject *kw) maxsize = PyNumber_AsSsize_t(maxsize_O, PyExc_OverflowError); if (maxsize == -1 && PyErr_Occurred()) return NULL; + if (maxsize < 0) { + maxsize = 0; + } if (maxsize == 0) wrapper = uncached_lru_cache_wrapper; else @@ -1013,20 +1119,17 @@ lru_cache_new(PyTypeObject *type, PyObject *args, PyObject *kw) return NULL; } - obj->cache = cachedict; obj->root.prev = &obj->root; obj->root.next = &obj->root; - obj->maxsize = maxsize; - Py_INCREF(maxsize_O); - obj->maxsize_O = maxsize_O; + obj->wrapper = wrapper; + obj->typed = typed; + obj->cache = cachedict; Py_INCREF(func); obj->func = func; - obj->wrapper = wrapper; obj->misses = obj->hits = 0; - obj->typed = typed; + obj->maxsize = maxsize; Py_INCREF(cache_info_type); obj->cache_info_type = cache_info_type; - return (PyObject *)obj; } @@ -1060,11 +1163,10 @@ lru_cache_dealloc(lru_cache_object *obj) PyObject_GC_UnTrack(obj); list = lru_cache_unlink_list(obj); - Py_XDECREF(obj->maxsize_O); - Py_XDECREF(obj->func); Py_XDECREF(obj->cache); - Py_XDECREF(obj->dict); + Py_XDECREF(obj->func); Py_XDECREF(obj->cache_info_type); + Py_XDECREF(obj->dict); lru_cache_clear_list(list); Py_TYPE(obj)->tp_free(obj); } @@ -1088,8 +1190,13 @@ lru_cache_descr_get(PyObject *self, PyObject *obj, PyObject *type) static PyObject * lru_cache_cache_info(lru_cache_object *self, PyObject *unused) { - return PyObject_CallFunction(self->cache_info_type, "nnOn", - self->hits, self->misses, self->maxsize_O, + if (self->maxsize == -1) { + return PyObject_CallFunction(self->cache_info_type, "nnOn", + self->hits, self->misses, Py_None, + PyDict_GET_SIZE(self->cache)); + } + return PyObject_CallFunction(self->cache_info_type, "nnnn", + self->hits, self->misses, self->maxsize, PyDict_GET_SIZE(self->cache)); } @@ -1098,7 +1205,6 @@ lru_cache_cache_clear(lru_cache_object *self, PyObject *unused) { lru_list_elem *list = lru_cache_unlink_list(self); self->hits = self->misses = 0; - self->full = 0; PyDict_Clear(self->cache); lru_cache_clear_list(list); Py_RETURN_NONE; @@ -1134,7 +1240,6 @@ lru_cache_tp_traverse(lru_cache_object *self, visitproc visit, void *arg) Py_VISIT(link->result); link = next; } - Py_VISIT(self->maxsize_O); Py_VISIT(self->func); Py_VISIT(self->cache); Py_VISIT(self->cache_info_type); @@ -1146,7 +1251,6 @@ static int lru_cache_tp_clear(lru_cache_object *self) { lru_list_elem *list = lru_cache_unlink_list(self); - Py_CLEAR(self->maxsize_O); Py_CLEAR(self->func); Py_CLEAR(self->cache); Py_CLEAR(self->cache_info_type); From solipsis at pitrou.net Sat Jan 26 04:11:53 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 26 Jan 2019 09:11:53 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=5 Message-ID: <20190126091153.1.B6016191D08DFBF8@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_functools
leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_forkserver leaked [2, 0, -1] memory blocks, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogPqymrl', '--timeout', '7200'] From webhook-mailer at python.org Sat Jan 26 18:19:17 2019 From: webhook-mailer at python.org (Gregory P. Smith) Date: Sat, 26 Jan 2019 23:19:17 -0000 Subject: [Python-checkins] Fix docstr/comment typos in _use_posix_spawn(). (GH-11684) Message-ID: https://github.com/python/cpython/commit/81d04bcf2124341aa73e5c13c1f1c4bdd3f5dbef commit: 81d04bcf2124341aa73e5c13c1f1c4bdd3f5dbef branch: master author: Gregory P. Smith committer: GitHub date: 2019-01-26T15:19:11-08:00 summary: Fix docstr/comment typos in _use_posix_spawn(). (GH-11684) files: M Lib/subprocess.py diff --git a/Lib/subprocess.py b/Lib/subprocess.py index 2300c7352e0d..1f6eb63b387f 100644 --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -607,17 +607,17 @@ def getoutput(cmd): def _use_posix_spawn(): - """Check is posix_spawn() can be used for subprocess. + """Check if posix_spawn() can be used for subprocess. - subprocess requires a posix_spawn() implementation that reports properly - errors to the parent process, set errno on the following failures: + subprocess requires a posix_spawn() implementation that properly reports + errors to the parent process, & sets errno on the following failures: - * process attribute actions failed - * file actions failed - * exec() failed + * Process attribute actions failed. + * File actions failed. + * exec() failed. - Prefer an implementation which can use vfork in some cases for best - performances. + Prefer an implementation which can use vfork() in some cases for best + performance. """ if _mswindows or not hasattr(os, 'posix_spawn'): # os.posix_spawn() is not available @@ -642,15 +642,14 @@ def _use_posix_spawn(): # glibc 2.24 has a new Linux posix_spawn implementation using vfork # which properly reports errors to the parent process. return True - # Note: Don't use the POSIX implementation of glibc because it doesn't + # Note: Don't use the implementation in earlier glibc because it doesn't # use vfork (even if glibc 2.26 added a pipe to properly report errors # to the parent process). except (AttributeError, ValueError, OSError): # os.confstr() or CS_GNU_LIBC_VERSION value not available pass - # By default, consider that the implementation does not properly report - # errors. + # By default, assume that posix_spawn() does not properly report errors. 
return False From solipsis at pitrou.net Sun Jan 27 04:09:08 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 27 Jan 2019 09:09:08 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=-3 Message-ID: <20190127090908.1.00535BAEC2B3236D@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [-7, 1, 0] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [2, -1, -1] memory blocks, sum=0 test_multiprocessing_spawn leaked [-2, 2, -1] memory blocks, sum=-1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogJfJM8d', '--timeout', '7200'] From webhook-mailer at python.org Sun Jan 27 11:21:17 2019 From: webhook-mailer at python.org (Nick Coghlan) Date: Sun, 27 Jan 2019 16:21:17 -0000 Subject: [Python-checkins] Clarify U-mode deprecation in open() (GH-11646) Message-ID: https://github.com/python/cpython/commit/3171df34141c1f26ec16dccb4357184c0cf6c58f commit: 3171df34141c1f26ec16dccb4357184c0cf6c58f branch: master author: Nick Coghlan committer: GitHub date: 2019-01-28T02:21:11+10:00 summary: Clarify U-mode deprecation in open() (GH-11646) The previous wording could be read as saying that universal newlines mode itself was deprecated, when it's only the 'U' character in the mode field that should be avoided. The update also moves the description of the 'U' mode character out of the mode table, as the longer explanation was overly intrusive as a table entry and overshadowed the actually useful mode characters. files: M Doc/library/functions.rst diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index b49d752b16a3..cca28ff5fb3b 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -1004,7 +1004,6 @@ are always available. They are listed here in alphabetical order. ``'b'`` binary mode ``'t'`` text mode (default) ``'+'`` open a disk file for updating (reading and writing) - ``'U'`` :term:`universal newlines` mode (deprecated) ========= =============================================================== The default mode is ``'r'`` (open for reading text, synonym of ``'rt'``). @@ -1019,6 +1018,12 @@ are always available. They are listed here in alphabetical order. first decoded using a platform-dependent encoding or using the specified *encoding* if given. + There is an additional mode character permitted, ``'U'``, which no longer + has any effect, and is considered deprecated. It previously enabled + :term:`universal newlines` in text mode, which became the default behaviour + in Python 3.0. Refer to the documentation of the + :ref:`newline ` parameter for further details. + .. note:: Python doesn't depend on the underlying operating system's notion of text @@ -1085,6 +1090,8 @@ are always available. They are listed here in alphabetical order. .. index:: single: universal newlines; open() built-in function + .. _open-newline-parameter: + *newline* controls how :term:`universal newlines` mode works (it only applies to text mode). It can be ``None``, ``''``, ``'\n'``, ``'\r'``, and ``'\r\n'``. 
It works as follows: From webhook-mailer at python.org Sun Jan 27 11:28:03 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sun, 27 Jan 2019 16:28:03 -0000 Subject: [Python-checkins] Clarify U-mode deprecation in open() (GH-11646) Message-ID: https://github.com/python/cpython/commit/658ff844963b68b420d663c188cd5fc78442db3f commit: 658ff844963b68b420d663c188cd5fc78442db3f branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-27T08:27:58-08:00 summary: Clarify U-mode deprecation in open() (GH-11646) The previous wording could be read as saying that universal newlines mode itself was deprecated, when it's only the 'U' character in the mode field that should be avoided. The update also moves the description of the 'U' mode character out of the mode table, as the longer explanation was overly intrusive as a table entry and overshadowed the actually useful mode characters. (cherry picked from commit 3171df34141c1f26ec16dccb4357184c0cf6c58f) Co-authored-by: Nick Coghlan files: M Doc/library/functions.rst diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index ec3388346a65..abf3e26d9e8d 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -997,7 +997,6 @@ are always available. They are listed here in alphabetical order. ``'b'`` binary mode ``'t'`` text mode (default) ``'+'`` open a disk file for updating (reading and writing) - ``'U'`` :term:`universal newlines` mode (deprecated) ========= =============================================================== The default mode is ``'r'`` (open for reading text, synonym of ``'rt'``). @@ -1012,6 +1011,12 @@ are always available. They are listed here in alphabetical order. first decoded using a platform-dependent encoding or using the specified *encoding* if given. + There is an additional mode character permitted, ``'U'``, which no longer + has any effect, and is considered deprecated. It previously enabled + :term:`universal newlines` in text mode, which became the default behaviour + in Python 3.0. Refer to the documentation of the + :ref:`newline ` parameter for further details. + .. note:: Python doesn't depend on the underlying operating system's notion of text @@ -1078,6 +1083,8 @@ are always available. They are listed here in alphabetical order. .. index:: single: universal newlines; open() built-in function + .. _open-newline-parameter: + *newline* controls how :term:`universal newlines` mode works (it only applies to text mode). It can be ``None``, ``''``, ``'\n'``, ``'\r'``, and ``'\r\n'``. It works as follows: From webhook-mailer at python.org Sun Jan 27 17:07:52 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sun, 27 Jan 2019 22:07:52 -0000 Subject: [Python-checkins] Fix typo: class declaration (GH-11678) Message-ID: https://github.com/python/cpython/commit/dfc8bb987d1fcba9225a19542c0fb9132b846b5b commit: dfc8bb987d1fcba9225a19542c0fb9132b846b5b branch: master author: nu_no committer: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> date: 2019-01-27T14:07:47-08:00 summary: Fix typo: class declaration (GH-11678) files: M Doc/library/enum.rst diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst index 702eacd0e98a..19277d76995f 100644 --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -394,7 +394,7 @@ A new :class:`Enum` class must have one base Enum class, up to one concrete data type, and as many :class:`object`-based mixin classes as needed. 
The order of these base classes is:: - def EnumName([mix-in, ...,] [data-type,] base-enum): + class EnumName([mix-in, ...,] [data-type,] base-enum): pass Also, subclassing an enumeration is allowed only if the enumeration does not define From webhook-mailer at python.org Sun Jan 27 17:25:53 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Sun, 27 Jan 2019 22:25:53 -0000 Subject: [Python-checkins] Fix typo: class declaration (GH-11678) Message-ID: https://github.com/python/cpython/commit/ff27f8145d7194fb3891b610443dee15be8f8f63 commit: ff27f8145d7194fb3891b610443dee15be8f8f63 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-27T14:25:49-08:00 summary: Fix typo: class declaration (GH-11678) (cherry picked from commit dfc8bb987d1fcba9225a19542c0fb9132b846b5b) Co-authored-by: nu_no files: M Doc/library/enum.rst diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst index 81e9766e429b..a6285ffaf191 100644 --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -394,7 +394,7 @@ A new :class:`Enum` class must have one base Enum class, up to one concrete data type, and as many :class:`object`-based mixin classes as needed. The order of these base classes is:: - def EnumName([mix-in, ...,] [data-type,] base-enum): + class EnumName([mix-in, ...,] [data-type,] base-enum): pass Also, subclassing an enumeration is allowed only if the enumeration does not define From solipsis at pitrou.net Mon Jan 28 04:08:33 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 28 Jan 2019 09:08:33 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=1 Message-ID: <20190128090833.1.E4E72AFC511790B2@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [-2, 2, -1] memory blocks, sum=-1 test_multiprocessing_forkserver leaked [-1, 2, -1] memory blocks, sum=0 test_multiprocessing_spawn leaked [0, 0, -2] memory blocks, sum=-2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/refloglf3rbV', '--timeout', '7200'] From webhook-mailer at python.org Mon Jan 28 04:31:30 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Mon, 28 Jan 2019 09:31:30 -0000 Subject: [Python-checkins] bpo-35701: Update doc for UUID weak referencing (GH-11621) Message-ID: https://github.com/python/cpython/commit/ea446409cd5f1364beafd5e5255da6799993f285 commit: ea446409cd5f1364beafd5e5255da6799993f285 branch: master author: David H committer: Victor Stinner date: 2019-01-28T10:31:19+01:00 summary: bpo-35701: Update doc for UUID weak referencing (GH-11621) files: M Doc/whatsnew/3.8.rst diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index ce46afd48158..fb25ce2f7669 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -316,8 +316,6 @@ Optimizations (Contributed by Inada Naoki in :issue:`33597`) * :class:`uuid.UUID` now uses ``__slots__`` to reduce its memory footprint. - Note that this means that instances can no longer be weak-referenced and - that arbitrary attributes can no longer be added to them. * The :class:`list` constructor does not overallocate the internal item buffer if the input iterable has a known length (the input implements ``__len__``). @@ -514,9 +512,6 @@ Changes in the Python API * The function :func:`math.factorial` no longer accepts arguments that are not int-like. 
(Contributed by Pablo Galindo in :issue:`33083`.) -* :class:`uuid.UUID` now uses ``__slots__``, therefore instances can no longer - be weak-referenced and attributes can no longer be added. - * :mod:`xml.dom.minidom` and :mod:`xml.sax` modules no longer process external entities by default. (Contributed by Christian Heimes in :issue:`17239`.) From webhook-mailer at python.org Mon Jan 28 17:00:11 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Mon, 28 Jan 2019 22:00:11 -0000 Subject: [Python-checkins] Fast path for int inputs to math.dist() and math.hypot() (GH-11692) Message-ID: https://github.com/python/cpython/commit/808180c206fbde390d9dbdf24a8989fc8a6446ec commit: 808180c206fbde390d9dbdf24a8989fc8a6446ec branch: master author: Raymond Hettinger committer: GitHub date: 2019-01-28T13:59:56-08:00 summary: Fast path for int inputs to math.dist() and math.hypot() (GH-11692) files: M Lib/test/test_math.py M Modules/mathmodule.c diff --git a/Lib/test/test_math.py b/Lib/test/test_math.py index b476a39e0aeb..f9b11f3f74e6 100644 --- a/Lib/test/test_math.py +++ b/Lib/test/test_math.py @@ -766,6 +766,9 @@ def testHypot(self): hypot(x=1) with self.assertRaises(TypeError): # Reject values without __float__ hypot(1.1, 'string', 2.2) + int_too_big_for_float = 10 ** (sys.float_info.max_10_exp + 5) + with self.assertRaises((ValueError, OverflowError)): + hypot(1, int_too_big_for_float) # Any infinity gives positive infinity. self.assertEqual(hypot(INF), INF) @@ -805,7 +808,8 @@ def testDist(self): dist = math.dist sqrt = math.sqrt - # Simple exact case + # Simple exact cases + self.assertEqual(dist((1.0, 2.0, 3.0), (4.0, 2.0, -1.0)), 5.0) self.assertEqual(dist((1, 2, 3), (4, 2, -1)), 5.0) # Test different numbers of arguments (from zero to nine) @@ -869,6 +873,11 @@ class T(tuple): dist((1, 2, 3), (4, 5, 6, 7)) with self.assertRaises(TypeError): # Rejects invalid types dist("abc", "xyz") + int_too_big_for_float = 10 ** (sys.float_info.max_10_exp + 5) + with self.assertRaises((ValueError, OverflowError)): + dist((1, int_too_big_for_float), (2, 3)) + with self.assertRaises((ValueError, OverflowError)): + dist((2, 3), (1, int_too_big_for_float)) # Verify that the one dimensional case is equivalent to abs() for i in range(20): diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c index a190f5ccf7e3..c4353771d960 100644 --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -2144,7 +2144,14 @@ math_dist_impl(PyObject *module, PyObject *p, PyObject *q) item = PyTuple_GET_ITEM(p, i); if (PyFloat_CheckExact(item)) { px = PyFloat_AS_DOUBLE(item); - } else { + } + else if (PyLong_CheckExact(item)) { + px = PyLong_AsDouble(item); + if (px == -1.0 && PyErr_Occurred()) { + goto error_exit; + } + } + else { px = PyFloat_AsDouble(item); if (px == -1.0 && PyErr_Occurred()) { goto error_exit; @@ -2153,7 +2160,14 @@ math_dist_impl(PyObject *module, PyObject *p, PyObject *q) item = PyTuple_GET_ITEM(q, i); if (PyFloat_CheckExact(item)) { qx = PyFloat_AS_DOUBLE(item); - } else { + } + else if (PyLong_CheckExact(item)) { + qx = PyLong_AsDouble(item); + if (qx == -1.0 && PyErr_Occurred()) { + goto error_exit; + } + } + else { qx = PyFloat_AsDouble(item); if (qx == -1.0 && PyErr_Occurred()) { goto error_exit; @@ -2201,7 +2215,14 @@ math_hypot(PyObject *self, PyObject *const *args, Py_ssize_t nargs) item = args[i]; if (PyFloat_CheckExact(item)) { x = PyFloat_AS_DOUBLE(item); - } else { + } + else if (PyLong_CheckExact(item)) { + x = PyLong_AsDouble(item); + if (x == -1.0 && PyErr_Occurred()) { + goto 
error_exit; + } + } + else { x = PyFloat_AsDouble(item); if (x == -1.0 && PyErr_Occurred()) { goto error_exit; From solipsis at pitrou.net Tue Jan 29 04:06:59 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 29 Jan 2019 09:06:59 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=8 Message-ID: <20190129090659.1.96029DF7D15EDB36@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_asyncio leaked [3, 0, 0] memory blocks, sum=3 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [-2, 2, 0] memory blocks, sum=0 test_multiprocessing_forkserver leaked [-1, 2, 0] memory blocks, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflog0Qujvw', '--timeout', '7200'] From webhook-mailer at python.org Tue Jan 29 11:16:17 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 29 Jan 2019 16:16:17 -0000 Subject: [Python-checkins] bpo-35847: RISC-V needs CTYPES_PASS_BY_REF_HACK (GH-11694) Message-ID: https://github.com/python/cpython/commit/742d768656512a469ce9571b1cbd777def7bc5ea commit: 742d768656512a469ce9571b1cbd777def7bc5ea branch: master author: Andreas Schwab committer: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> date: 2019-01-29T08:16:10-08:00 summary: bpo-35847: RISC-V needs CTYPES_PASS_BY_REF_HACK (GH-11694) This fixes the ctypes.test.test_structures.StructureTestCase test. https://bugs.python.org/issue35847 files: A Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst M Modules/_ctypes/callproc.c diff --git a/Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst b/Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst new file mode 100644 index 000000000000..e3775f96f36e --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst @@ -0,0 +1 @@ +RISC-V needed the CTYPES_PASS_BY_REF_HACK. Fixes ctypes Structure test_pass_by_value. diff --git a/Modules/_ctypes/callproc.c b/Modules/_ctypes/callproc.c index a7965c19b70c..bed536402023 100644 --- a/Modules/_ctypes/callproc.c +++ b/Modules/_ctypes/callproc.c @@ -1058,7 +1058,7 @@ GetComError(HRESULT errcode, GUID *riid, IUnknown *pIunk) #endif #if (defined(__x86_64__) && (defined(__MINGW64__) || defined(__CYGWIN__))) || \ - defined(__aarch64__) + defined(__aarch64__) || defined(__riscv) #define CTYPES_PASS_BY_REF_HACK #define POW2(x) (((x & ~(x - 1)) == x) ? x : 0) #define IS_PASS_BY_REF(x) (x > 8 || !POW2(x)) From webhook-mailer at python.org Tue Jan 29 16:11:41 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Tue, 29 Jan 2019 21:11:41 -0000 Subject: [Python-checkins] bpo-35847: RISC-V needs CTYPES_PASS_BY_REF_HACK (GH-11694) Message-ID: https://github.com/python/cpython/commit/10354cbb5067b4719ba1c2d51d22314a644ed3e5 commit: 10354cbb5067b4719ba1c2d51d22314a644ed3e5 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-29T13:11:36-08:00 summary: bpo-35847: RISC-V needs CTYPES_PASS_BY_REF_HACK (GH-11694) This fixes the ctypes.test.test_structures.StructureTestCase test. 
https://bugs.python.org/issue35847 (cherry picked from commit 742d768656512a469ce9571b1cbd777def7bc5ea) Co-authored-by: Andreas Schwab files: A Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst M Modules/_ctypes/callproc.c diff --git a/Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst b/Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst new file mode 100644 index 000000000000..e3775f96f36e --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst @@ -0,0 +1 @@ +RISC-V needed the CTYPES_PASS_BY_REF_HACK. Fixes ctypes Structure test_pass_by_value. diff --git a/Modules/_ctypes/callproc.c b/Modules/_ctypes/callproc.c index ec596b4de31d..e971388f69b9 100644 --- a/Modules/_ctypes/callproc.c +++ b/Modules/_ctypes/callproc.c @@ -1052,7 +1052,7 @@ GetComError(HRESULT errcode, GUID *riid, IUnknown *pIunk) #endif #if (defined(__x86_64__) && (defined(__MINGW64__) || defined(__CYGWIN__))) || \ - defined(__aarch64__) + defined(__aarch64__) || defined(__riscv) #define CTYPES_PASS_BY_REF_HACK #define POW2(x) (((x & ~(x - 1)) == x) ? x : 0) #define IS_PASS_BY_REF(x) (x > 8 || !POW2(x)) From webhook-mailer at python.org Tue Jan 29 16:14:28 2019 From: webhook-mailer at python.org (Gregory P. Smith) Date: Tue, 29 Jan 2019 21:14:28 -0000 Subject: [Python-checkins] subprocess: close pipes/fds by using ExitStack (GH-11686) Message-ID: https://github.com/python/cpython/commit/bafa8487f77fa076de3a06755399daf81cb75598 commit: bafa8487f77fa076de3a06755399daf81cb75598 branch: master author: Giampaolo Rodola committer: Gregory P. Smith date: 2019-01-29T13:14:24-08:00 summary: subprocess: close pipes/fds by using ExitStack (GH-11686) Close pipes/fds in subprocess by using ExitStack. "In case of premature failure on X.Close() or os.close(X) the remaining pipes/fds will remain "open". Perhaps it makes sense to use contextlib.ExitStack." - Rationale: https://github.com/python/cpython/pull/11575#discussion_r250288394 files: A Misc/NEWS.d/next/Library/2019-01-29-17-24-52.bpo-35537.Q0ktFC.rst M Lib/subprocess.py diff --git a/Lib/subprocess.py b/Lib/subprocess.py index 1f6eb63b387f..0496b447e8ea 100644 --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -50,6 +50,7 @@ import sys import threading import warnings +import contextlib from time import monotonic as _time @@ -1072,28 +1073,28 @@ def _close_pipe_fds(self, # self._devnull is not always defined. 
devnull_fd = getattr(self, '_devnull', None) - if _mswindows: - if p2cread != -1: - p2cread.Close() - if c2pwrite != -1: - c2pwrite.Close() - if errwrite != -1: - errwrite.Close() - else: - if p2cread != -1 and p2cwrite != -1 and p2cread != devnull_fd: - os.close(p2cread) - if c2pwrite != -1 and c2pread != -1 and c2pwrite != devnull_fd: - os.close(c2pwrite) - if errwrite != -1 and errread != -1 and errwrite != devnull_fd: - os.close(errwrite) + with contextlib.ExitStack() as stack: + if _mswindows: + if p2cread != -1: + stack.callback(p2cread.Close) + if c2pwrite != -1: + stack.callback(c2pwrite.Close) + if errwrite != -1: + stack.callback(errwrite.Close) + else: + if p2cread != -1 and p2cwrite != -1 and p2cread != devnull_fd: + stack.callback(os.close, p2cread) + if c2pwrite != -1 and c2pread != -1 and c2pwrite != devnull_fd: + stack.callback(os.close, c2pwrite) + if errwrite != -1 and errread != -1 and errwrite != devnull_fd: + stack.callback(os.close, errwrite) - if devnull_fd is not None: - os.close(devnull_fd) + if devnull_fd is not None: + stack.callback(os.close, devnull_fd) # Prevent a double close of these handles/fds from __init__ on error. self._closed_child_pipe_fds = True - if _mswindows: # # Windows methods diff --git a/Misc/NEWS.d/next/Library/2019-01-29-17-24-52.bpo-35537.Q0ktFC.rst b/Misc/NEWS.d/next/Library/2019-01-29-17-24-52.bpo-35537.Q0ktFC.rst new file mode 100644 index 000000000000..2a9588e745f4 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-29-17-24-52.bpo-35537.Q0ktFC.rst @@ -0,0 +1,4 @@ +An ExitStack is now used internally within subprocess.Popen to clean up pipe +file handles. No behavior change in normal operation. But if closing one +handle were ever to cause an exception, the others will now be closed +instead of leaked. (patch by Giampaolo Rodola) From webhook-mailer at python.org Tue Jan 29 23:40:04 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Wed, 30 Jan 2019 04:40:04 -0000 Subject: [Python-checkins] Move float conversion into a macro. Apply to fsum (GH-11698) Message-ID: https://github.com/python/cpython/commit/cfd735ea28a2b985598236f955c72c3f0e82e01d commit: cfd735ea28a2b985598236f955c72c3f0e82e01d branch: master author: Raymond Hettinger committer: GitHub date: 2019-01-29T20:39:53-08:00 summary: Move float conversion into a macro. Apply to fsum (GH-11698) files: M Modules/mathmodule.c diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c index c4353771d960..83dab1269d63 100644 --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -76,6 +76,29 @@ static const double logpi = 1.144729885849400174143427351353058711647; static const double sqrtpi = 1.772453850905516027298167483341145182798; #endif /* !defined(HAVE_ERF) || !defined(HAVE_ERFC) */ + +/* Version of PyFloat_AsDouble() with in-line fast paths + for exact floats and integers. Gives a substantial + speed improvement for extracting float arguments.
+*/ + +#define ASSIGN_DOUBLE(target_var, obj, error_label) \ + if (PyFloat_CheckExact(obj)) { \ + target_var = PyFloat_AS_DOUBLE(obj); \ + } \ + else if (PyLong_CheckExact(obj)) { \ + target_var = PyLong_AsDouble(obj); \ + if (target_var == -1.0 && PyErr_Occurred()) { \ + goto error_label; \ + } \ + } \ + else { \ + target_var = PyFloat_AsDouble(obj); \ + if (target_var == -1.0 && PyErr_Occurred()) { \ + goto error_label; \ + } \ + } + static double sinpi(double x) { @@ -1323,10 +1346,8 @@ math_fsum(PyObject *module, PyObject *seq) goto _fsum_error; break; } - x = PyFloat_AsDouble(item); + ASSIGN_DOUBLE(x, item, error_with_item); Py_DECREF(item); - if (PyErr_Occurred()) - goto _fsum_error; xsave = x; for (i = j = 0; j < n; j++) { /* for y in partials */ @@ -1407,12 +1428,16 @@ math_fsum(PyObject *module, PyObject *seq) } sum = PyFloat_FromDouble(hi); -_fsum_error: + _fsum_error: PyFPE_END_PROTECT(hi) Py_DECREF(iter); if (p != ps) PyMem_Free(p); return sum; + + error_with_item: + Py_DECREF(item); + goto _fsum_error; } #undef NUM_PARTIALS @@ -2142,37 +2167,9 @@ math_dist_impl(PyObject *module, PyObject *p, PyObject *q) } for (i=0 ; i results for 4243df51fe43 on branch "default" -------------------------------------------- test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [0, 0, 2] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/refloghihOxm', '--timeout', '7200'] From webhook-mailer at python.org Wed Jan 30 10:49:46 2019 From: webhook-mailer at python.org (Antoine Pitrou) Date: Wed, 30 Jan 2019 15:49:46 -0000 Subject: [Python-checkins] bpo-25592: Improve documentation of distutils data_files (GH-9767) Message-ID: https://github.com/python/cpython/commit/598e15d4feaee3849a91d92c9ca51f17baafe19c commit: 598e15d4feaee3849a91d92c9ca51f17baafe19c branch: master author: jdemeyer committer: Antoine Pitrou date: 2019-01-30T16:49:39+01:00 summary: bpo-25592: Improve documentation of distutils data_files (GH-9767) files: M Doc/distutils/setupscript.rst diff --git a/Doc/distutils/setupscript.rst b/Doc/distutils/setupscript.rst index c1051d2e807e..54ed1aebc242 100644 --- a/Doc/distutils/setupscript.rst +++ b/Doc/distutils/setupscript.rst @@ -524,20 +524,23 @@ following way:: setup(..., data_files=[('bitmaps', ['bm/b1.gif', 'bm/b2.gif']), ('config', ['cfg/data.cfg']), - ('/etc/init.d', ['init-script'])] ) -Note that you can specify the directory names where the data files will be -installed, but you cannot rename the data files themselves. - Each (*directory*, *files*) pair in the sequence specifies the installation -directory and the files to install there. If *directory* is a relative path, it -is interpreted relative to the installation prefix (Python's ``sys.prefix`` for -pure-Python packages, ``sys.exec_prefix`` for packages that contain extension -modules). Each file name in *files* is interpreted relative to the -:file:`setup.py` script at the top of the package source distribution. No -directory information from *files* is used to determine the final location of -the installed file; only the name of the file is used. +directory and the files to install there. + +Each file name in *files* is interpreted relative to the :file:`setup.py` +script at the top of the package source distribution. Note that you can +specify the directory where the data files will be installed, but you cannot +rename the data files themselves. + +The *directory* should be a relative path. 
It is interpreted relative to the +installation prefix (Python's ``sys.prefix`` for system installations; +``site.USER_BASE`` for user installations). Distutils allows *directory* to be +an absolute installation path, but this is discouraged since it is +incompatible with the wheel packaging format. No directory information from +*files* is used to determine the final location of the installed file; only +the name of the file is used. You can specify the ``data_files`` options as a simple sequence of files without specifying a target directory, but this is not recommended, and the From webhook-mailer at python.org Wed Jan 30 10:56:54 2019 From: webhook-mailer at python.org (Antoine Pitrou) Date: Wed, 30 Jan 2019 15:56:54 -0000 Subject: [Python-checkins] bpo-25592: Improve documentation of distutils data_files (GH-9767) (GH-11701) Message-ID: https://github.com/python/cpython/commit/ebae1ce9c40e62ce52dc968f86ed11578d2fcdfd commit: ebae1ce9c40e62ce52dc968f86ed11578d2fcdfd branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Antoine Pitrou date: 2019-01-30T16:56:51+01:00 summary: bpo-25592: Improve documentation of distutils data_files (GH-9767) (GH-11701) (cherry picked from commit 598e15d4feaee3849a91d92c9ca51f17baafe19c) Co-authored-by: jdemeyer files: M Doc/distutils/setupscript.rst diff --git a/Doc/distutils/setupscript.rst b/Doc/distutils/setupscript.rst index c1051d2e807e..54ed1aebc242 100644 --- a/Doc/distutils/setupscript.rst +++ b/Doc/distutils/setupscript.rst @@ -524,20 +524,23 @@ following way:: setup(..., data_files=[('bitmaps', ['bm/b1.gif', 'bm/b2.gif']), ('config', ['cfg/data.cfg']), - ('/etc/init.d', ['init-script'])] ) -Note that you can specify the directory names where the data files will be -installed, but you cannot rename the data files themselves. - Each (*directory*, *files*) pair in the sequence specifies the installation -directory and the files to install there. If *directory* is a relative path, it -is interpreted relative to the installation prefix (Python's ``sys.prefix`` for -pure-Python packages, ``sys.exec_prefix`` for packages that contain extension -modules). Each file name in *files* is interpreted relative to the -:file:`setup.py` script at the top of the package source distribution. No -directory information from *files* is used to determine the final location of -the installed file; only the name of the file is used. +directory and the files to install there. + +Each file name in *files* is interpreted relative to the :file:`setup.py` +script at the top of the package source distribution. Note that you can +specify the directory where the data files will be installed, but you cannot +rename the data files themselves. + +The *directory* should be a relative path. It is interpreted relative to the +installation prefix (Python's ``sys.prefix`` for system installations; +``site.USER_BASE`` for user installations). Distutils allows *directory* to be +an absolute installation path, but this is discouraged since it is +incompatible with the wheel packaging format. No directory information from +*files* is used to determine the final location of the installed file; only +the name of the file is used. 
You can specify the ``data_files`` options as a simple sequence of files without specifying a target directory, but this is not recommended, and the From webhook-mailer at python.org Wed Jan 30 12:23:44 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 30 Jan 2019 17:23:44 -0000 Subject: [Python-checkins] bpo-35835: Add reference to Python 3.7 new breakpoint() function in pdb documentation. (GH-11691) Message-ID: https://github.com/python/cpython/commit/cf991e653ac550a9f011631447c61ce583404a57 commit: cf991e653ac550a9f011631447c61ce583404a57 branch: master author: João Matos committer: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> date: 2019-01-30T09:23:39-08:00 summary: bpo-35835: Add reference to Python 3.7 new breakpoint() function in pdb documentation. (GH-11691) files: M Doc/library/pdb.rst diff --git a/Doc/library/pdb.rst b/Doc/library/pdb.rst index a72876f3f5a8..c7864e9e3f22 100644 --- a/Doc/library/pdb.rst +++ b/Doc/library/pdb.rst @@ -76,6 +76,10 @@ at the location you want to break into the debugger. You can then step through the code following this statement, and continue running without the debugger using the :pdbcmd:`continue` command. +.. versionadded:: 3.7 + The built-in :func:`breakpoint()`, when called with defaults, can be used + instead of ``import pdb; pdb.set_trace()``. + The typical usage to inspect a crashed program is:: >>> import pdb From webhook-mailer at python.org Wed Jan 30 12:36:55 2019 From: webhook-mailer at python.org (Victor Stinner) Date: Wed, 30 Jan 2019 17:36:55 -0000 Subject: [Python-checkins] bpo-35717: Fix KeyError exception raised when using enums and compile (GH-11523) (GH-11669) Message-ID: https://github.com/python/cpython/commit/1c79891026c57046f5ac596080ea60c515e0560a commit: 1c79891026c57046f5ac596080ea60c515e0560a branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Victor Stinner date: 2019-01-30T18:36:51+01:00 summary: bpo-35717: Fix KeyError exception raised when using enums and compile (GH-11523) (GH-11669) https://bugs.python.org/issue17467 (cherry picked from commit 1fd06f1eca80dcbf3a916133919482a8327f3da4) Co-authored-by: Rémi Lapeyre files: A Misc/NEWS.d/next/Library/2019-01-11-17-56-15.bpo-35717.6TDTB_.rst M Lib/enum.py M Lib/test/test_enum.py M Misc/ACKS diff --git a/Lib/enum.py b/Lib/enum.py index 782d37433a6e..8984cac60955 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -427,7 +427,7 @@ def _create_(cls, class_name, names, *, module=None, qualname=None, type=None, s if module is None: try: module = sys._getframe(2).f_globals['__name__'] - except (AttributeError, ValueError) as exc: + except (AttributeError, ValueError, KeyError) as exc: pass if module is None: _make_class_unpicklable(enum_class) diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index b221045328db..fdc3dd9c38a0 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -1866,6 +1866,15 @@ class Decision2(MyEnum): REVERT_ALL = "REVERT_ALL" RETRY = "RETRY" + def test_empty_globals(self): + # bpo-35717: sys._getframe(2).f_globals['__name__'] fails with KeyError + # when using compile and exec because f_globals is empty + code = "from enum import Enum; Enum('Animal', 'ANT BEE CAT DOG')" + code = compile(code, "", "exec") + global_ns = {} + local_ls = {} + exec(code, global_ns, local_ls) + class TestOrder(unittest.TestCase): diff --git a/Misc/ACKS b/Misc/ACKS index 027e01307225..193592290f0e 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -895,6 +895,7 @@
Glenn Langford Andrew Langmead Wolfgang Langner Detlef Lannert +Rémi Lapeyre Soren Larsen Amos Latteier Piers Lauder diff --git a/Misc/NEWS.d/next/Library/2019-01-11-17-56-15.bpo-35717.6TDTB_.rst b/Misc/NEWS.d/next/Library/2019-01-11-17-56-15.bpo-35717.6TDTB_.rst new file mode 100644 index 000000000000..7cae1d1c82c7 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-11-17-56-15.bpo-35717.6TDTB_.rst @@ -0,0 +1,2 @@ +Fix KeyError exception raised when using enums and compile. Patch +contributed by Rémi Lapeyre. From webhook-mailer at python.org Wed Jan 30 12:41:56 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 30 Jan 2019 17:41:56 -0000 Subject: [Python-checkins] bpo-35835: Add reference to Python 3.7 new breakpoint() function in pdb documentation. (GH-11691) Message-ID: https://github.com/python/cpython/commit/7516f265a8517e4fdc7d6e63d72ae1b57fda26ee commit: 7516f265a8517e4fdc7d6e63d72ae1b57fda26ee branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-30T09:41:51-08:00 summary: bpo-35835: Add reference to Python 3.7 new breakpoint() function in pdb documentation. (GH-11691) (cherry picked from commit cf991e653ac550a9f011631447c61ce583404a57) Co-authored-by: João Matos files: M Doc/library/pdb.rst diff --git a/Doc/library/pdb.rst b/Doc/library/pdb.rst index a72876f3f5a8..c7864e9e3f22 100644 --- a/Doc/library/pdb.rst +++ b/Doc/library/pdb.rst @@ -76,6 +76,10 @@ at the location you want to break into the debugger. You can then step through the code following this statement, and continue running without the debugger using the :pdbcmd:`continue` command. +.. versionadded:: 3.7 + The built-in :func:`breakpoint()`, when called with defaults, can be used + instead of ``import pdb; pdb.set_trace()``. + The typical usage to inspect a crashed program is:: >>> import pdb From webhook-mailer at python.org Wed Jan 30 16:30:25 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Wed, 30 Jan 2019 21:30:25 -0000 Subject: [Python-checkins] Document differences between random.choices() and random.choice(). (GH-11703) Message-ID: https://github.com/python/cpython/commit/40ebe948e97b47fc84c8f527910063286a174b25 commit: 40ebe948e97b47fc84c8f527910063286a174b25 branch: master author: Raymond Hettinger committer: GitHub date: 2019-01-30T13:30:20-08:00 summary: Document differences between random.choices() and random.choice(). (GH-11703) files: M Doc/library/random.rst diff --git a/Doc/library/random.rst b/Doc/library/random.rst index 4f251574a327..a543ff016a62 100644 --- a/Doc/library/random.rst +++ b/Doc/library/random.rst @@ -162,6 +162,13 @@ Functions for sequences with the :class:`float` values returned by :func:`random` (that includes integers, floats, and fractions but excludes decimals). + For a given seed, the :func:`choices` function with equal weighting + typically produces a different sequence than repeated calls to + :func:`choice`. The algorithm used by :func:`choices` uses floating + point arithmetic for internal consistency and speed. The algorithm used + by :func:`choice` defaults to integer arithmetic with repeated selections + to avoid small biases from round-off error. + ..
versionadded:: 3.6 From webhook-mailer at python.org Wed Jan 30 16:49:17 2019 From: webhook-mailer at python.org (Steve Dower) Date: Wed, 30 Jan 2019 21:49:17 -0000 Subject: [Python-checkins] bpo-35854: Fix EnvBuilder and --symlinks in venv on Windows (GH-11700) Message-ID: https://github.com/python/cpython/commit/a1f9a3332bd4767e47013ea787022f06b6dbcbbd commit: a1f9a3332bd4767e47013ea787022f06b6dbcbbd branch: master author: Steve Dower committer: GitHub date: 2019-01-30T13:49:14-08:00 summary: bpo-35854: Fix EnvBuilder and --symlinks in venv on Windows (GH-11700) files: A Misc/NEWS.d/next/Windows/2019-01-29-15-44-46.bpo-35854.Ww3z19.rst M Doc/library/venv.rst M Doc/using/venv-create.inc M Lib/test/test_venv.py M Lib/venv/__init__.py diff --git a/Doc/library/venv.rst b/Doc/library/venv.rst index efa51e231c82..412808ad4486 100644 --- a/Doc/library/venv.rst +++ b/Doc/library/venv.rst @@ -109,8 +109,7 @@ creation according to their needs, the :class:`EnvBuilder` class. any existing target directory, before creating the environment. * ``symlinks`` -- a Boolean value indicating whether to attempt to symlink the - Python binary (and any necessary DLLs or other binaries, - e.g. ``pythonw.exe``), rather than copying. + Python binary rather than copying. * ``upgrade`` -- a Boolean value which, if true, will upgrade an existing environment with the running Python - for use when that Python has been @@ -176,15 +175,15 @@ creation according to their needs, the :class:`EnvBuilder` class. .. method:: setup_python(context) - Creates a copy of the Python executable in the environment on POSIX - systems. If a specific executable ``python3.x`` was used, symlinks to - ``python`` and ``python3`` will be created pointing to that executable, - unless files with those names already exist. + Creates a copy or symlink to the Python executable in the environment. + On POSIX systems, if a specific executable ``python3.x`` was used, + symlinks to ``python`` and ``python3`` will be created pointing to that + executable, unless files with those names already exist. .. method:: setup_scripts(context) Installs activation scripts appropriate to the platform into the virtual - environment. On Windows, also installs the ``python[w].exe`` scripts. + environment. .. method:: post_setup(context) @@ -194,8 +193,13 @@ creation according to their needs, the :class:`EnvBuilder` class. .. versionchanged:: 3.7.2 Windows now uses redirector scripts for ``python[w].exe`` instead of - copying the actual binaries, and so :meth:`setup_python` does nothing - unless running from a build in the source tree. + copying the actual binaries. In 3.7.2 only :meth:`setup_python` does + nothing unless running from a build in the source tree. + + .. versionchanged:: 3.7.3 + Windows copies the redirector scripts as part of :meth:`setup_python` + instead of :meth:`setup_scripts`. This was not the case in 3.7.2. + When using symlinks, the original executables will be linked. In addition, :class:`EnvBuilder` provides this utility method that can be called from :meth:`setup_scripts` or :meth:`post_setup` in subclasses to diff --git a/Doc/using/venv-create.inc b/Doc/using/venv-create.inc index ba5096abd370..1ba538bec48b 100644 --- a/Doc/using/venv-create.inc +++ b/Doc/using/venv-create.inc @@ -70,6 +70,11 @@ The command, if run with ``-h``, will show the available options:: In earlier versions, if the target directory already existed, an error was raised, unless the ``--clear`` or ``--upgrade`` option was provided. +.. 
note:: + While symlinks are supported on Windows, they are not recommended. Of + particular note is that double-clicking ``python.exe`` in File Explorer + will resolve the symlink eagerly and ignore the virtual environment. + The created ``pyvenv.cfg`` file also includes the ``include-system-site-packages`` key, set to ``true`` if ``venv`` is run with the ``--system-site-packages`` option, ``false`` otherwise. diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py index 34c2234493bc..6096b9df45bf 100644 --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -243,7 +243,6 @@ def test_isolation(self): self.assertIn('include-system-site-packages = %s\n' % s, data) @unittest.skipUnless(can_symlink(), 'Needs symlinks') - @unittest.skipIf(os.name == 'nt', 'Symlinks are never used on Windows') def test_symlinking(self): """ Test symlinking works as expected diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py index 5438b0d4e508..8f9e3138474a 100644 --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -64,11 +64,10 @@ def create(self, env_dir): self.system_site_packages = False self.create_configuration(context) self.setup_python(context) - if not self.upgrade: - self.setup_scripts(context) if self.with_pip: self._setup_pip(context) if not self.upgrade: + self.setup_scripts(context) self.post_setup(context) if true_system_site_packages: # We had set it to False before, now @@ -176,6 +175,23 @@ def symlink_or_copy(self, src, dst, relative_symlinks_ok=False): logger.warning('Unable to symlink %r to %r', src, dst) force_copy = True if force_copy: + if os.name == 'nt': + # On Windows, we rewrite symlinks to our base python.exe into + # copies of venvlauncher.exe + basename, ext = os.path.splitext(os.path.basename(src)) + if basename.endswith('_d'): + ext = '_d' + ext + basename = basename[:-2] + if sysconfig.is_python_build(True): + if basename == 'python': + basename = 'venvlauncher' + elif basename == 'pythonw': + basename = 'venvwlauncher' + scripts = os.path.dirname(src) + else: + scripts = os.path.join(os.path.dirname(__file__), "scripts", "nt") + src = os.path.join(scripts, basename + ext) + shutil.copyfile(src, dst) def setup_python(self, context): @@ -202,23 +218,31 @@ def setup_python(self, context): if not os.path.islink(path): os.chmod(path, 0o755) else: - # For normal cases, the venvlauncher will be copied from - # our scripts folder. For builds, we need to copy it - # manually. - if sysconfig.is_python_build(True): - suffix = '.exe' - if context.python_exe.lower().endswith('_d.exe'): - suffix = '_d.exe' - - src = os.path.join(dirname, "venvlauncher" + suffix) - dst = os.path.join(binpath, context.python_exe) - copier(src, dst) + if self.symlinks: + # For symlinking, we need a complete copy of the root directory + # If symlinks fail, you'll get unnecessary copies of files, but + # we assume that if you've opted into symlinks on Windows then + # you know what you're doing. 
+ suffixes = [ + f for f in os.listdir(dirname) if + os.path.normcase(os.path.splitext(f)[1]) in ('.exe', '.dll') + ] + if sysconfig.is_python_build(True): + suffixes = [ + f for f in suffixes if + os.path.normcase(f).startswith(('python', 'vcruntime')) + ] + else: + suffixes = ['python.exe', 'python_d.exe', 'pythonw.exe', + 'pythonw_d.exe'] - src = os.path.join(dirname, "venvwlauncher" + suffix) - dst = os.path.join(binpath, "pythonw" + suffix) - copier(src, dst) + for suffix in suffixes: + src = os.path.join(dirname, suffix) + if os.path.exists(src): + copier(src, os.path.join(binpath, suffix)) - # copy init.tcl over + if sysconfig.is_python_build(True): + # copy init.tcl for root, dirs, files in os.walk(context.python_dir): if 'init.tcl' in files: tcldir = os.path.basename(root) @@ -304,6 +328,9 @@ def install_scripts(self, context, path): dirs.remove(d) continue # ignore files in top level for f in files: + if (os.name == 'nt' and f.startswith('python') + and f.endswith(('.exe', '.pdb'))): + continue srcfile = os.path.join(root, f) suffix = root[plen:].split(os.sep)[2:] if not suffix: diff --git a/Misc/NEWS.d/next/Windows/2019-01-29-15-44-46.bpo-35854.Ww3z19.rst b/Misc/NEWS.d/next/Windows/2019-01-29-15-44-46.bpo-35854.Ww3z19.rst new file mode 100644 index 000000000000..a1c761458d35 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2019-01-29-15-44-46.bpo-35854.Ww3z19.rst @@ -0,0 +1 @@ +Fix EnvBuilder and --symlinks in venv on Windows From webhook-mailer at python.org Wed Jan 30 16:49:26 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Wed, 30 Jan 2019 21:49:26 -0000 Subject: [Python-checkins] Document differences between random.choices() and random.choice(). (GH-11703) (GH-11706) Message-ID: https://github.com/python/cpython/commit/e31f8604e692c46800abd70f88d9928fa1f17b7c commit: e31f8604e692c46800abd70f88d9928fa1f17b7c branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Raymond Hettinger date: 2019-01-30T13:49:23-08:00 summary: Document differences between random.choices() and random.choice(). (GH-11703) (GH-11706) files: M Doc/library/random.rst diff --git a/Doc/library/random.rst b/Doc/library/random.rst index 4f251574a327..a543ff016a62 100644 --- a/Doc/library/random.rst +++ b/Doc/library/random.rst @@ -162,6 +162,13 @@ Functions for sequences with the :class:`float` values returned by :func:`random` (that includes integers, floats, and fractions but excludes decimals). + For a given seed, the :func:`choices` function with equal weighting + typically produces a different sequence than repeated calls to + :func:`choice`. The algorithm used by :func:`choices` uses floating + point arithmetic for internal consistency and speed. The algorithm used + by :func:`choice` defaults to integer arithmetic with repeated selections + to avoid small biases from round-off error. + .. 
versionadded:: 3.6 From webhook-mailer at python.org Wed Jan 30 17:14:44 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Wed, 30 Jan 2019 22:14:44 -0000 Subject: [Python-checkins] bpo-35854: Fix EnvBuilder and --symlinks in venv on Windows (GH-11700) Message-ID: https://github.com/python/cpython/commit/03082a836b707528f885080bda9732d89849d4e3 commit: 03082a836b707528f885080bda9732d89849d4e3 branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-30T14:14:35-08:00 summary: bpo-35854: Fix EnvBuilder and --symlinks in venv on Windows (GH-11700) (cherry picked from commit a1f9a3332bd4767e47013ea787022f06b6dbcbbd) Co-authored-by: Steve Dower files: A Misc/NEWS.d/next/Windows/2019-01-29-15-44-46.bpo-35854.Ww3z19.rst M Doc/library/venv.rst M Doc/using/venv-create.inc M Lib/test/test_venv.py M Lib/venv/__init__.py diff --git a/Doc/library/venv.rst b/Doc/library/venv.rst index 53f3d2640566..fefc522b45d1 100644 --- a/Doc/library/venv.rst +++ b/Doc/library/venv.rst @@ -114,8 +114,7 @@ creation according to their needs, the :class:`EnvBuilder` class. any existing target directory, before creating the environment. * ``symlinks`` -- a Boolean value indicating whether to attempt to symlink the - Python binary (and any necessary DLLs or other binaries, - e.g. ``pythonw.exe``), rather than copying. + Python binary rather than copying. * ``upgrade`` -- a Boolean value which, if true, will upgrade an existing environment with the running Python - for use when that Python has been @@ -181,15 +180,15 @@ creation according to their needs, the :class:`EnvBuilder` class. .. method:: setup_python(context) - Creates a copy of the Python executable in the environment on POSIX - systems. If a specific executable ``python3.x`` was used, symlinks to - ``python`` and ``python3`` will be created pointing to that executable, - unless files with those names already exist. + Creates a copy or symlink to the Python executable in the environment. + On POSIX systems, if a specific executable ``python3.x`` was used, + symlinks to ``python`` and ``python3`` will be created pointing to that + executable, unless files with those names already exist. .. method:: setup_scripts(context) Installs activation scripts appropriate to the platform into the virtual - environment. On Windows, also installs the ``python[w].exe`` scripts. + environment. .. method:: post_setup(context) @@ -199,8 +198,13 @@ creation according to their needs, the :class:`EnvBuilder` class. .. versionchanged:: 3.7.2 Windows now uses redirector scripts for ``python[w].exe`` instead of - copying the actual binaries, and so :meth:`setup_python` does nothing - unless running from a build in the source tree. + copying the actual binaries. In 3.7.2 only :meth:`setup_python` does + nothing unless running from a build in the source tree. + + .. versionchanged:: 3.7.3 + Windows copies the redirector scripts as part of :meth:`setup_python` + instead of :meth:`setup_scripts`. This was not the case in 3.7.2. + When using symlinks, the original executables will be linked. 
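A minimal sketch of driving :class:`EnvBuilder` with symlinks enabled, as the revised text describes; the target directory name ``demo-env`` is purely illustrative, and on Windows this assumes the account is allowed to create symlinks::

    import venv

    # Symlink the interpreter instead of copying it; clear=True empties an
    # existing target directory before the environment is (re)created.
    builder = venv.EnvBuilder(symlinks=True, clear=True, with_pip=False)
    builder.create("demo-env")

Running ``python -m venv --symlinks demo-env`` from the command line exercises the same ``EnvBuilder`` code path.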
In addition, :class:`EnvBuilder` provides this utility method that can be called from :meth:`setup_scripts` or :meth:`post_setup` in subclasses to diff --git a/Doc/using/venv-create.inc b/Doc/using/venv-create.inc index ba5096abd370..1ba538bec48b 100644 --- a/Doc/using/venv-create.inc +++ b/Doc/using/venv-create.inc @@ -70,6 +70,11 @@ The command, if run with ``-h``, will show the available options:: In earlier versions, if the target directory already existed, an error was raised, unless the ``--clear`` or ``--upgrade`` option was provided. +.. note:: + While symlinks are supported on Windows, they are not recommended. Of + particular note is that double-clicking ``python.exe`` in File Explorer + will resolve the symlink eagerly and ignore the virtual environment. + The created ``pyvenv.cfg`` file also includes the ``include-system-site-packages`` key, set to ``true`` if ``venv`` is run with the ``--system-site-packages`` option, ``false`` otherwise. diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py index 34c2234493bc..6096b9df45bf 100644 --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -243,7 +243,6 @@ def test_isolation(self): self.assertIn('include-system-site-packages = %s\n' % s, data) @unittest.skipUnless(can_symlink(), 'Needs symlinks') - @unittest.skipIf(os.name == 'nt', 'Symlinks are never used on Windows') def test_symlinking(self): """ Test symlinking works as expected diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py index 5438b0d4e508..8f9e3138474a 100644 --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -64,11 +64,10 @@ def create(self, env_dir): self.system_site_packages = False self.create_configuration(context) self.setup_python(context) - if not self.upgrade: - self.setup_scripts(context) if self.with_pip: self._setup_pip(context) if not self.upgrade: + self.setup_scripts(context) self.post_setup(context) if true_system_site_packages: # We had set it to False before, now @@ -176,6 +175,23 @@ def symlink_or_copy(self, src, dst, relative_symlinks_ok=False): logger.warning('Unable to symlink %r to %r', src, dst) force_copy = True if force_copy: + if os.name == 'nt': + # On Windows, we rewrite symlinks to our base python.exe into + # copies of venvlauncher.exe + basename, ext = os.path.splitext(os.path.basename(src)) + if basename.endswith('_d'): + ext = '_d' + ext + basename = basename[:-2] + if sysconfig.is_python_build(True): + if basename == 'python': + basename = 'venvlauncher' + elif basename == 'pythonw': + basename = 'venvwlauncher' + scripts = os.path.dirname(src) + else: + scripts = os.path.join(os.path.dirname(__file__), "scripts", "nt") + src = os.path.join(scripts, basename + ext) + shutil.copyfile(src, dst) def setup_python(self, context): @@ -202,23 +218,31 @@ def setup_python(self, context): if not os.path.islink(path): os.chmod(path, 0o755) else: - # For normal cases, the venvlauncher will be copied from - # our scripts folder. For builds, we need to copy it - # manually. - if sysconfig.is_python_build(True): - suffix = '.exe' - if context.python_exe.lower().endswith('_d.exe'): - suffix = '_d.exe' - - src = os.path.join(dirname, "venvlauncher" + suffix) - dst = os.path.join(binpath, context.python_exe) - copier(src, dst) + if self.symlinks: + # For symlinking, we need a complete copy of the root directory + # If symlinks fail, you'll get unnecessary copies of files, but + # we assume that if you've opted into symlinks on Windows then + # you know what you're doing. 
+ suffixes = [ + f for f in os.listdir(dirname) if + os.path.normcase(os.path.splitext(f)[1]) in ('.exe', '.dll') + ] + if sysconfig.is_python_build(True): + suffixes = [ + f for f in suffixes if + os.path.normcase(f).startswith(('python', 'vcruntime')) + ] + else: + suffixes = ['python.exe', 'python_d.exe', 'pythonw.exe', + 'pythonw_d.exe'] - src = os.path.join(dirname, "venvwlauncher" + suffix) - dst = os.path.join(binpath, "pythonw" + suffix) - copier(src, dst) + for suffix in suffixes: + src = os.path.join(dirname, suffix) + if os.path.exists(src): + copier(src, os.path.join(binpath, suffix)) - # copy init.tcl over + if sysconfig.is_python_build(True): + # copy init.tcl for root, dirs, files in os.walk(context.python_dir): if 'init.tcl' in files: tcldir = os.path.basename(root) @@ -304,6 +328,9 @@ def install_scripts(self, context, path): dirs.remove(d) continue # ignore files in top level for f in files: + if (os.name == 'nt' and f.startswith('python') + and f.endswith(('.exe', '.pdb'))): + continue srcfile = os.path.join(root, f) suffix = root[plen:].split(os.sep)[2:] if not suffix: diff --git a/Misc/NEWS.d/next/Windows/2019-01-29-15-44-46.bpo-35854.Ww3z19.rst b/Misc/NEWS.d/next/Windows/2019-01-29-15-44-46.bpo-35854.Ww3z19.rst new file mode 100644 index 000000000000..a1c761458d35 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2019-01-29-15-44-46.bpo-35854.Ww3z19.rst @@ -0,0 +1 @@ +Fix EnvBuilder and --symlinks in venv on Windows From webhook-mailer at python.org Thu Jan 31 03:47:56 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Thu, 31 Jan 2019 08:47:56 -0000 Subject: [Python-checkins] bpo-34003: Use dict instead of OrderedDict in csv.DictReader (GH-8014) Message-ID: https://github.com/python/cpython/commit/9f3f0931cfc58498086d287226650599a97412bb commit: 9f3f0931cfc58498086d287226650599a97412bb branch: master author: Michael Selik committer: Raymond Hettinger date: 2019-01-31T00:47:53-08:00 summary: bpo-34003: Use dict instead of OrderedDict in csv.DictReader (GH-8014) files: A Misc/NEWS.d/next/Library/2018-06-29-13-05-01.bpo-34003.Iu831h.rst M Doc/library/csv.rst M Lib/csv.py diff --git a/Doc/library/csv.rst b/Doc/library/csv.rst index 049537eff898..17534fcc4615 100644 --- a/Doc/library/csv.rst +++ b/Doc/library/csv.rst @@ -150,12 +150,12 @@ The :mod:`csv` module defines the following classes: dialect='excel', *args, **kwds) Create an object that operates like a regular reader but maps the - information in each row to an :mod:`OrderedDict ` - whose keys are given by the optional *fieldnames* parameter. + information in each row to a :class:`dict` whose keys are given by the + optional *fieldnames* parameter. The *fieldnames* parameter is a :term:`sequence`. If *fieldnames* is omitted, the values in the first row of file *f* will be used as the - fieldnames. Regardless of how the fieldnames are determined, the ordered + fieldnames. Regardless of how the fieldnames are determined, the dictionary preserves their original ordering. If a row has more fields than fieldnames, the remaining data is put in a @@ -166,8 +166,8 @@ The :mod:`csv` module defines the following classes: All other optional or keyword arguments are passed to the underlying :class:`reader` instance. - .. versionchanged:: 3.6 - Returned rows are now of type :class:`OrderedDict`. + .. versionchanged:: 3.8 + Returned rows are now of type :class:`dict`. 
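For code that still relied on :class:`OrderedDict`-specific methods on the returned rows, a small sketch of the suggested remediation (``names.csv`` and its columns are only illustrative)::

    import csv
    from collections import OrderedDict

    with open('names.csv', newline='') as csvfile:
        for row in csv.DictReader(csvfile):
            # Each row is now a plain dict; wrap it only if the extra
            # OrderedDict features (e.g. move_to_end) are actually needed.
            ordered_row = OrderedDict(row)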
A short usage example:: @@ -181,7 +181,7 @@ The :mod:`csv` module defines the following classes: John Cleese >>> print(row) - OrderedDict([('first_name', 'John'), ('last_name', 'Cleese')]) + {'first_name': 'John', 'last_name': 'Cleese'} .. class:: DictWriter(f, fieldnames, restval='', extrasaction='raise', \ diff --git a/Lib/csv.py b/Lib/csv.py index 58624af90534..eeeedabc6bb8 100644 --- a/Lib/csv.py +++ b/Lib/csv.py @@ -11,7 +11,6 @@ __doc__ from _csv import Dialect as _Dialect -from collections import OrderedDict from io import StringIO __all__ = ["QUOTE_MINIMAL", "QUOTE_ALL", "QUOTE_NONNUMERIC", "QUOTE_NONE", @@ -117,7 +116,7 @@ def __next__(self): # values while row == []: row = next(self.reader) - d = OrderedDict(zip(self.fieldnames, row)) + d = dict(zip(self.fieldnames, row)) lf = len(self.fieldnames) lr = len(row) if lf < lr: diff --git a/Misc/NEWS.d/next/Library/2018-06-29-13-05-01.bpo-34003.Iu831h.rst b/Misc/NEWS.d/next/Library/2018-06-29-13-05-01.bpo-34003.Iu831h.rst new file mode 100644 index 000000000000..7bc5e1200ae8 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-06-29-13-05-01.bpo-34003.Iu831h.rst @@ -0,0 +1,2 @@ +csv.DictReader now creates dicts instead of OrderedDicts. Patch by Michael +Selik. From webhook-mailer at python.org Thu Jan 31 03:53:53 2019 From: webhook-mailer at python.org (Inada Naoki) Date: Thu, 31 Jan 2019 08:53:53 -0000 Subject: [Python-checkins] bpo-33504: fix wrong "versionchanged" (GH-11712) Message-ID: https://github.com/python/cpython/commit/0897e0c597c065f043e4286d01f16f473ab664ee commit: 0897e0c597c065f043e4286d01f16f473ab664ee branch: master author: Inada Naoki committer: GitHub date: 2019-01-31T17:53:48+09:00 summary: bpo-33504: fix wrong "versionchanged" (GH-11712) files: M Doc/library/configparser.rst diff --git a/Doc/library/configparser.rst b/Doc/library/configparser.rst index efafd6343ce1..185b4a10ec99 100644 --- a/Doc/library/configparser.rst +++ b/Doc/library/configparser.rst @@ -476,9 +476,9 @@ the :meth:`__init__` options: ... 'bar': 'y', ... 'baz': 'z'} ... }) - >>> parser.sections() # doctest: +SKIP + >>> parser.sections() ['section1', 'section2', 'section3'] - >>> [option for option in parser['section3']] # doctest: +SKIP + >>> [option for option in parser['section3']] ['foo', 'bar', 'baz'] * *allow_no_value*, default value: ``False`` @@ -921,7 +921,7 @@ ConfigParser Objects providing consistent behavior across the parser: non-string keys and values are implicitly converted to strings. - .. versionchanged:: 3.7 + .. versionchanged:: 3.8 The default *dict_type* is :class:`dict`, since it now preserves insertion order. @@ -1199,7 +1199,7 @@ RawConfigParser Objects names, and values via its unsafe ``add_section`` and ``set`` methods, as well as the legacy ``defaults=`` keyword argument handling. - .. versionchanged:: 3.7 + .. versionchanged:: 3.8 The default *dict_type* is :class:`dict`, since it now preserves insertion order. 
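A minimal sketch of the behaviour the corrected directive describes, assuming CPython 3.7 or later where both the parser's backing mapping and plain dict literals keep insertion order (the section and option names are illustrative):

    import configparser
    import io

    parser = configparser.ConfigParser()
    parser.read_dict({'server': {'host': 'localhost', 'port': '8080'},
                      'client': {'retries': '3'}})

    assert parser.sections() == ['server', 'client']   # insertion order is preserved

    buf = io.StringIO()
    parser.write(buf)   # sections and options round-trip in the same order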
From webhook-mailer at python.org Thu Jan 31 03:54:58 2019 From: webhook-mailer at python.org (Inada Naoki) Date: Thu, 31 Jan 2019 08:54:58 -0000 Subject: [Python-checkins] bpo-35865: doc: Remove wrong note and directives (GH-11711) Message-ID: https://github.com/python/cpython/commit/59014449721966a7890f6323616431ad869ec7cc commit: 59014449721966a7890f6323616431ad869ec7cc branch: 3.7 author: Inada Naoki committer: GitHub date: 2019-01-31T17:54:55+09:00 summary: bpo-35865: doc: Remove wrong note and directives (GH-11711) * note about random dict order * Remove wrong versionchanged directive files: M Doc/library/configparser.rst diff --git a/Doc/library/configparser.rst b/Doc/library/configparser.rst index 321770242b00..95cc352010e0 100644 --- a/Doc/library/configparser.rst +++ b/Doc/library/configparser.rst @@ -462,7 +462,8 @@ the :meth:`__init__` options: Please note: there are ways to add a set of key-value pairs in a single operation. When you use a regular dictionary in those operations, the order - of the keys may be random. For example: + of the keys will be ordered because dict preserves order from Python 3.7. + For example: .. doctest:: @@ -477,41 +478,10 @@ the :meth:`__init__` options: ... 'bar': 'y', ... 'baz': 'z'} ... }) - >>> parser.sections() # doctest: +SKIP - ['section3', 'section2', 'section1'] - >>> [option for option in parser['section3']] # doctest: +SKIP - ['baz', 'foo', 'bar'] - - In these operations you need to use an ordered dictionary as well: - - .. doctest:: - - >>> from collections import OrderedDict - >>> parser = configparser.ConfigParser() - >>> parser.read_dict( - ... OrderedDict(( - ... ('s1', - ... OrderedDict(( - ... ('1', '2'), - ... ('3', '4'), - ... ('5', '6'), - ... )) - ... ), - ... ('s2', - ... OrderedDict(( - ... ('a', 'b'), - ... ('c', 'd'), - ... ('e', 'f'), - ... )) - ... ), - ... )) - ... ) - >>> parser.sections() # doctest: +SKIP - ['s1', 's2'] - >>> [option for option in parser['s1']] # doctest: +SKIP - ['1', '3', '5'] - >>> [option for option in parser['s2'].values()] # doctest: +SKIP - ['b', 'd', 'f'] + >>> parser.sections() + ['section1', 'section2', 'section3'] + >>> [option for option in parser['section3']] + ['foo', 'bar', 'baz'] * *allow_no_value*, default value: ``False`` @@ -891,7 +861,7 @@ interpolation if an option used is not defined elsewhere. :: ConfigParser Objects -------------------- -.. class:: ConfigParser(defaults=None, dict_type=dict, allow_no_value=False, delimiters=('=', ':'), comment_prefixes=('#', ';'), inline_comment_prefixes=None, strict=True, empty_lines_in_values=True, default_section=configparser.DEFAULTSECT, interpolation=BasicInterpolation(), converters={}) +.. class:: ConfigParser(defaults=None, dict_type=collections.OrderedDict, allow_no_value=False, delimiters=('=', ':'), comment_prefixes=('#', ';'), inline_comment_prefixes=None, strict=True, empty_lines_in_values=True, default_section=configparser.DEFAULTSECT, interpolation=BasicInterpolation(), converters={}) The main configuration parser. When *defaults* is given, it is initialized into the dictionary of intrinsic defaults. When *dict_type* is given, it @@ -953,10 +923,6 @@ ConfigParser Objects providing consistent behavior across the parser: non-string keys and values are implicitly converted to strings. - .. versionchanged:: 3.7 - The default *dict_type* is :class:`dict`, since it now preserves - insertion order. - .. method:: defaults() Return a dictionary containing the instance-wide defaults. 
@@ -1213,7 +1179,7 @@ ConfigParser Objects RawConfigParser Objects ----------------------- -.. class:: RawConfigParser(defaults=None, dict_type=dict, \ +.. class:: RawConfigParser(defaults=None, dict_type=collections.OrderedDict, \ allow_no_value=False, *, delimiters=('=', ':'), \ comment_prefixes=('#', ';'), \ inline_comment_prefixes=None, strict=True, \ @@ -1226,10 +1192,6 @@ RawConfigParser Objects names, and values via its unsafe ``add_section`` and ``set`` methods, as well as the legacy ``defaults=`` keyword argument handling. - .. versionchanged:: 3.7 - The default *dict_type* is :class:`dict`, since it now preserves - insertion order. - .. note:: Consider using :class:`ConfigParser` instead which checks types of the values to be stored internally. If you don't want interpolation, you From webhook-mailer at python.org Thu Jan 31 03:59:56 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Thu, 31 Jan 2019 08:59:56 -0000 Subject: [Python-checkins] bpo-35864: Replace OrderedDict with regular dict in namedtuple() (#11708) Message-ID: https://github.com/python/cpython/commit/0bb4bdf0d93b301407774c4ffd6df54cff947df8 commit: 0bb4bdf0d93b301407774c4ffd6df54cff947df8 branch: master author: Raymond Hettinger committer: GitHub date: 2019-01-31T00:59:50-08:00 summary: bpo-35864: Replace OrderedDict with regular dict in namedtuple() (#11708) * Change from OrderedDict to a regular dict * Add blurb files: A Misc/NEWS.d/next/Library/2019-01-30-20-22-36.bpo-35864.ig9KnG.rst M Doc/library/collections.rst M Doc/whatsnew/3.8.rst M Lib/collections/__init__.py diff --git a/Doc/library/collections.rst b/Doc/library/collections.rst index bfbf8a7ecc07..3fa8b32ff006 100644 --- a/Doc/library/collections.rst +++ b/Doc/library/collections.rst @@ -894,11 +894,18 @@ field names, the method and attribute names start with an underscore. >>> p = Point(x=11, y=22) >>> p._asdict() - OrderedDict([('x', 11), ('y', 22)]) + {'x': 11, 'y': 22} .. versionchanged:: 3.1 Returns an :class:`OrderedDict` instead of a regular :class:`dict`. + .. versionchanged:: 3.8 + Returns a regular :class:`dict` instead of an :class:`OrderedDict`. + As of Python 3.7, regular dicts are guaranteed to be ordered. If the + extra features of :class:`OrderedDict` are required, the suggested + remediation is to cast the result to the desired type: + ``OrderedDict(nt._asdict())``. + .. method:: somenamedtuple._replace(**kwargs) Return a new instance of the named tuple replacing specified fields with new diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index fb25ce2f7669..09c43b1f30a5 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -125,6 +125,14 @@ New Modules Improved Modules ================ +* The :meth:`_asdict()` method for :func:`collections.namedtuple` now returns + a :class:`dict` instead of a :class:`collections.OrderedDict`. This works because + regular dicts have guaranteed ordering in since Python 3.7. If the extra + features of :class:`OrderedDict` are required, the suggested remediation is + to cast the result to the desired type: ``OrderedDict(nt._asdict())``. + (Contributed by Raymond Hettinger in :issue:`35864`.) 
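A short sketch of the new behaviour and of the suggested remediation; ``Point`` is the usual illustrative example::

    from collections import namedtuple, OrderedDict

    Point = namedtuple('Point', ['x', 'y'])
    p = Point(11, 22)

    d = p._asdict()                  # now a regular dict: {'x': 11, 'y': 22}
    od = OrderedDict(p._asdict())    # cast back when OrderedDict features are needed
    od.move_to_end('x')              # reordering still works on the cast result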
+ + asyncio ------- diff --git a/Lib/collections/__init__.py b/Lib/collections/__init__.py index c31d7b79185d..835cc363d10a 100644 --- a/Lib/collections/__init__.py +++ b/Lib/collections/__init__.py @@ -426,9 +426,11 @@ def __repr__(self): 'Return a nicely formatted representation string' return self.__class__.__name__ + repr_fmt % self + _dict, _zip = dict, zip + def _asdict(self): 'Return a new OrderedDict which maps field names to their values.' - return OrderedDict(zip(self._fields, self)) + return _dict(_zip(self._fields, self)) def __getnewargs__(self): 'Return self as a plain tuple. Used by copy and pickle.' diff --git a/Misc/NEWS.d/next/Library/2019-01-30-20-22-36.bpo-35864.ig9KnG.rst b/Misc/NEWS.d/next/Library/2019-01-30-20-22-36.bpo-35864.ig9KnG.rst new file mode 100644 index 000000000000..e3b41b700e6b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2019-01-30-20-22-36.bpo-35864.ig9KnG.rst @@ -0,0 +1,2 @@ +The _asdict() method for collections.namedtuple now returns a regular dict +instead of an OrderedDict. From solipsis at pitrou.net Thu Jan 31 04:10:32 2019 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 31 Jan 2019 09:10:32 +0000 Subject: [Python-checkins] Daily reference leaks (4243df51fe43): sum=-3 Message-ID: <20190131091032.1.7BDC877A22FDB4B4@psf.io> results for 4243df51fe43 on branch "default" -------------------------------------------- test_collections leaked [-7, 8, -7] memory blocks, sum=-6 test_functools leaked [0, 3, 1] memory blocks, sum=4 test_multiprocessing_fork leaked [-2, 2, 0] memory blocks, sum=0 test_multiprocessing_spawn leaked [-2, 2, -1] memory blocks, sum=-1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogIzLI4U', '--timeout', '7200'] From webhook-mailer at python.org Thu Jan 31 05:09:11 2019 From: webhook-mailer at python.org (Inada Naoki) Date: Thu, 31 Jan 2019 10:09:11 -0000 Subject: [Python-checkins] doc: http: Fix enum name for status code 416 (GH-11689) Message-ID: https://github.com/python/cpython/commit/d97daebfa69b4df95231bcae4123eacad6a48d14 commit: d97daebfa69b4df95231bcae4123eacad6a48d14 branch: master author: Phil Jones committer: Inada Naoki date: 2019-01-31T19:08:57+09:00 summary: doc: http: Fix enum name for status code 416 (GH-11689) files: M Doc/library/http.rst diff --git a/Doc/library/http.rst b/Doc/library/http.rst index 7c34004a3ae0..88d62cca3f90 100644 --- a/Doc/library/http.rst +++ b/Doc/library/http.rst @@ -96,7 +96,7 @@ Code Enum Name Details ``413`` ``REQUEST_ENTITY_TOO_LARGE`` HTTP/1.1 :rfc:`7231`, Section 6.5.11 ``414`` ``REQUEST_URI_TOO_LONG`` HTTP/1.1 :rfc:`7231`, Section 6.5.12 ``415`` ``UNSUPPORTED_MEDIA_TYPE`` HTTP/1.1 :rfc:`7231`, Section 6.5.13 -``416`` ``REQUEST_RANGE_NOT_SATISFIABLE`` HTTP/1.1 Range Requests :rfc:`7233`, Section 4.4 +``416`` ``REQUESTED_RANGE_NOT_SATISFIABLE`` HTTP/1.1 Range Requests :rfc:`7233`, Section 4.4 ``417`` ``EXPECTATION_FAILED`` HTTP/1.1 :rfc:`7231`, Section 6.5.14 ``421`` ``MISDIRECTED_REQUEST`` HTTP/2 :rfc:`7540`, Section 9.1.2 ``422`` ``UNPROCESSABLE_ENTITY`` WebDAV :rfc:`4918`, Section 11.2 From webhook-mailer at python.org Thu Jan 31 05:15:34 2019 From: webhook-mailer at python.org (Miss Islington (bot)) Date: Thu, 31 Jan 2019 10:15:34 -0000 Subject: [Python-checkins] doc: http: Fix enum name for status code 416 (GH-11689) Message-ID: https://github.com/python/cpython/commit/09b66e027bdd4d7af6921fa2a2c9536f9cf5eebe commit: 09b66e027bdd4d7af6921fa2a2c9536f9cf5eebe branch: 3.7 author: Miss Islington (bot) 
<31488909+miss-islington at users.noreply.github.com> committer: GitHub date: 2019-01-31T02:15:29-08:00 summary: doc: http: Fix enum name for status code 416 (GH-11689) (cherry picked from commit d97daebfa69b4df95231bcae4123eacad6a48d14) Co-authored-by: Phil Jones files: M Doc/library/http.rst diff --git a/Doc/library/http.rst b/Doc/library/http.rst index 7c34004a3ae0..88d62cca3f90 100644 --- a/Doc/library/http.rst +++ b/Doc/library/http.rst @@ -96,7 +96,7 @@ Code Enum Name Details ``413`` ``REQUEST_ENTITY_TOO_LARGE`` HTTP/1.1 :rfc:`7231`, Section 6.5.11 ``414`` ``REQUEST_URI_TOO_LONG`` HTTP/1.1 :rfc:`7231`, Section 6.5.12 ``415`` ``UNSUPPORTED_MEDIA_TYPE`` HTTP/1.1 :rfc:`7231`, Section 6.5.13 -``416`` ``REQUEST_RANGE_NOT_SATISFIABLE`` HTTP/1.1 Range Requests :rfc:`7233`, Section 4.4 +``416`` ``REQUESTED_RANGE_NOT_SATISFIABLE`` HTTP/1.1 Range Requests :rfc:`7233`, Section 4.4 ``417`` ``EXPECTATION_FAILED`` HTTP/1.1 :rfc:`7231`, Section 6.5.14 ``421`` ``MISDIRECTED_REQUEST`` HTTP/2 :rfc:`7540`, Section 9.1.2 ``422`` ``UNPROCESSABLE_ENTITY`` WebDAV :rfc:`4918`, Section 11.2 From webhook-mailer at python.org Thu Jan 31 06:40:36 2019 From: webhook-mailer at python.org (=?utf-8?q?=C5=81ukasz?= Langa) Date: Thu, 31 Jan 2019 11:40:36 -0000 Subject: [Python-checkins] bpo-35766: Merge typed_ast back into CPython (GH-11645) Message-ID: https://github.com/python/cpython/commit/dcfcd146f8e6fc5c2fc16a4c192a0c5f5ca8c53c commit: dcfcd146f8e6fc5c2fc16a4c192a0c5f5ca8c53c branch: master author: Guido van Rossum committer: ?ukasz Langa date: 2019-01-31T12:40:27+01:00 summary: bpo-35766: Merge typed_ast back into CPython (GH-11645) files: A Lib/test/test_type_comments.py A Misc/NEWS.d/next/Core and Builtins/2019-01-22-19-17-27.bpo-35766.gh1tHZ.rst M Doc/library/ast.rst M Doc/library/token-list.inc M Doc/library/token.rst M Grammar/Grammar M Grammar/Tokens M Include/Python-ast.h M Include/compile.h M Include/graminit.h M Include/parsetok.h M Include/token.h M Lib/ast.py M Lib/symbol.py M Lib/test/test_asdl_parser.py M Lib/test/test_ast.py M Lib/token.py M Modules/parsermodule.c M Parser/Python.asdl M Parser/asdl_c.py M Parser/parser.c M Parser/parsetok.c M Parser/token.c M Parser/tokenizer.c M Parser/tokenizer.h M Python/Python-ast.c M Python/ast.c M Python/bltinmodule.c M Python/graminit.c M Python/pythonrun.c diff --git a/Doc/library/ast.rst b/Doc/library/ast.rst index 7715a28ce1b8..3df7f9ebc70c 100644 --- a/Doc/library/ast.rst +++ b/Doc/library/ast.rst @@ -126,16 +126,33 @@ The abstract grammar is currently defined as follows: Apart from the node classes, the :mod:`ast` module defines these utility functions and classes for traversing abstract syntax trees: -.. function:: parse(source, filename='', mode='exec') +.. function:: parse(source, filename='', mode='exec', *, type_comments=False) Parse the source into an AST node. Equivalent to ``compile(source, filename, mode, ast.PyCF_ONLY_AST)``. + If ``type_comments=True`` is given, the parser is modified to check + and return type comments as specified by :pep:`484` and :pep:`526`. + This is equivalent to adding :data:`ast.PyCF_TYPE_COMMENTS` to the + flags passed to :func:`compile()`. This will report syntax errors + for misplaced type comments. Without this flag, type comments will + be ignored, and the ``type_comment`` field on selected AST nodes + will always be ``None``. In addition, the locations of ``# type: + ignore`` comments will be returned as the ``type_ignores`` + attribute of :class:`Module` (otherwise it is always an empty list). 
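A small sketch of both effects described above, assuming the new ``type_comments`` flag behaves as documented here (the ``greet`` function and the module source are made up for illustration)::

    import ast

    source = (
        "def greet(name):\n"
        "    # type: (str) -> str\n"
        "    return 'hello ' + name\n"
        "\n"
        "x = []  # type: ignore\n"
    )

    tree = ast.parse(source, type_comments=True)
    print(tree.body[0].type_comment)      # "(str) -> str"
    print(tree.type_ignores[0].lineno)    # 5, the line carrying "# type: ignore"

Without ``type_comments=True``, ``type_comment`` stays ``None`` and ``type_ignores`` remains an empty list.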
+ + In addition, if ``mode`` is ``'func_type'``, the input syntax is + modified to correspond to :pep:`484` "signature type comments", + e.g. ``(str, int) -> List[str]``. + .. warning:: It is possible to crash the Python interpreter with a sufficiently large/complex string due to stack depth limitations in Python's AST compiler. + .. versionchanged:: 3.8 + Added ``type_comments=True`` and ``mode='func_type'``. + .. function:: literal_eval(node_or_string) diff --git a/Doc/library/token-list.inc b/Doc/library/token-list.inc index 3ea9439be859..cb9fcd79effc 100644 --- a/Doc/library/token-list.inc +++ b/Doc/library/token-list.inc @@ -203,6 +203,10 @@ .. data:: OP +.. data:: TYPE_IGNORE + +.. data:: TYPE_COMMENT + .. data:: ERRORTOKEN .. data:: N_TOKENS diff --git a/Doc/library/token.rst b/Doc/library/token.rst index 5358eb5a291e..4936e9aa08f4 100644 --- a/Doc/library/token.rst +++ b/Doc/library/token.rst @@ -69,6 +69,13 @@ the :mod:`tokenize` module. always be an ``ENCODING`` token. +.. data:: TYPE_COMMENT + + Token value indicating that a type comment was recognized. Such + tokens are only produced when :func:`ast.parse()` is invoked with + ``type_comments=True``. + + .. versionchanged:: 3.5 Added :data:`AWAIT` and :data:`ASYNC` tokens. @@ -78,3 +85,6 @@ the :mod:`tokenize` module. .. versionchanged:: 3.7 Removed :data:`AWAIT` and :data:`ASYNC` tokens. "async" and "await" are now tokenized as :data:`NAME` tokens. + +.. versionchanged:: 3.8 + Added :data:`TYPE_COMMENT`. diff --git a/Grammar/Grammar b/Grammar/Grammar index 8455c1259259..e65a688e4cd8 100644 --- a/Grammar/Grammar +++ b/Grammar/Grammar @@ -7,7 +7,9 @@ # single_input is a single interactive statement; # file_input is a module or sequence of commands read from an input file; # eval_input is the input for the eval() functions. +# func_type_input is a PEP 484 Python 2 function type comment # NB: compound_stmt in single_input is followed by extra NEWLINE! 
+# NB: due to the way TYPE_COMMENT is tokenized it will always be followed by a NEWLINE single_input: NEWLINE | simple_stmt | compound_stmt NEWLINE file_input: (NEWLINE | stmt)* ENDMARKER eval_input: testlist NEWLINE* ENDMARKER @@ -17,14 +19,14 @@ decorators: decorator+ decorated: decorators (classdef | funcdef | async_funcdef) async_funcdef: 'async' funcdef -funcdef: 'def' NAME parameters ['->' test] ':' suite +funcdef: 'def' NAME parameters ['->' test] ':' [TYPE_COMMENT] func_body_suite parameters: '(' [typedargslist] ')' -typedargslist: (tfpdef ['=' test] (',' tfpdef ['=' test])* [',' [ - '*' [tfpdef] (',' tfpdef ['=' test])* [',' ['**' tfpdef [',']]] - | '**' tfpdef [',']]] - | '*' [tfpdef] (',' tfpdef ['=' test])* [',' ['**' tfpdef [',']]] - | '**' tfpdef [',']) +typedargslist: (tfpdef ['=' test] (',' [TYPE_COMMENT] tfpdef ['=' test])* (TYPE_COMMENT | [',' [TYPE_COMMENT] [ + '*' [tfpdef] (',' [TYPE_COMMENT] tfpdef ['=' test])* (TYPE_COMMENT | [',' [TYPE_COMMENT] ['**' tfpdef [','] [TYPE_COMMENT]]]) + | '**' tfpdef [','] [TYPE_COMMENT]]]) + | '*' [tfpdef] (',' [TYPE_COMMENT] tfpdef ['=' test])* (TYPE_COMMENT | [',' [TYPE_COMMENT] ['**' tfpdef [','] [TYPE_COMMENT]]]) + | '**' tfpdef [','] [TYPE_COMMENT]) tfpdef: NAME [':' test] varargslist: (vfpdef ['=' test] (',' vfpdef ['=' test])* [',' [ '*' [vfpdef] (',' vfpdef ['=' test])* [',' ['**' vfpdef [',']]] @@ -39,7 +41,7 @@ simple_stmt: small_stmt (';' small_stmt)* [';'] NEWLINE small_stmt: (expr_stmt | del_stmt | pass_stmt | flow_stmt | import_stmt | global_stmt | nonlocal_stmt | assert_stmt) expr_stmt: testlist_star_expr (annassign | augassign (yield_expr|testlist) | - ('=' (yield_expr|testlist_star_expr))*) + [('=' (yield_expr|testlist_star_expr))+ [TYPE_COMMENT]] ) annassign: ':' test ['=' (yield_expr|testlist)] testlist_star_expr: (test|star_expr) (',' (test|star_expr))* [','] augassign: ('+=' | '-=' | '*=' | '@=' | '/=' | '%=' | '&=' | '|=' | '^=' | @@ -71,13 +73,13 @@ compound_stmt: if_stmt | while_stmt | for_stmt | try_stmt | with_stmt | funcdef async_stmt: 'async' (funcdef | with_stmt | for_stmt) if_stmt: 'if' namedexpr_test ':' suite ('elif' namedexpr_test ':' suite)* ['else' ':' suite] while_stmt: 'while' test ':' suite ['else' ':' suite] -for_stmt: 'for' exprlist 'in' testlist ':' suite ['else' ':' suite] +for_stmt: 'for' exprlist 'in' testlist ':' [TYPE_COMMENT] suite ['else' ':' suite] try_stmt: ('try' ':' suite ((except_clause ':' suite)+ ['else' ':' suite] ['finally' ':' suite] | 'finally' ':' suite)) -with_stmt: 'with' with_item (',' with_item)* ':' suite +with_stmt: 'with' with_item (',' with_item)* ':' [TYPE_COMMENT] suite with_item: test ['as' expr] # NB compile.c makes sure that the default except clause is last except_clause: 'except' [test ['as' NAME]] @@ -150,3 +152,14 @@ encoding_decl: NAME yield_expr: 'yield' [yield_arg] yield_arg: 'from' test | testlist_star_expr + +# the TYPE_COMMENT in suites is only parsed for funcdefs, +# but can't go elsewhere due to ambiguity +func_body_suite: simple_stmt | NEWLINE [TYPE_COMMENT NEWLINE] INDENT stmt+ DEDENT + +func_type_input: func_type NEWLINE* ENDMARKER +func_type: '(' [typelist] ')' '->' test +# typelist is a modified typedargslist (see above) +typelist: (test (',' test)* [',' + ['*' [test] (',' test)* [',' '**' test] | '**' test]] + | '*' [test] (',' test)* [',' '**' test] | '**' test) diff --git a/Grammar/Tokens b/Grammar/Tokens index f6f303bd5292..1d45e05ea21d 100644 --- a/Grammar/Tokens +++ b/Grammar/Tokens @@ -55,6 +55,8 @@ ELLIPSIS '...' 
COLONEQUAL ':=' OP +TYPE_IGNORE +TYPE_COMMENT ERRORTOKEN # These aren't used by the C tokenizer but are needed for tokenize.py diff --git a/Include/Python-ast.h b/Include/Python-ast.h index 3527ae8949e7..106cecff887e 100644 --- a/Include/Python-ast.h +++ b/Include/Python-ast.h @@ -46,14 +46,17 @@ typedef struct _alias *alias_ty; typedef struct _withitem *withitem_ty; +typedef struct _type_ignore *type_ignore_ty; + enum _mod_kind {Module_kind=1, Interactive_kind=2, Expression_kind=3, - Suite_kind=4}; + FunctionType_kind=4, Suite_kind=5}; struct _mod { enum _mod_kind kind; union { struct { asdl_seq *body; + asdl_seq *type_ignores; } Module; struct { @@ -64,6 +67,11 @@ struct _mod { expr_ty body; } Expression; + struct { + asdl_seq *argtypes; + expr_ty returns; + } FunctionType; + struct { asdl_seq *body; } Suite; @@ -88,6 +96,7 @@ struct _stmt { asdl_seq *body; asdl_seq *decorator_list; expr_ty returns; + string type_comment; } FunctionDef; struct { @@ -96,6 +105,7 @@ struct _stmt { asdl_seq *body; asdl_seq *decorator_list; expr_ty returns; + string type_comment; } AsyncFunctionDef; struct { @@ -117,6 +127,7 @@ struct _stmt { struct { asdl_seq *targets; expr_ty value; + string type_comment; } Assign; struct { @@ -137,6 +148,7 @@ struct _stmt { expr_ty iter; asdl_seq *body; asdl_seq *orelse; + string type_comment; } For; struct { @@ -144,6 +156,7 @@ struct _stmt { expr_ty iter; asdl_seq *body; asdl_seq *orelse; + string type_comment; } AsyncFor; struct { @@ -161,11 +174,13 @@ struct _stmt { struct { asdl_seq *items; asdl_seq *body; + string type_comment; } With; struct { asdl_seq *items; asdl_seq *body; + string type_comment; } AsyncWith; struct { @@ -421,6 +436,7 @@ struct _arguments { struct _arg { identifier arg; expr_ty annotation; + string type_comment; int lineno; int col_offset; int end_lineno; @@ -442,26 +458,40 @@ struct _withitem { expr_ty optional_vars; }; +enum _type_ignore_kind {TypeIgnore_kind=1}; +struct _type_ignore { + enum _type_ignore_kind kind; + union { + struct { + int lineno; + } TypeIgnore; + + } v; +}; + // Note: these macros affect function definitions, not only call sites. 
-#define Module(a0, a1) _Py_Module(a0, a1) -mod_ty _Py_Module(asdl_seq * body, PyArena *arena); +#define Module(a0, a1, a2) _Py_Module(a0, a1, a2) +mod_ty _Py_Module(asdl_seq * body, asdl_seq * type_ignores, PyArena *arena); #define Interactive(a0, a1) _Py_Interactive(a0, a1) mod_ty _Py_Interactive(asdl_seq * body, PyArena *arena); #define Expression(a0, a1) _Py_Expression(a0, a1) mod_ty _Py_Expression(expr_ty body, PyArena *arena); +#define FunctionType(a0, a1, a2) _Py_FunctionType(a0, a1, a2) +mod_ty _Py_FunctionType(asdl_seq * argtypes, expr_ty returns, PyArena *arena); #define Suite(a0, a1) _Py_Suite(a0, a1) mod_ty _Py_Suite(asdl_seq * body, PyArena *arena); -#define FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) _Py_FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) +#define FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9, a10) _Py_FunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9, a10) stmt_ty _Py_FunctionDef(identifier name, arguments_ty args, asdl_seq * body, - asdl_seq * decorator_list, expr_ty returns, int lineno, - int col_offset, int end_lineno, int end_col_offset, - PyArena *arena); -#define AsyncFunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) _Py_AsyncFunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) + asdl_seq * decorator_list, expr_ty returns, string + type_comment, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define AsyncFunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9, a10) _Py_AsyncFunctionDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9, a10) stmt_ty _Py_AsyncFunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq * decorator_list, expr_ty returns, - int lineno, int col_offset, int end_lineno, int - end_col_offset, PyArena *arena); + string type_comment, int lineno, int col_offset, + int end_lineno, int end_col_offset, PyArena + *arena); #define ClassDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) _Py_ClassDef(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) stmt_ty _Py_ClassDef(identifier name, asdl_seq * bases, asdl_seq * keywords, asdl_seq * body, asdl_seq * decorator_list, int lineno, @@ -473,10 +503,10 @@ stmt_ty _Py_Return(expr_ty value, int lineno, int col_offset, int end_lineno, #define Delete(a0, a1, a2, a3, a4, a5) _Py_Delete(a0, a1, a2, a3, a4, a5) stmt_ty _Py_Delete(asdl_seq * targets, int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena *arena); -#define Assign(a0, a1, a2, a3, a4, a5, a6) _Py_Assign(a0, a1, a2, a3, a4, a5, a6) -stmt_ty _Py_Assign(asdl_seq * targets, expr_ty value, int lineno, int - col_offset, int end_lineno, int end_col_offset, PyArena - *arena); +#define Assign(a0, a1, a2, a3, a4, a5, a6, a7) _Py_Assign(a0, a1, a2, a3, a4, a5, a6, a7) +stmt_ty _Py_Assign(asdl_seq * targets, expr_ty value, string type_comment, int + lineno, int col_offset, int end_lineno, int end_col_offset, + PyArena *arena); #define AugAssign(a0, a1, a2, a3, a4, a5, a6, a7) _Py_AugAssign(a0, a1, a2, a3, a4, a5, a6, a7) stmt_ty _Py_AugAssign(expr_ty target, operator_ty op, expr_ty value, int lineno, int col_offset, int end_lineno, int @@ -485,14 +515,14 @@ stmt_ty _Py_AugAssign(expr_ty target, operator_ty op, expr_ty value, int stmt_ty _Py_AnnAssign(expr_ty target, expr_ty annotation, expr_ty value, int simple, int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena *arena); -#define For(a0, a1, a2, a3, a4, a5, a6, a7, a8) _Py_For(a0, a1, a2, a3, a4, a5, a6, a7, a8) +#define For(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) _Py_For(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) stmt_ty 
_Py_For(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * - orelse, int lineno, int col_offset, int end_lineno, int - end_col_offset, PyArena *arena); -#define AsyncFor(a0, a1, a2, a3, a4, a5, a6, a7, a8) _Py_AsyncFor(a0, a1, a2, a3, a4, a5, a6, a7, a8) + orelse, string type_comment, int lineno, int col_offset, int + end_lineno, int end_col_offset, PyArena *arena); +#define AsyncFor(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) _Py_AsyncFor(a0, a1, a2, a3, a4, a5, a6, a7, a8, a9) stmt_ty _Py_AsyncFor(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * - orelse, int lineno, int col_offset, int end_lineno, int - end_col_offset, PyArena *arena); + orelse, string type_comment, int lineno, int col_offset, + int end_lineno, int end_col_offset, PyArena *arena); #define While(a0, a1, a2, a3, a4, a5, a6, a7) _Py_While(a0, a1, a2, a3, a4, a5, a6, a7) stmt_ty _Py_While(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena @@ -501,13 +531,14 @@ stmt_ty _Py_While(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, stmt_ty _Py_If(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena *arena); -#define With(a0, a1, a2, a3, a4, a5, a6) _Py_With(a0, a1, a2, a3, a4, a5, a6) -stmt_ty _Py_With(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, - int end_lineno, int end_col_offset, PyArena *arena); -#define AsyncWith(a0, a1, a2, a3, a4, a5, a6) _Py_AsyncWith(a0, a1, a2, a3, a4, a5, a6) -stmt_ty _Py_AsyncWith(asdl_seq * items, asdl_seq * body, int lineno, int - col_offset, int end_lineno, int end_col_offset, PyArena - *arena); +#define With(a0, a1, a2, a3, a4, a5, a6, a7) _Py_With(a0, a1, a2, a3, a4, a5, a6, a7) +stmt_ty _Py_With(asdl_seq * items, asdl_seq * body, string type_comment, int + lineno, int col_offset, int end_lineno, int end_col_offset, + PyArena *arena); +#define AsyncWith(a0, a1, a2, a3, a4, a5, a6, a7) _Py_AsyncWith(a0, a1, a2, a3, a4, a5, a6, a7) +stmt_ty _Py_AsyncWith(asdl_seq * items, asdl_seq * body, string type_comment, + int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena); #define Raise(a0, a1, a2, a3, a4, a5, a6) _Py_Raise(a0, a1, a2, a3, a4, a5, a6) stmt_ty _Py_Raise(expr_ty exc, expr_ty cause, int lineno, int col_offset, int end_lineno, int end_col_offset, PyArena *arena); @@ -656,9 +687,10 @@ excepthandler_ty _Py_ExceptHandler(expr_ty type, identifier name, asdl_seq * arguments_ty _Py_arguments(asdl_seq * args, arg_ty vararg, asdl_seq * kwonlyargs, asdl_seq * kw_defaults, arg_ty kwarg, asdl_seq * defaults, PyArena *arena); -#define arg(a0, a1, a2, a3, a4, a5, a6) _Py_arg(a0, a1, a2, a3, a4, a5, a6) -arg_ty _Py_arg(identifier arg, expr_ty annotation, int lineno, int col_offset, - int end_lineno, int end_col_offset, PyArena *arena); +#define arg(a0, a1, a2, a3, a4, a5, a6, a7) _Py_arg(a0, a1, a2, a3, a4, a5, a6, a7) +arg_ty _Py_arg(identifier arg, expr_ty annotation, string type_comment, int + lineno, int col_offset, int end_lineno, int end_col_offset, + PyArena *arena); #define keyword(a0, a1, a2) _Py_keyword(a0, a1, a2) keyword_ty _Py_keyword(identifier arg, expr_ty value, PyArena *arena); #define alias(a0, a1, a2) _Py_alias(a0, a1, a2) @@ -666,6 +698,8 @@ alias_ty _Py_alias(identifier name, identifier asname, PyArena *arena); #define withitem(a0, a1, a2) _Py_withitem(a0, a1, a2) withitem_ty _Py_withitem(expr_ty context_expr, expr_ty optional_vars, PyArena *arena); +#define TypeIgnore(a0, a1) 
_Py_TypeIgnore(a0, a1) +type_ignore_ty _Py_TypeIgnore(int lineno, PyArena *arena); PyObject* PyAST_mod2obj(mod_ty t); mod_ty PyAST_obj2mod(PyObject* ast, PyArena* arena, int mode); diff --git a/Include/compile.h b/Include/compile.h index 2dacfff37f8c..d0bbed8f558b 100644 --- a/Include/compile.h +++ b/Include/compile.h @@ -22,6 +22,7 @@ PyAPI_FUNC(PyCodeObject *) PyNode_Compile(struct _node *, const char *); #define PyCF_DONT_IMPLY_DEDENT 0x0200 #define PyCF_ONLY_AST 0x0400 #define PyCF_IGNORE_COOKIE 0x0800 +#define PyCF_TYPE_COMMENTS 0x1000 #ifndef Py_LIMITED_API typedef struct { @@ -85,10 +86,10 @@ PyAPI_FUNC(int) _PyAST_Optimize(struct _mod *, PyArena *arena, int optimize); #endif /* !Py_LIMITED_API */ -/* These definitions must match corresponding definitions in graminit.h. - There's code in compile.c that checks that they are the same. */ +/* These definitions must match corresponding definitions in graminit.h. */ #define Py_single_input 256 #define Py_file_input 257 #define Py_eval_input 258 +#define Py_func_type_input 345 #endif /* !Py_COMPILE_H */ diff --git a/Include/graminit.h b/Include/graminit.h index e3acff8a1e83..d1027b7a743f 100644 --- a/Include/graminit.h +++ b/Include/graminit.h @@ -88,3 +88,7 @@ #define encoding_decl 341 #define yield_expr 342 #define yield_arg 343 +#define func_body_suite 344 +#define func_type_input 345 +#define func_type 346 +#define typelist 347 diff --git a/Include/parsetok.h b/Include/parsetok.h index 1217c46d0ed7..e95dd31fb79a 100644 --- a/Include/parsetok.h +++ b/Include/parsetok.h @@ -37,6 +37,7 @@ typedef struct { #define PyPARSE_IGNORE_COOKIE 0x0010 #define PyPARSE_BARRY_AS_BDFL 0x0020 +#define PyPARSE_TYPE_COMMENTS 0x0040 PyAPI_FUNC(node *) PyParser_ParseString(const char *, grammar *, int, perrdetail *); diff --git a/Include/token.h b/Include/token.h index b87b84cd966d..ef6bf969b7c9 100644 --- a/Include/token.h +++ b/Include/token.h @@ -65,8 +65,10 @@ extern "C" { #define ELLIPSIS 52 #define COLONEQUAL 53 #define OP 54 -#define ERRORTOKEN 55 -#define N_TOKENS 59 +#define TYPE_IGNORE 55 +#define TYPE_COMMENT 56 +#define ERRORTOKEN 57 +#define N_TOKENS 61 #define NT_OFFSET 256 /* Special definitions for cooperation with parser */ diff --git a/Lib/ast.py b/Lib/ast.py index 6c1e978b0586..470a74b3b5ff 100644 --- a/Lib/ast.py +++ b/Lib/ast.py @@ -27,12 +27,16 @@ from _ast import * -def parse(source, filename='', mode='exec'): +def parse(source, filename='', mode='exec', *, type_comments=False): """ Parse the source into an AST node. Equivalent to compile(source, filename, mode, PyCF_ONLY_AST). + Pass type_comments=True to get back type comments where the syntax allows. 
""" - return compile(source, filename, mode, PyCF_ONLY_AST) + flags = PyCF_ONLY_AST + if type_comments: + flags |= PyCF_TYPE_COMMENTS + return compile(source, filename, mode, flags) def literal_eval(node_or_string): diff --git a/Lib/symbol.py b/Lib/symbol.py index b3fa08984d87..36e0eec7ac1f 100644 --- a/Lib/symbol.py +++ b/Lib/symbol.py @@ -100,6 +100,10 @@ encoding_decl = 341 yield_expr = 342 yield_arg = 343 +func_body_suite = 344 +func_type_input = 345 +func_type = 346 +typelist = 347 #--end constants-- sym_name = {} diff --git a/Lib/test/test_asdl_parser.py b/Lib/test/test_asdl_parser.py index 30e6466dbb32..9eaceecd50db 100644 --- a/Lib/test/test_asdl_parser.py +++ b/Lib/test/test_asdl_parser.py @@ -117,7 +117,8 @@ def visitConstructor(self, cons): v = CustomVisitor() v.visit(self.types['mod']) - self.assertEqual(v.names_with_seq, ['Module', 'Interactive', 'Suite']) + self.assertEqual(v.names_with_seq, + ['Module', 'Module', 'Interactive', 'FunctionType', 'Suite']) if __name__ == '__main__': diff --git a/Lib/test/test_ast.py b/Lib/test/test_ast.py index 09e425de2c30..609c8b2c6a7f 100644 --- a/Lib/test/test_ast.py +++ b/Lib/test/test_ast.py @@ -455,7 +455,7 @@ class N2(ast.Num): def test_module(self): body = [ast.Num(42)] - x = ast.Module(body) + x = ast.Module(body, []) self.assertEqual(x.body, body) def test_nodeclasses(self): @@ -524,13 +524,13 @@ def test_pickling(self): def test_invalid_sum(self): pos = dict(lineno=2, col_offset=3) - m = ast.Module([ast.Expr(ast.expr(**pos), **pos)]) + m = ast.Module([ast.Expr(ast.expr(**pos), **pos)], []) with self.assertRaises(TypeError) as cm: compile(m, "", "exec") self.assertIn("but got <_ast.expr", str(cm.exception)) def test_invalid_identitifer(self): - m = ast.Module([ast.Expr(ast.Name(42, ast.Load()))]) + m = ast.Module([ast.Expr(ast.Name(42, ast.Load()))], []) ast.fix_missing_locations(m) with self.assertRaises(TypeError) as cm: compile(m, "", "exec") @@ -575,11 +575,11 @@ def test_dump(self): self.assertEqual(ast.dump(node), "Module(body=[Expr(value=Call(func=Name(id='spam', ctx=Load()), " "args=[Name(id='eggs', ctx=Load()), Constant(value='and cheese')], " - "keywords=[]))])" + "keywords=[]))], type_ignores=[])" ) self.assertEqual(ast.dump(node, annotate_fields=False), "Module([Expr(Call(Name('spam', Load()), [Name('eggs', Load()), " - "Constant('and cheese')], []))])" + "Constant('and cheese')], []))], [])" ) self.assertEqual(ast.dump(node, include_attributes=True), "Module(body=[Expr(value=Call(func=Name(id='spam', ctx=Load(), " @@ -588,7 +588,7 @@ def test_dump(self): "end_lineno=1, end_col_offset=9), Constant(value='and cheese', " "lineno=1, col_offset=11, end_lineno=1, end_col_offset=23)], keywords=[], " "lineno=1, col_offset=0, end_lineno=1, end_col_offset=24), " - "lineno=1, col_offset=0, end_lineno=1, end_col_offset=24)])" + "lineno=1, col_offset=0, end_lineno=1, end_col_offset=24)], type_ignores=[])" ) def test_copy_location(self): @@ -617,7 +617,8 @@ def test_fix_missing_locations(self): "lineno=1, col_offset=0, end_lineno=1, end_col_offset=0), " "args=[Constant(value='eggs', lineno=1, col_offset=0, end_lineno=1, " "end_col_offset=0)], keywords=[], lineno=1, col_offset=0, end_lineno=1, " - "end_col_offset=0), lineno=1, col_offset=0, end_lineno=1, end_col_offset=0)])" + "end_col_offset=0), lineno=1, col_offset=0, end_lineno=1, end_col_offset=0)], " + "type_ignores=[])" ) def test_increment_lineno(self): @@ -760,7 +761,7 @@ def test_bad_integer(self): names=[ast.alias(name='sleep')], level=None, lineno=None, col_offset=None)] - 
mod = ast.Module(body) + mod = ast.Module(body, []) with self.assertRaises(ValueError) as cm: compile(mod, 'test', 'exec') self.assertIn("invalid integer value: None", str(cm.exception)) @@ -770,7 +771,7 @@ def test_level_as_none(self): names=[ast.alias(name='sleep')], level=None, lineno=0, col_offset=0)] - mod = ast.Module(body) + mod = ast.Module(body, []) code = compile(mod, 'test', 'exec') ns = {} exec(code, ns) @@ -790,11 +791,11 @@ def mod(self, mod, msg=None, mode="exec", *, exc=ValueError): self.assertIn(msg, str(cm.exception)) def expr(self, node, msg=None, *, exc=ValueError): - mod = ast.Module([ast.Expr(node)]) + mod = ast.Module([ast.Expr(node)], []) self.mod(mod, msg, exc=exc) def stmt(self, stmt, msg=None): - mod = ast.Module([stmt]) + mod = ast.Module([stmt], []) self.mod(mod, msg) def test_module(self): @@ -1603,61 +1604,61 @@ def main(): raise SystemExit unittest.main() -#### EVERYTHING BELOW IS GENERATED ##### +#### EVERYTHING BELOW IS GENERATED BY python Lib/test/test_ast.py -g ##### exec_results = [ -('Module', [('Expr', (1, 0), ('Constant', (1, 0), None))]), -('Module', [('Expr', (1, 0), ('Constant', (1, 0), 'module docstring'))]), -('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Pass', (1, 9))], [], None)]), -('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Expr', (1, 9), ('Constant', (1, 9), 'function docstring'))], [], None)]), -('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [('arg', (1, 6), 'a', None)], None, [], [], None, []), [('Pass', (1, 10))], [], None)]), -('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [('arg', (1, 6), 'a', None)], None, [], [], None, [('Constant', (1, 8), 0)]), [('Pass', (1, 12))], [], None)]), -('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], ('arg', (1, 7), 'args', None), [], [], None, []), [('Pass', (1, 14))], [], None)]), -('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], ('arg', (1, 8), 'kwargs', None), []), [('Pass', (1, 17))], [], None)]), -('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [('arg', (1, 6), 'a', None), ('arg', (1, 9), 'b', None), ('arg', (1, 14), 'c', None), ('arg', (1, 22), 'd', None), ('arg', (1, 28), 'e', None)], ('arg', (1, 35), 'args', None), [('arg', (1, 41), 'f', None)], [('Constant', (1, 43), 42)], ('arg', (1, 49), 'kwargs', None), [('Constant', (1, 11), 1), ('Constant', (1, 16), None), ('List', (1, 24), [], ('Load',)), ('Dict', (1, 30), [], [])]), [('Expr', (1, 58), ('Constant', (1, 58), 'doc for f()'))], [], None)]), -('Module', [('ClassDef', (1, 0), 'C', [], [], [('Pass', (1, 8))], [])]), -('Module', [('ClassDef', (1, 0), 'C', [], [], [('Expr', (1, 9), ('Constant', (1, 9), 'docstring for class C'))], [])]), -('Module', [('ClassDef', (1, 0), 'C', [('Name', (1, 8), 'object', ('Load',))], [], [('Pass', (1, 17))], [])]), -('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Return', (1, 8), ('Constant', (1, 15), 1))], [], None)]), -('Module', [('Delete', (1, 0), [('Name', (1, 4), 'v', ('Del',))])]), -('Module', [('Assign', (1, 0), [('Name', (1, 0), 'v', ('Store',))], ('Constant', (1, 4), 1))]), -('Module', [('Assign', (1, 0), [('Tuple', (1, 0), [('Name', (1, 0), 'a', ('Store',)), ('Name', (1, 2), 'b', ('Store',))], ('Store',))], ('Name', (1, 6), 'c', ('Load',)))]), -('Module', [('Assign', (1, 0), [('Tuple', (1, 0), [('Name', (1, 1), 'a', ('Store',)), ('Name', (1, 3), 'b', ('Store',))], ('Store',))], ('Name', (1, 8), 'c', ('Load',)))]), -('Module', 
[('Assign', (1, 0), [('List', (1, 0), [('Name', (1, 1), 'a', ('Store',)), ('Name', (1, 3), 'b', ('Store',))], ('Store',))], ('Name', (1, 8), 'c', ('Load',)))]), -('Module', [('AugAssign', (1, 0), ('Name', (1, 0), 'v', ('Store',)), ('Add',), ('Constant', (1, 5), 1))]), -('Module', [('For', (1, 0), ('Name', (1, 4), 'v', ('Store',)), ('Name', (1, 9), 'v', ('Load',)), [('Pass', (1, 11))], [])]), -('Module', [('While', (1, 0), ('Name', (1, 6), 'v', ('Load',)), [('Pass', (1, 8))], [])]), -('Module', [('If', (1, 0), ('Name', (1, 3), 'v', ('Load',)), [('Pass', (1, 5))], [])]), -('Module', [('With', (1, 0), [('withitem', ('Name', (1, 5), 'x', ('Load',)), ('Name', (1, 10), 'y', ('Store',)))], [('Pass', (1, 13))])]), -('Module', [('With', (1, 0), [('withitem', ('Name', (1, 5), 'x', ('Load',)), ('Name', (1, 10), 'y', ('Store',))), ('withitem', ('Name', (1, 13), 'z', ('Load',)), ('Name', (1, 18), 'q', ('Store',)))], [('Pass', (1, 21))])]), -('Module', [('Raise', (1, 0), ('Call', (1, 6), ('Name', (1, 6), 'Exception', ('Load',)), [('Constant', (1, 16), 'string')], []), None)]), -('Module', [('Try', (1, 0), [('Pass', (2, 2))], [('ExceptHandler', (3, 0), ('Name', (3, 7), 'Exception', ('Load',)), None, [('Pass', (4, 2))])], [], [])]), -('Module', [('Try', (1, 0), [('Pass', (2, 2))], [], [], [('Pass', (4, 2))])]), -('Module', [('Assert', (1, 0), ('Name', (1, 7), 'v', ('Load',)), None)]), -('Module', [('Import', (1, 0), [('alias', 'sys', None)])]), -('Module', [('ImportFrom', (1, 0), 'sys', [('alias', 'v', None)], 0)]), -('Module', [('Global', (1, 0), ['v'])]), -('Module', [('Expr', (1, 0), ('Constant', (1, 0), 1))]), -('Module', [('Pass', (1, 0))]), -('Module', [('For', (1, 0), ('Name', (1, 4), 'v', ('Store',)), ('Name', (1, 9), 'v', ('Load',)), [('Break', (1, 11))], [])]), -('Module', [('For', (1, 0), ('Name', (1, 4), 'v', ('Store',)), ('Name', (1, 9), 'v', ('Load',)), [('Continue', (1, 11))], [])]), -('Module', [('For', (1, 0), ('Tuple', (1, 4), [('Name', (1, 4), 'a', ('Store',)), ('Name', (1, 6), 'b', ('Store',))], ('Store',)), ('Name', (1, 11), 'c', ('Load',)), [('Pass', (1, 14))], [])]), -('Module', [('For', (1, 0), ('Tuple', (1, 4), [('Name', (1, 5), 'a', ('Store',)), ('Name', (1, 7), 'b', ('Store',))], ('Store',)), ('Name', (1, 13), 'c', ('Load',)), [('Pass', (1, 16))], [])]), -('Module', [('For', (1, 0), ('List', (1, 4), [('Name', (1, 5), 'a', ('Store',)), ('Name', (1, 7), 'b', ('Store',))], ('Store',)), ('Name', (1, 13), 'c', ('Load',)), [('Pass', (1, 16))], [])]), -('Module', [('Expr', (1, 0), ('GeneratorExp', (1, 0), ('Tuple', (2, 4), [('Name', (3, 4), 'Aa', ('Load',)), ('Name', (5, 7), 'Bb', ('Load',))], ('Load',)), [('comprehension', ('Tuple', (8, 4), [('Name', (8, 4), 'Aa', ('Store',)), ('Name', (10, 4), 'Bb', ('Store',))], ('Store',)), ('Name', (10, 10), 'Cc', ('Load',)), [], 0)]))]), -('Module', [('Expr', (1, 0), ('DictComp', (1, 0), ('Name', (1, 1), 'a', ('Load',)), ('Name', (1, 5), 'b', ('Load',)), [('comprehension', ('Name', (1, 11), 'w', ('Store',)), ('Name', (1, 16), 'x', ('Load',)), [], 0), ('comprehension', ('Name', (1, 22), 'm', ('Store',)), ('Name', (1, 27), 'p', ('Load',)), [('Name', (1, 32), 'g', ('Load',))], 0)]))]), -('Module', [('Expr', (1, 0), ('DictComp', (1, 0), ('Name', (1, 1), 'a', ('Load',)), ('Name', (1, 5), 'b', ('Load',)), [('comprehension', ('Tuple', (1, 11), [('Name', (1, 11), 'v', ('Store',)), ('Name', (1, 13), 'w', ('Store',))], ('Store',)), ('Name', (1, 18), 'x', ('Load',)), [], 0)]))]), -('Module', [('Expr', (1, 0), ('SetComp', (1, 0), ('Name', (1, 1), 'r', 
('Load',)), [('comprehension', ('Name', (1, 7), 'l', ('Store',)), ('Name', (1, 12), 'x', ('Load',)), [('Name', (1, 17), 'g', ('Load',))], 0)]))]), -('Module', [('Expr', (1, 0), ('SetComp', (1, 0), ('Name', (1, 1), 'r', ('Load',)), [('comprehension', ('Tuple', (1, 7), [('Name', (1, 7), 'l', ('Store',)), ('Name', (1, 9), 'm', ('Store',))], ('Store',)), ('Name', (1, 14), 'x', ('Load',)), [], 0)]))]), -('Module', [('AsyncFunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Expr', (2, 1), ('Constant', (2, 1), 'async function')), ('Expr', (3, 1), ('Await', (3, 1), ('Call', (3, 7), ('Name', (3, 7), 'something', ('Load',)), [], [])))], [], None)]), -('Module', [('AsyncFunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('AsyncFor', (2, 1), ('Name', (2, 11), 'e', ('Store',)), ('Name', (2, 16), 'i', ('Load',)), [('Expr', (2, 19), ('Constant', (2, 19), 1))], [('Expr', (3, 7), ('Constant', (3, 7), 2))])], [], None)]), -('Module', [('AsyncFunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('AsyncWith', (2, 1), [('withitem', ('Name', (2, 12), 'a', ('Load',)), ('Name', (2, 17), 'b', ('Store',)))], [('Expr', (2, 20), ('Constant', (2, 20), 1))])], [], None)]), -('Module', [('Expr', (1, 0), ('Dict', (1, 0), [None, ('Constant', (1, 10), 2)], [('Dict', (1, 3), [('Constant', (1, 4), 1)], [('Constant', (1, 6), 2)]), ('Constant', (1, 12), 3)]))]), -('Module', [('Expr', (1, 0), ('Set', (1, 0), [('Starred', (1, 1), ('Set', (1, 2), [('Constant', (1, 3), 1), ('Constant', (1, 6), 2)]), ('Load',)), ('Constant', (1, 10), 3)]))]), -('Module', [('AsyncFunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Expr', (2, 1), ('ListComp', (2, 1), ('Name', (2, 2), 'i', ('Load',)), [('comprehension', ('Name', (2, 14), 'b', ('Store',)), ('Name', (2, 19), 'c', ('Load',)), [], 1)]))], [], None)]), -('Module', [('FunctionDef', (3, 0), 'f', ('arguments', [], None, [], [], None, []), [('Pass', (3, 9))], [('Name', (1, 1), 'deco1', ('Load',)), ('Call', (2, 0), ('Name', (2, 1), 'deco2', ('Load',)), [], [])], None)]), -('Module', [('AsyncFunctionDef', (3, 0), 'f', ('arguments', [], None, [], [], None, []), [('Pass', (3, 15))], [('Name', (1, 1), 'deco1', ('Load',)), ('Call', (2, 0), ('Name', (2, 1), 'deco2', ('Load',)), [], [])], None)]), -('Module', [('ClassDef', (3, 0), 'C', [], [], [('Pass', (3, 9))], [('Name', (1, 1), 'deco1', ('Load',)), ('Call', (2, 0), ('Name', (2, 1), 'deco2', ('Load',)), [], [])])]), -('Module', [('FunctionDef', (2, 0), 'f', ('arguments', [], None, [], [], None, []), [('Pass', (2, 9))], [('Call', (1, 1), ('Name', (1, 1), 'deco', ('Load',)), [('GeneratorExp', (1, 5), ('Name', (1, 6), 'a', ('Load',)), [('comprehension', ('Name', (1, 12), 'a', ('Store',)), ('Name', (1, 17), 'b', ('Load',)), [], 0)])], [])], None)]), +('Module', [('Expr', (1, 0), ('Constant', (1, 0), None))], []), +('Module', [('Expr', (1, 0), ('Constant', (1, 0), 'module docstring'))], []), +('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Pass', (1, 9))], [], None, None)], []), +('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Expr', (1, 9), ('Constant', (1, 9), 'function docstring'))], [], None, None)], []), +('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [('arg', (1, 6), 'a', None, None)], None, [], [], None, []), [('Pass', (1, 10))], [], None, None)], []), +('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [('arg', (1, 6), 'a', None, None)], None, [], [], None, [('Constant', (1, 8), 
0)]), [('Pass', (1, 12))], [], None, None)], []), +('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], ('arg', (1, 7), 'args', None, None), [], [], None, []), [('Pass', (1, 14))], [], None, None)], []), +('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], ('arg', (1, 8), 'kwargs', None, None), []), [('Pass', (1, 17))], [], None, None)], []), +('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [('arg', (1, 6), 'a', None, None), ('arg', (1, 9), 'b', None, None), ('arg', (1, 14), 'c', None, None), ('arg', (1, 22), 'd', None, None), ('arg', (1, 28), 'e', None, None)], ('arg', (1, 35), 'args', None, None), [('arg', (1, 41), 'f', None, None)], [('Constant', (1, 43), 42)], ('arg', (1, 49), 'kwargs', None, None), [('Constant', (1, 11), 1), ('Constant', (1, 16), None), ('List', (1, 24), [], ('Load',)), ('Dict', (1, 30), [], [])]), [('Expr', (1, 58), ('Constant', (1, 58), 'doc for f()'))], [], None, None)], []), +('Module', [('ClassDef', (1, 0), 'C', [], [], [('Pass', (1, 8))], [])], []), +('Module', [('ClassDef', (1, 0), 'C', [], [], [('Expr', (1, 9), ('Constant', (1, 9), 'docstring for class C'))], [])], []), +('Module', [('ClassDef', (1, 0), 'C', [('Name', (1, 8), 'object', ('Load',))], [], [('Pass', (1, 17))], [])], []), +('Module', [('FunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Return', (1, 8), ('Constant', (1, 15), 1))], [], None, None)], []), +('Module', [('Delete', (1, 0), [('Name', (1, 4), 'v', ('Del',))])], []), +('Module', [('Assign', (1, 0), [('Name', (1, 0), 'v', ('Store',))], ('Constant', (1, 4), 1), None)], []), +('Module', [('Assign', (1, 0), [('Tuple', (1, 0), [('Name', (1, 0), 'a', ('Store',)), ('Name', (1, 2), 'b', ('Store',))], ('Store',))], ('Name', (1, 6), 'c', ('Load',)), None)], []), +('Module', [('Assign', (1, 0), [('Tuple', (1, 0), [('Name', (1, 1), 'a', ('Store',)), ('Name', (1, 3), 'b', ('Store',))], ('Store',))], ('Name', (1, 8), 'c', ('Load',)), None)], []), +('Module', [('Assign', (1, 0), [('List', (1, 0), [('Name', (1, 1), 'a', ('Store',)), ('Name', (1, 3), 'b', ('Store',))], ('Store',))], ('Name', (1, 8), 'c', ('Load',)), None)], []), +('Module', [('AugAssign', (1, 0), ('Name', (1, 0), 'v', ('Store',)), ('Add',), ('Constant', (1, 5), 1))], []), +('Module', [('For', (1, 0), ('Name', (1, 4), 'v', ('Store',)), ('Name', (1, 9), 'v', ('Load',)), [('Pass', (1, 11))], [], None)], []), +('Module', [('While', (1, 0), ('Name', (1, 6), 'v', ('Load',)), [('Pass', (1, 8))], [])], []), +('Module', [('If', (1, 0), ('Name', (1, 3), 'v', ('Load',)), [('Pass', (1, 5))], [])], []), +('Module', [('With', (1, 0), [('withitem', ('Name', (1, 5), 'x', ('Load',)), ('Name', (1, 10), 'y', ('Store',)))], [('Pass', (1, 13))], None)], []), +('Module', [('With', (1, 0), [('withitem', ('Name', (1, 5), 'x', ('Load',)), ('Name', (1, 10), 'y', ('Store',))), ('withitem', ('Name', (1, 13), 'z', ('Load',)), ('Name', (1, 18), 'q', ('Store',)))], [('Pass', (1, 21))], None)], []), +('Module', [('Raise', (1, 0), ('Call', (1, 6), ('Name', (1, 6), 'Exception', ('Load',)), [('Constant', (1, 16), 'string')], []), None)], []), +('Module', [('Try', (1, 0), [('Pass', (2, 2))], [('ExceptHandler', (3, 0), ('Name', (3, 7), 'Exception', ('Load',)), None, [('Pass', (4, 2))])], [], [])], []), +('Module', [('Try', (1, 0), [('Pass', (2, 2))], [], [], [('Pass', (4, 2))])], []), +('Module', [('Assert', (1, 0), ('Name', (1, 7), 'v', ('Load',)), None)], []), +('Module', [('Import', (1, 0), [('alias', 'sys', None)])], []), +('Module', [('ImportFrom', (1, 0), 'sys', 
[('alias', 'v', None)], 0)], []), +('Module', [('Global', (1, 0), ['v'])], []), +('Module', [('Expr', (1, 0), ('Constant', (1, 0), 1))], []), +('Module', [('Pass', (1, 0))], []), +('Module', [('For', (1, 0), ('Name', (1, 4), 'v', ('Store',)), ('Name', (1, 9), 'v', ('Load',)), [('Break', (1, 11))], [], None)], []), +('Module', [('For', (1, 0), ('Name', (1, 4), 'v', ('Store',)), ('Name', (1, 9), 'v', ('Load',)), [('Continue', (1, 11))], [], None)], []), +('Module', [('For', (1, 0), ('Tuple', (1, 4), [('Name', (1, 4), 'a', ('Store',)), ('Name', (1, 6), 'b', ('Store',))], ('Store',)), ('Name', (1, 11), 'c', ('Load',)), [('Pass', (1, 14))], [], None)], []), +('Module', [('For', (1, 0), ('Tuple', (1, 4), [('Name', (1, 5), 'a', ('Store',)), ('Name', (1, 7), 'b', ('Store',))], ('Store',)), ('Name', (1, 13), 'c', ('Load',)), [('Pass', (1, 16))], [], None)], []), +('Module', [('For', (1, 0), ('List', (1, 4), [('Name', (1, 5), 'a', ('Store',)), ('Name', (1, 7), 'b', ('Store',))], ('Store',)), ('Name', (1, 13), 'c', ('Load',)), [('Pass', (1, 16))], [], None)], []), +('Module', [('Expr', (1, 0), ('GeneratorExp', (1, 0), ('Tuple', (2, 4), [('Name', (3, 4), 'Aa', ('Load',)), ('Name', (5, 7), 'Bb', ('Load',))], ('Load',)), [('comprehension', ('Tuple', (8, 4), [('Name', (8, 4), 'Aa', ('Store',)), ('Name', (10, 4), 'Bb', ('Store',))], ('Store',)), ('Name', (10, 10), 'Cc', ('Load',)), [], 0)]))], []), +('Module', [('Expr', (1, 0), ('DictComp', (1, 0), ('Name', (1, 1), 'a', ('Load',)), ('Name', (1, 5), 'b', ('Load',)), [('comprehension', ('Name', (1, 11), 'w', ('Store',)), ('Name', (1, 16), 'x', ('Load',)), [], 0), ('comprehension', ('Name', (1, 22), 'm', ('Store',)), ('Name', (1, 27), 'p', ('Load',)), [('Name', (1, 32), 'g', ('Load',))], 0)]))], []), +('Module', [('Expr', (1, 0), ('DictComp', (1, 0), ('Name', (1, 1), 'a', ('Load',)), ('Name', (1, 5), 'b', ('Load',)), [('comprehension', ('Tuple', (1, 11), [('Name', (1, 11), 'v', ('Store',)), ('Name', (1, 13), 'w', ('Store',))], ('Store',)), ('Name', (1, 18), 'x', ('Load',)), [], 0)]))], []), +('Module', [('Expr', (1, 0), ('SetComp', (1, 0), ('Name', (1, 1), 'r', ('Load',)), [('comprehension', ('Name', (1, 7), 'l', ('Store',)), ('Name', (1, 12), 'x', ('Load',)), [('Name', (1, 17), 'g', ('Load',))], 0)]))], []), +('Module', [('Expr', (1, 0), ('SetComp', (1, 0), ('Name', (1, 1), 'r', ('Load',)), [('comprehension', ('Tuple', (1, 7), [('Name', (1, 7), 'l', ('Store',)), ('Name', (1, 9), 'm', ('Store',))], ('Store',)), ('Name', (1, 14), 'x', ('Load',)), [], 0)]))], []), +('Module', [('AsyncFunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Expr', (2, 1), ('Constant', (2, 1), 'async function')), ('Expr', (3, 1), ('Await', (3, 1), ('Call', (3, 7), ('Name', (3, 7), 'something', ('Load',)), [], [])))], [], None, None)], []), +('Module', [('AsyncFunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('AsyncFor', (2, 1), ('Name', (2, 11), 'e', ('Store',)), ('Name', (2, 16), 'i', ('Load',)), [('Expr', (2, 19), ('Constant', (2, 19), 1))], [('Expr', (3, 7), ('Constant', (3, 7), 2))], None)], [], None, None)], []), +('Module', [('AsyncFunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('AsyncWith', (2, 1), [('withitem', ('Name', (2, 12), 'a', ('Load',)), ('Name', (2, 17), 'b', ('Store',)))], [('Expr', (2, 20), ('Constant', (2, 20), 1))], None)], [], None, None)], []), +('Module', [('Expr', (1, 0), ('Dict', (1, 0), [None, ('Constant', (1, 10), 2)], [('Dict', (1, 3), [('Constant', (1, 4), 1)], [('Constant', (1, 6), 2)]), 
('Constant', (1, 12), 3)]))], []), +('Module', [('Expr', (1, 0), ('Set', (1, 0), [('Starred', (1, 1), ('Set', (1, 2), [('Constant', (1, 3), 1), ('Constant', (1, 6), 2)]), ('Load',)), ('Constant', (1, 10), 3)]))], []), +('Module', [('AsyncFunctionDef', (1, 0), 'f', ('arguments', [], None, [], [], None, []), [('Expr', (2, 1), ('ListComp', (2, 1), ('Name', (2, 2), 'i', ('Load',)), [('comprehension', ('Name', (2, 14), 'b', ('Store',)), ('Name', (2, 19), 'c', ('Load',)), [], 1)]))], [], None, None)], []), +('Module', [('FunctionDef', (3, 0), 'f', ('arguments', [], None, [], [], None, []), [('Pass', (3, 9))], [('Name', (1, 1), 'deco1', ('Load',)), ('Call', (2, 0), ('Name', (2, 1), 'deco2', ('Load',)), [], [])], None, None)], []), +('Module', [('AsyncFunctionDef', (3, 0), 'f', ('arguments', [], None, [], [], None, []), [('Pass', (3, 15))], [('Name', (1, 1), 'deco1', ('Load',)), ('Call', (2, 0), ('Name', (2, 1), 'deco2', ('Load',)), [], [])], None, None)], []), +('Module', [('ClassDef', (3, 0), 'C', [], [], [('Pass', (3, 9))], [('Name', (1, 1), 'deco1', ('Load',)), ('Call', (2, 0), ('Name', (2, 1), 'deco2', ('Load',)), [], [])])], []), +('Module', [('FunctionDef', (2, 0), 'f', ('arguments', [], None, [], [], None, []), [('Pass', (2, 9))], [('Call', (1, 1), ('Name', (1, 1), 'deco', ('Load',)), [('GeneratorExp', (1, 5), ('Name', (1, 6), 'a', ('Load',)), [('comprehension', ('Name', (1, 12), 'a', ('Store',)), ('Name', (1, 17), 'b', ('Load',)), [], 0)])], [])], None, None)], []), ] single_results = [ ('Interactive', [('Expr', (1, 0), ('BinOp', (1, 0), ('Constant', (1, 0), 1), ('Add',), ('Constant', (1, 2), 2)))]), diff --git a/Lib/test/test_type_comments.py b/Lib/test/test_type_comments.py new file mode 100644 index 000000000000..3065ddca2d9a --- /dev/null +++ b/Lib/test/test_type_comments.py @@ -0,0 +1,295 @@ +import ast +import unittest + + +funcdef = """\ +def foo(): + # type: () -> int + pass + +def bar(): # type: () -> None + pass +""" + +asyncdef = """\ +async def foo(): + # type: () -> int + return await bar() + +async def bar(): # type: () -> int + return await bar() +""" + +redundantdef = """\ +def foo(): # type: () -> int + # type: () -> str + return '' +""" + +nonasciidef = """\ +def foo(): + # type: () -> ?????t + pass +""" + +forstmt = """\ +for a in []: # type: int + pass +""" + +withstmt = """\ +with context() as a: # type: int + pass +""" + +vardecl = """\ +a = 0 # type: int +""" + +ignores = """\ +def foo(): + pass # type: ignore + +def bar(): + x = 1 # type: ignore +""" + +# Test for long-form type-comments in arguments. A test function +# named 'fabvk' would have two positional args, a and b, plus a +# var-arg *v, plus a kw-arg **k. It is verified in test_longargs() +# that it has exactly these arguments, no more, no fewer. 
+longargs = """\ +def fa( + a = 1, # type: A +): + pass + +def fa( + a = 1 # type: A +): + pass + +def fab( + a, # type: A + b, # type: B +): + pass + +def fab( + a, # type: A + b # type: B +): + pass + +def fv( + *v, # type: V +): + pass + +def fv( + *v # type: V +): + pass + +def fk( + **k, # type: K +): + pass + +def fk( + **k # type: K +): + pass + +def fvk( + *v, # type: V + **k, # type: K +): + pass + +def fvk( + *v, # type: V + **k # type: K +): + pass + +def fav( + a, # type: A + *v, # type: V +): + pass + +def fav( + a, # type: A + *v # type: V +): + pass + +def fak( + a, # type: A + **k, # type: K +): + pass + +def fak( + a, # type: A + **k # type: K +): + pass + +def favk( + a, # type: A + *v, # type: V + **k, # type: K +): + pass + +def favk( + a, # type: A + *v, # type: V + **k # type: K +): + pass +""" + + +class TypeCommentTests(unittest.TestCase): + + def parse(self, source): + return ast.parse(source, type_comments=True) + + def classic_parse(self, source): + return ast.parse(source) + + def test_funcdef(self): + tree = self.parse(funcdef) + self.assertEqual(tree.body[0].type_comment, "() -> int") + self.assertEqual(tree.body[1].type_comment, "() -> None") + tree = self.classic_parse(funcdef) + self.assertEqual(tree.body[0].type_comment, None) + self.assertEqual(tree.body[1].type_comment, None) + + def test_asyncdef(self): + tree = self.parse(asyncdef) + self.assertEqual(tree.body[0].type_comment, "() -> int") + self.assertEqual(tree.body[1].type_comment, "() -> int") + tree = self.classic_parse(asyncdef) + self.assertEqual(tree.body[0].type_comment, None) + self.assertEqual(tree.body[1].type_comment, None) + + def test_redundantdef(self): + with self.assertRaisesRegex(SyntaxError, "^Cannot have two type comments on def"): + tree = self.parse(redundantdef) + + def test_nonasciidef(self): + tree = self.parse(nonasciidef) + self.assertEqual(tree.body[0].type_comment, "() -> ?????t") + + def test_forstmt(self): + tree = self.parse(forstmt) + self.assertEqual(tree.body[0].type_comment, "int") + tree = self.classic_parse(forstmt) + self.assertEqual(tree.body[0].type_comment, None) + + def test_withstmt(self): + tree = self.parse(withstmt) + self.assertEqual(tree.body[0].type_comment, "int") + tree = self.classic_parse(withstmt) + self.assertEqual(tree.body[0].type_comment, None) + + def test_vardecl(self): + tree = self.parse(vardecl) + self.assertEqual(tree.body[0].type_comment, "int") + tree = self.classic_parse(vardecl) + self.assertEqual(tree.body[0].type_comment, None) + + def test_ignores(self): + tree = self.parse(ignores) + self.assertEqual([ti.lineno for ti in tree.type_ignores], [2, 5]) + tree = self.classic_parse(ignores) + self.assertEqual(tree.type_ignores, []) + + def test_longargs(self): + tree = self.parse(longargs) + for t in tree.body: + # The expected args are encoded in the function name + todo = set(t.name[1:]) + self.assertEqual(len(t.args.args), + len(todo) - bool(t.args.vararg) - bool(t.args.kwarg)) + self.assertTrue(t.name.startswith('f'), t.name) + for c in t.name[1:]: + todo.remove(c) + if c == 'v': + arg = t.args.vararg + elif c == 'k': + arg = t.args.kwarg + else: + assert 0 <= ord(c) - ord('a') < len(t.args.args) + arg = t.args.args[ord(c) - ord('a')] + self.assertEqual(arg.arg, c) # That's the argument name + self.assertEqual(arg.type_comment, arg.arg.upper()) + assert not todo + tree = self.classic_parse(longargs) + for t in tree.body: + for arg in t.args.args + [t.args.vararg, t.args.kwarg]: + if arg is not None: + 
self.assertIsNone(arg.type_comment, "%s(%s:%r)" % + (t.name, arg.arg, arg.type_comment)) + + def test_inappropriate_type_comments(self): + """Tests for inappropriately-placed type comments. + + These should be silently ignored with type comments off, + but raise SyntaxError with type comments on. + + This is not meant to be exhaustive. + """ + + def check_both_ways(source): + ast.parse(source, type_comments=False) + with self.assertRaises(SyntaxError): + ast.parse(source, type_comments=True) + + check_both_ways("pass # type: int\n") + check_both_ways("foo() # type: int\n") + check_both_ways("x += 1 # type: int\n") + check_both_ways("while True: # type: int\n continue\n") + check_both_ways("while True:\n continue # type: int\n") + check_both_ways("try: # type: int\n pass\nfinally:\n pass\n") + check_both_ways("try:\n pass\nfinally: # type: int\n pass\n") + + def test_func_type_input(self): + + def parse_func_type_input(source): + return ast.parse(source, "", "func_type") + + # Some checks below will crash if the returned structure is wrong + tree = parse_func_type_input("() -> int") + self.assertEqual(tree.argtypes, []) + self.assertEqual(tree.returns.id, "int") + + tree = parse_func_type_input("(int) -> List[str]") + self.assertEqual(len(tree.argtypes), 1) + arg = tree.argtypes[0] + self.assertEqual(arg.id, "int") + self.assertEqual(tree.returns.value.id, "List") + self.assertEqual(tree.returns.slice.value.id, "str") + + tree = parse_func_type_input("(int, *str, **Any) -> float") + self.assertEqual(tree.argtypes[0].id, "int") + self.assertEqual(tree.argtypes[1].id, "str") + self.assertEqual(tree.argtypes[2].id, "Any") + self.assertEqual(tree.returns.id, "float") + + with self.assertRaises(SyntaxError): + tree = parse_func_type_input("(int, *str, *Any) -> float") + + with self.assertRaises(SyntaxError): + tree = parse_func_type_input("(int, **str, Any) -> float") + + with self.assertRaises(SyntaxError): + tree = parse_func_type_input("(**int, **str) -> float") + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/token.py b/Lib/token.py index 7224eca32fe0..9bf80a5950a2 100644 --- a/Lib/token.py +++ b/Lib/token.py @@ -58,12 +58,14 @@ ELLIPSIS = 52 COLONEQUAL = 53 OP = 54 +TYPE_IGNORE = 55 +TYPE_COMMENT = 56 # These aren't used by the C tokenizer but are needed for tokenize.py -ERRORTOKEN = 55 -COMMENT = 56 -NL = 57 -ENCODING = 58 -N_TOKENS = 59 +ERRORTOKEN = 57 +COMMENT = 58 +NL = 59 +ENCODING = 60 +N_TOKENS = 61 # Special definitions for cooperation with parser NT_OFFSET = 256 diff --git a/Misc/NEWS.d/next/Core and Builtins/2019-01-22-19-17-27.bpo-35766.gh1tHZ.rst b/Misc/NEWS.d/next/Core and Builtins/2019-01-22-19-17-27.bpo-35766.gh1tHZ.rst new file mode 100644 index 000000000000..29c5f34d6a3e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2019-01-22-19-17-27.bpo-35766.gh1tHZ.rst @@ -0,0 +1 @@ +Add the option to parse PEP 484 type comments in the ast module. (Off by default.) This is merging the key functionality of the third party fork thereof, [typed_ast](https://github.com/python/typed_ast). 
\ No newline at end of file diff --git a/Modules/parsermodule.c b/Modules/parsermodule.c index eabc5c810f6c..87f58d340c2d 100644 --- a/Modules/parsermodule.c +++ b/Modules/parsermodule.c @@ -663,6 +663,12 @@ validate_node(node *tree) for (pos = 0; pos < nch; ++pos) { node *ch = CHILD(tree, pos); int ch_type = TYPE(ch); + if (ch_type == suite && TYPE(tree) == funcdef) { + /* This is the opposite hack of what we do in parser.c + (search for func_body_suite), except we don't ever + support type comments here. */ + ch_type = func_body_suite; + } for (arc = 0; arc < dfa_state->s_narcs; ++arc) { short a_label = dfa_state->s_arc[arc].a_lbl; assert(a_label < _PyParser_Grammar.g_ll.ll_nlabels); diff --git a/Parser/Python.asdl b/Parser/Python.asdl index 7b2a8737ab80..85b686d78a3b 100644 --- a/Parser/Python.asdl +++ b/Parser/Python.asdl @@ -3,17 +3,20 @@ module Python { - mod = Module(stmt* body) + mod = Module(stmt* body, type_ignore *type_ignores) | Interactive(stmt* body) | Expression(expr body) + | FunctionType(expr* argtypes, expr returns) -- not really an actual node but useful in Jython's typesystem. | Suite(stmt* body) stmt = FunctionDef(identifier name, arguments args, - stmt* body, expr* decorator_list, expr? returns) + stmt* body, expr* decorator_list, expr? returns, + string? type_comment) | AsyncFunctionDef(identifier name, arguments args, - stmt* body, expr* decorator_list, expr? returns) + stmt* body, expr* decorator_list, expr? returns, + string? type_comment) | ClassDef(identifier name, expr* bases, @@ -23,18 +26,18 @@ module Python | Return(expr? value) | Delete(expr* targets) - | Assign(expr* targets, expr value) + | Assign(expr* targets, expr value, string? type_comment) | AugAssign(expr target, operator op, expr value) -- 'simple' indicates that we annotate simple name without parens | AnnAssign(expr target, expr annotation, expr? value, int simple) -- use 'orelse' because else is a keyword in target languages - | For(expr target, expr iter, stmt* body, stmt* orelse) - | AsyncFor(expr target, expr iter, stmt* body, stmt* orelse) + | For(expr target, expr iter, stmt* body, stmt* orelse, string? type_comment) + | AsyncFor(expr target, expr iter, stmt* body, stmt* orelse, string? type_comment) | While(expr test, stmt* body, stmt* orelse) | If(expr test, stmt* body, stmt* orelse) - | With(withitem* items, stmt* body) - | AsyncWith(withitem* items, stmt* body) + | With(withitem* items, stmt* body, string? type_comment) + | AsyncWith(withitem* items, stmt* body, string? type_comment) | Raise(expr? exc, expr? cause) | Try(stmt* body, excepthandler* handlers, stmt* orelse, stmt* finalbody) @@ -111,7 +114,7 @@ module Python arguments = (arg* args, arg? vararg, arg* kwonlyargs, expr* kw_defaults, arg? kwarg, expr* defaults) - arg = (identifier arg, expr? annotation) + arg = (identifier arg, expr? annotation, string? type_comment) attributes (int lineno, int col_offset, int? end_lineno, int? end_col_offset) -- keyword arguments supplied to call (NULL identifier for **kwargs) @@ -121,5 +124,7 @@ module Python alias = (identifier name, identifier? asname) withitem = (expr context_expr, expr? 
optional_vars) + + type_ignore = TypeIgnore(int lineno) } diff --git a/Parser/asdl_c.py b/Parser/asdl_c.py index 8640b29b8f10..a51a5db73904 100644 --- a/Parser/asdl_c.py +++ b/Parser/asdl_c.py @@ -890,6 +890,15 @@ def visitModule(self, mod): return obj2ast_object(obj, out, arena); } +static int obj2ast_string(PyObject* obj, PyObject** out, PyArena* arena) +{ + if (!PyUnicode_CheckExact(obj) && !PyBytes_CheckExact(obj)) { + PyErr_SetString(PyExc_TypeError, "AST string must be of type str"); + return 1; + } + return obj2ast_object(obj, out, arena); +} + static int obj2ast_int(PyObject* obj, int* out, PyArena* arena) { int i; @@ -993,6 +1002,8 @@ def visitModule(self, mod): self.emit('if (PyDict_SetItemString(d, "AST", (PyObject*)&AST_type) < 0) return NULL;', 1) self.emit('if (PyModule_AddIntMacro(m, PyCF_ONLY_AST) < 0)', 1) self.emit("return NULL;", 2) + self.emit('if (PyModule_AddIntMacro(m, PyCF_TYPE_COMMENTS) < 0)', 1) + self.emit("return NULL;", 2) for dfn in mod.dfns: self.visit(dfn) self.emit("return m;", 1) @@ -1176,18 +1187,19 @@ class PartingShots(StaticVisitor): } /* mode is 0 for "exec", 1 for "eval" and 2 for "single" input */ +/* and 3 for "func_type" */ mod_ty PyAST_obj2mod(PyObject* ast, PyArena* arena, int mode) { mod_ty res; PyObject *req_type[3]; - char *req_name[] = {"Module", "Expression", "Interactive"}; + char *req_name[] = {"Module", "Expression", "Interactive", "FunctionType"}; int isinstance; req_type[0] = (PyObject*)Module_type; req_type[1] = (PyObject*)Expression_type; req_type[2] = (PyObject*)Interactive_type; - assert(0 <= mode && mode <= 2); + assert(0 <= mode && mode <= 3); if (!init_types()) return NULL; diff --git a/Parser/parser.c b/Parser/parser.c index a9916d392aab..fa4a8f011ff5 100644 --- a/Parser/parser.c +++ b/Parser/parser.c @@ -12,6 +12,7 @@ #include "node.h" #include "parser.h" #include "errcode.h" +#include "graminit.h" #ifdef Py_DEBUG @@ -260,7 +261,15 @@ PyParser_AddToken(parser_state *ps, int type, char *str, /* Push non-terminal */ int nt = (x >> 8) + NT_OFFSET; int arrow = x & ((1<<7)-1); - dfa *d1 = PyGrammar_FindDFA( + dfa *d1; + if (nt == func_body_suite && !(ps->p_flags & PyCF_TYPE_COMMENTS)) { + /* When parsing type comments is not requested, + we can provide better errors about bad indentation + by using 'suite' for the body of a funcdef */ + D(printf(" [switch func_body_suite to suite]")); + nt = suite; + } + d1 = PyGrammar_FindDFA( ps->p_grammar, nt); if ((err = push(&ps->p_stack, nt, d1, arrow, lineno, col_offset, @@ -268,7 +277,7 @@ PyParser_AddToken(parser_state *ps, int type, char *str, D(printf(" MemError: push\n")); return err; } - D(printf(" Push ...\n")); + D(printf(" Push '%s'\n", d1->d_name)); continue; } diff --git a/Parser/parsetok.c b/Parser/parsetok.c index 2b5254a8be67..7fddc5a0e897 100644 --- a/Parser/parsetok.c +++ b/Parser/parsetok.c @@ -15,6 +15,42 @@ static node *parsetok(struct tok_state *, grammar *, int, perrdetail *, int *); static int initerr(perrdetail *err_ret, PyObject * filename); +typedef struct { + int *items; + size_t size; + size_t num_items; +} growable_int_array; + +static int +growable_int_array_init(growable_int_array *arr, size_t initial_size) { + assert(initial_size > 0); + arr->items = malloc(initial_size * sizeof(*arr->items)); + arr->size = initial_size; + arr->num_items = 0; + + return arr->items != NULL; +} + +static int +growable_int_array_add(growable_int_array *arr, int item) { + if (arr->num_items >= arr->size) { + arr->size *= 2; + arr->items = realloc(arr->items, arr->size * 
sizeof(*arr->items)); + if (!arr->items) { + return 0; + } + } + + arr->items[arr->num_items] = item; + arr->num_items++; + return 1; +} + +static void +growable_int_array_deallocate(growable_int_array *arr) { + free(arr->items); +} + /* Parse input coming from a string. Return error code, print some errors. */ node * PyParser_ParseString(const char *s, grammar *g, int start, perrdetail *err_ret) @@ -59,6 +95,9 @@ PyParser_ParseStringObject(const char *s, PyObject *filename, err_ret->error = PyErr_Occurred() ? E_DECODE : E_NOMEM; return NULL; } + if (*flags & PyPARSE_TYPE_COMMENTS) { + tok->type_comments = 1; + } #ifndef PGEN Py_INCREF(err_ret->filename); @@ -127,6 +166,9 @@ PyParser_ParseFileObject(FILE *fp, PyObject *filename, err_ret->error = E_NOMEM; return NULL; } + if (*flags & PyPARSE_TYPE_COMMENTS) { + tok->type_comments = 1; + } #ifndef PGEN Py_INCREF(err_ret->filename); tok->filename = err_ret->filename; @@ -188,6 +230,13 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, node *n; int started = 0; int col_offset, end_col_offset; + growable_int_array type_ignores; + + if (!growable_int_array_init(&type_ignores, 10)) { + err_ret->error = E_NOMEM; + PyTokenizer_Free(tok); + return NULL; + } if ((ps = PyParser_New(g, start)) == NULL) { err_ret->error = E_NOMEM; @@ -197,6 +246,8 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, #ifdef PY_PARSER_REQUIRES_FUTURE_KEYWORD if (*flags & PyPARSE_BARRY_AS_BDFL) ps->p_flags |= CO_FUTURE_BARRY_AS_BDFL; + if (*flags & PyPARSE_TYPE_COMMENTS) + ps->p_flags |= PyCF_TYPE_COMMENTS; #endif for (;;) { @@ -277,6 +328,15 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, else { end_col_offset = -1; } + + if (type == TYPE_IGNORE) { + if (!growable_int_array_add(&type_ignores, tok->lineno)) { + err_ret->error = E_NOMEM; + break; + } + continue; + } + if ((err_ret->error = PyParser_AddToken(ps, (int)type, str, lineno, col_offset, tok->lineno, end_col_offset, @@ -293,6 +353,24 @@ parsetok(struct tok_state *tok, grammar *g, int start, perrdetail *err_ret, n = ps->p_tree; ps->p_tree = NULL; + if (n->n_type == file_input) { + /* Put type_ignore nodes in the ENDMARKER of file_input. */ + int num; + node *ch; + size_t i; + + num = NCH(n); + ch = CHILD(n, num - 1); + REQ(ch, ENDMARKER); + + for (i = 0; i < type_ignores.num_items; i++) { + PyNode_AddChild(ch, TYPE_IGNORE, NULL, + type_ignores.items[i], 0, + type_ignores.items[i], 0); + } + } + growable_int_array_deallocate(&type_ignores); + #ifndef PGEN /* Check that the source for a single input statement really is a single statement by looking at what is left in the diff --git a/Parser/token.c b/Parser/token.c index d27f98a34d55..228ecffc8415 100644 --- a/Parser/token.c +++ b/Parser/token.c @@ -61,6 +61,8 @@ const char * const _PyParser_TokenNames[] = { "ELLIPSIS", "COLONEQUAL", "OP", + "TYPE_IGNORE", + "TYPE_COMMENT", "", "", "", diff --git a/Parser/tokenizer.c b/Parser/tokenizer.c index 3e3cf2cd7f58..1ded9ade3771 100644 --- a/Parser/tokenizer.c +++ b/Parser/tokenizer.c @@ -48,6 +48,10 @@ static int tok_nextc(struct tok_state *tok); static void tok_backup(struct tok_state *tok, int c); +/* Spaces in this constant are treated as "zero or more spaces or tabs" when + tokenizing. 
*/ +static const char* type_comment_prefix = "# type: "; + /* Create and initialize a new tok_state structure */ static struct tok_state * @@ -82,6 +86,7 @@ tok_new(void) tok->decoding_readline = NULL; tok->decoding_buffer = NULL; #endif + tok->type_comments = 0; return tok; } @@ -1245,11 +1250,61 @@ tok_get(struct tok_state *tok, char **p_start, char **p_end) /* Set start of current token */ tok->start = tok->cur - 1; - /* Skip comment */ + /* Skip comment, unless it's a type comment */ if (c == '#') { + const char *prefix, *p, *type_start; + while (c != EOF && c != '\n') { c = tok_nextc(tok); } + + if (tok->type_comments) { + p = tok->start; + prefix = type_comment_prefix; + while (*prefix && p < tok->cur) { + if (*prefix == ' ') { + while (*p == ' ' || *p == '\t') { + p++; + } + } else if (*prefix == *p) { + p++; + } else { + break; + } + + prefix++; + } + + /* This is a type comment if we matched all of type_comment_prefix. */ + if (!*prefix) { + int is_type_ignore = 1; + tok_backup(tok, c); /* don't eat the newline or EOF */ + + type_start = p; + + is_type_ignore = tok->cur >= p + 6 && memcmp(p, "ignore", 6) == 0; + p += 6; + while (is_type_ignore && p < tok->cur) { + if (*p == '#') + break; + is_type_ignore = is_type_ignore && (*p == ' ' || *p == '\t'); + p++; + } + + if (is_type_ignore) { + /* If this type ignore is the only thing on the line, consume the newline also. */ + if (blankline) { + tok_nextc(tok); + tok->atbol = 1; + } + return TYPE_IGNORE; + } else { + *p_start = (char *) type_start; /* after type_comment_prefix */ + *p_end = tok->cur; + return TYPE_COMMENT; + } + } + } } /* Check for EOF and errors now */ diff --git a/Parser/tokenizer.h b/Parser/tokenizer.h index 096ce687ec54..9639c658b1c2 100644 --- a/Parser/tokenizer.h +++ b/Parser/tokenizer.h @@ -70,6 +70,8 @@ struct tok_state { const char* enc; /* Encoding for the current str. */ const char* str; const char* input; /* Tokenizer's newline translated copy of the string. 
*/ + + int type_comments; /* Whether to look for type comments */ }; extern struct tok_state *PyTokenizer_FromString(const char *, int); diff --git a/Python/Python-ast.c b/Python/Python-ast.c index a333ff95b110..1a56e90bca09 100644 --- a/Python/Python-ast.c +++ b/Python/Python-ast.c @@ -10,8 +10,10 @@ static PyTypeObject *mod_type; static PyObject* ast2obj_mod(void*); static PyTypeObject *Module_type; _Py_IDENTIFIER(body); +_Py_IDENTIFIER(type_ignores); static char *Module_fields[]={ "body", + "type_ignores", }; static PyTypeObject *Interactive_type; static char *Interactive_fields[]={ @@ -21,6 +23,13 @@ static PyTypeObject *Expression_type; static char *Expression_fields[]={ "body", }; +static PyTypeObject *FunctionType_type; +_Py_IDENTIFIER(argtypes); +_Py_IDENTIFIER(returns); +static char *FunctionType_fields[]={ + "argtypes", + "returns", +}; static PyTypeObject *Suite_type; static char *Suite_fields[]={ "body", @@ -41,13 +50,14 @@ static PyTypeObject *FunctionDef_type; _Py_IDENTIFIER(name); _Py_IDENTIFIER(args); _Py_IDENTIFIER(decorator_list); -_Py_IDENTIFIER(returns); +_Py_IDENTIFIER(type_comment); static char *FunctionDef_fields[]={ "name", "args", "body", "decorator_list", "returns", + "type_comment", }; static PyTypeObject *AsyncFunctionDef_type; static char *AsyncFunctionDef_fields[]={ @@ -56,6 +66,7 @@ static char *AsyncFunctionDef_fields[]={ "body", "decorator_list", "returns", + "type_comment", }; static PyTypeObject *ClassDef_type; _Py_IDENTIFIER(bases); @@ -81,6 +92,7 @@ static PyTypeObject *Assign_type; static char *Assign_fields[]={ "targets", "value", + "type_comment", }; static PyTypeObject *AugAssign_type; _Py_IDENTIFIER(target); @@ -107,6 +119,7 @@ static char *For_fields[]={ "iter", "body", "orelse", + "type_comment", }; static PyTypeObject *AsyncFor_type; static char *AsyncFor_fields[]={ @@ -114,6 +127,7 @@ static char *AsyncFor_fields[]={ "iter", "body", "orelse", + "type_comment", }; static PyTypeObject *While_type; _Py_IDENTIFIER(test); @@ -133,11 +147,13 @@ _Py_IDENTIFIER(items); static char *With_fields[]={ "items", "body", + "type_comment", }; static PyTypeObject *AsyncWith_type; static char *AsyncWith_fields[]={ "items", "body", + "type_comment", }; static PyTypeObject *Raise_type; _Py_IDENTIFIER(exc); @@ -478,6 +494,7 @@ _Py_IDENTIFIER(arg); static char *arg_fields[]={ "arg", "annotation", + "type_comment", }; static PyTypeObject *keyword_type; static PyObject* ast2obj_keyword(void*); @@ -500,6 +517,12 @@ static char *withitem_fields[]={ "context_expr", "optional_vars", }; +static PyTypeObject *type_ignore_type; +static PyObject* ast2obj_type_ignore(void*); +static PyTypeObject *TypeIgnore_type; +static char *TypeIgnore_fields[]={ + "lineno", +}; _Py_IDENTIFIER(_fields); @@ -769,6 +792,15 @@ static int obj2ast_identifier(PyObject* obj, PyObject** out, PyArena* arena) return obj2ast_object(obj, out, arena); } +static int obj2ast_string(PyObject* obj, PyObject** out, PyArena* arena) +{ + if (!PyUnicode_CheckExact(obj) && !PyBytes_CheckExact(obj)) { + PyErr_SetString(PyExc_TypeError, "AST string must be of type str"); + return 1; + } + return obj2ast_object(obj, out, arena); +} + static int obj2ast_int(PyObject* obj, int* out, PyArena* arena) { int i; @@ -810,23 +842,26 @@ static int init_types(void) mod_type = make_type("mod", &AST_type, NULL, 0); if (!mod_type) return 0; if (!add_attributes(mod_type, NULL, 0)) return 0; - Module_type = make_type("Module", mod_type, Module_fields, 1); + Module_type = make_type("Module", mod_type, Module_fields, 2); if 
(!Module_type) return 0; Interactive_type = make_type("Interactive", mod_type, Interactive_fields, 1); if (!Interactive_type) return 0; Expression_type = make_type("Expression", mod_type, Expression_fields, 1); if (!Expression_type) return 0; + FunctionType_type = make_type("FunctionType", mod_type, + FunctionType_fields, 2); + if (!FunctionType_type) return 0; Suite_type = make_type("Suite", mod_type, Suite_fields, 1); if (!Suite_type) return 0; stmt_type = make_type("stmt", &AST_type, NULL, 0); if (!stmt_type) return 0; if (!add_attributes(stmt_type, stmt_attributes, 4)) return 0; FunctionDef_type = make_type("FunctionDef", stmt_type, FunctionDef_fields, - 5); + 6); if (!FunctionDef_type) return 0; AsyncFunctionDef_type = make_type("AsyncFunctionDef", stmt_type, - AsyncFunctionDef_fields, 5); + AsyncFunctionDef_fields, 6); if (!AsyncFunctionDef_type) return 0; ClassDef_type = make_type("ClassDef", stmt_type, ClassDef_fields, 5); if (!ClassDef_type) return 0; @@ -834,23 +869,23 @@ static int init_types(void) if (!Return_type) return 0; Delete_type = make_type("Delete", stmt_type, Delete_fields, 1); if (!Delete_type) return 0; - Assign_type = make_type("Assign", stmt_type, Assign_fields, 2); + Assign_type = make_type("Assign", stmt_type, Assign_fields, 3); if (!Assign_type) return 0; AugAssign_type = make_type("AugAssign", stmt_type, AugAssign_fields, 3); if (!AugAssign_type) return 0; AnnAssign_type = make_type("AnnAssign", stmt_type, AnnAssign_fields, 4); if (!AnnAssign_type) return 0; - For_type = make_type("For", stmt_type, For_fields, 4); + For_type = make_type("For", stmt_type, For_fields, 5); if (!For_type) return 0; - AsyncFor_type = make_type("AsyncFor", stmt_type, AsyncFor_fields, 4); + AsyncFor_type = make_type("AsyncFor", stmt_type, AsyncFor_fields, 5); if (!AsyncFor_type) return 0; While_type = make_type("While", stmt_type, While_fields, 3); if (!While_type) return 0; If_type = make_type("If", stmt_type, If_fields, 3); if (!If_type) return 0; - With_type = make_type("With", stmt_type, With_fields, 2); + With_type = make_type("With", stmt_type, With_fields, 3); if (!With_type) return 0; - AsyncWith_type = make_type("AsyncWith", stmt_type, AsyncWith_fields, 2); + AsyncWith_type = make_type("AsyncWith", stmt_type, AsyncWith_fields, 3); if (!AsyncWith_type) return 0; Raise_type = make_type("Raise", stmt_type, Raise_fields, 2); if (!Raise_type) return 0; @@ -1113,7 +1148,7 @@ static int init_types(void) arguments_type = make_type("arguments", &AST_type, arguments_fields, 6); if (!arguments_type) return 0; if (!add_attributes(arguments_type, NULL, 0)) return 0; - arg_type = make_type("arg", &AST_type, arg_fields, 2); + arg_type = make_type("arg", &AST_type, arg_fields, 3); if (!arg_type) return 0; if (!add_attributes(arg_type, arg_attributes, 4)) return 0; keyword_type = make_type("keyword", &AST_type, keyword_fields, 2); @@ -1125,6 +1160,12 @@ static int init_types(void) withitem_type = make_type("withitem", &AST_type, withitem_fields, 2); if (!withitem_type) return 0; if (!add_attributes(withitem_type, NULL, 0)) return 0; + type_ignore_type = make_type("type_ignore", &AST_type, NULL, 0); + if (!type_ignore_type) return 0; + if (!add_attributes(type_ignore_type, NULL, 0)) return 0; + TypeIgnore_type = make_type("TypeIgnore", type_ignore_type, + TypeIgnore_fields, 1); + if (!TypeIgnore_type) return 0; initialized = 1; return 1; } @@ -1148,9 +1189,11 @@ static int obj2ast_arg(PyObject* obj, arg_ty* out, PyArena* arena); static int obj2ast_keyword(PyObject* obj, keyword_ty* out, 
PyArena* arena); static int obj2ast_alias(PyObject* obj, alias_ty* out, PyArena* arena); static int obj2ast_withitem(PyObject* obj, withitem_ty* out, PyArena* arena); +static int obj2ast_type_ignore(PyObject* obj, type_ignore_ty* out, PyArena* + arena); mod_ty -Module(asdl_seq * body, PyArena *arena) +Module(asdl_seq * body, asdl_seq * type_ignores, PyArena *arena) { mod_ty p; p = (mod_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1158,6 +1201,7 @@ Module(asdl_seq * body, PyArena *arena) return NULL; p->kind = Module_kind; p->v.Module.body = body; + p->v.Module.type_ignores = type_ignores; return p; } @@ -1190,6 +1234,24 @@ Expression(expr_ty body, PyArena *arena) return p; } +mod_ty +FunctionType(asdl_seq * argtypes, expr_ty returns, PyArena *arena) +{ + mod_ty p; + if (!returns) { + PyErr_SetString(PyExc_ValueError, + "field returns is required for FunctionType"); + return NULL; + } + p = (mod_ty)PyArena_Malloc(arena, sizeof(*p)); + if (!p) + return NULL; + p->kind = FunctionType_kind; + p->v.FunctionType.argtypes = argtypes; + p->v.FunctionType.returns = returns; + return p; +} + mod_ty Suite(asdl_seq * body, PyArena *arena) { @@ -1204,8 +1266,8 @@ Suite(asdl_seq * body, PyArena *arena) stmt_ty FunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq * - decorator_list, expr_ty returns, int lineno, int col_offset, int - end_lineno, int end_col_offset, PyArena *arena) + decorator_list, expr_ty returns, string type_comment, int lineno, + int col_offset, int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!name) { @@ -1227,6 +1289,7 @@ FunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq * p->v.FunctionDef.body = body; p->v.FunctionDef.decorator_list = decorator_list; p->v.FunctionDef.returns = returns; + p->v.FunctionDef.type_comment = type_comment; p->lineno = lineno; p->col_offset = col_offset; p->end_lineno = end_lineno; @@ -1236,8 +1299,9 @@ FunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq * stmt_ty AsyncFunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq - * decorator_list, expr_ty returns, int lineno, int col_offset, - int end_lineno, int end_col_offset, PyArena *arena) + * decorator_list, expr_ty returns, string type_comment, int + lineno, int col_offset, int end_lineno, int end_col_offset, + PyArena *arena) { stmt_ty p; if (!name) { @@ -1259,6 +1323,7 @@ AsyncFunctionDef(identifier name, arguments_ty args, asdl_seq * body, asdl_seq p->v.AsyncFunctionDef.body = body; p->v.AsyncFunctionDef.decorator_list = decorator_list; p->v.AsyncFunctionDef.returns = returns; + p->v.AsyncFunctionDef.type_comment = type_comment; p->lineno = lineno; p->col_offset = col_offset; p->end_lineno = end_lineno; @@ -1328,8 +1393,8 @@ Delete(asdl_seq * targets, int lineno, int col_offset, int end_lineno, int } stmt_ty -Assign(asdl_seq * targets, expr_ty value, int lineno, int col_offset, int - end_lineno, int end_col_offset, PyArena *arena) +Assign(asdl_seq * targets, expr_ty value, string type_comment, int lineno, int + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; if (!value) { @@ -1343,6 +1408,7 @@ Assign(asdl_seq * targets, expr_ty value, int lineno, int col_offset, int p->kind = Assign_kind; p->v.Assign.targets = targets; p->v.Assign.value = value; + p->v.Assign.type_comment = type_comment; p->lineno = lineno; p->col_offset = col_offset; p->end_lineno = end_lineno; @@ -1416,8 +1482,9 @@ AnnAssign(expr_ty target, expr_ty annotation, expr_ty value, int simple, int } stmt_ty 
-For(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, int - lineno, int col_offset, int end_lineno, int end_col_offset, PyArena *arena) +For(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, string + type_comment, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { stmt_ty p; if (!target) { @@ -1438,6 +1505,7 @@ For(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, int p->v.For.iter = iter; p->v.For.body = body; p->v.For.orelse = orelse; + p->v.For.type_comment = type_comment; p->lineno = lineno; p->col_offset = col_offset; p->end_lineno = end_lineno; @@ -1446,9 +1514,9 @@ For(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, int } stmt_ty -AsyncFor(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, int - lineno, int col_offset, int end_lineno, int end_col_offset, PyArena - *arena) +AsyncFor(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, + string type_comment, int lineno, int col_offset, int end_lineno, int + end_col_offset, PyArena *arena) { stmt_ty p; if (!target) { @@ -1469,6 +1537,7 @@ AsyncFor(expr_ty target, expr_ty iter, asdl_seq * body, asdl_seq * orelse, int p->v.AsyncFor.iter = iter; p->v.AsyncFor.body = body; p->v.AsyncFor.orelse = orelse; + p->v.AsyncFor.type_comment = type_comment; p->lineno = lineno; p->col_offset = col_offset; p->end_lineno = end_lineno; @@ -1525,8 +1594,8 @@ If(expr_ty test, asdl_seq * body, asdl_seq * orelse, int lineno, int } stmt_ty -With(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, int - end_lineno, int end_col_offset, PyArena *arena) +With(asdl_seq * items, asdl_seq * body, string type_comment, int lineno, int + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1535,6 +1604,7 @@ With(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, int p->kind = With_kind; p->v.With.items = items; p->v.With.body = body; + p->v.With.type_comment = type_comment; p->lineno = lineno; p->col_offset = col_offset; p->end_lineno = end_lineno; @@ -1543,8 +1613,8 @@ With(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, int } stmt_ty -AsyncWith(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, int - end_lineno, int end_col_offset, PyArena *arena) +AsyncWith(asdl_seq * items, asdl_seq * body, string type_comment, int lineno, + int col_offset, int end_lineno, int end_col_offset, PyArena *arena) { stmt_ty p; p = (stmt_ty)PyArena_Malloc(arena, sizeof(*p)); @@ -1553,6 +1623,7 @@ AsyncWith(asdl_seq * items, asdl_seq * body, int lineno, int col_offset, int p->kind = AsyncWith_kind; p->v.AsyncWith.items = items; p->v.AsyncWith.body = body; + p->v.AsyncWith.type_comment = type_comment; p->lineno = lineno; p->col_offset = col_offset; p->end_lineno = end_lineno; @@ -2518,8 +2589,8 @@ arguments(asdl_seq * args, arg_ty vararg, asdl_seq * kwonlyargs, asdl_seq * } arg_ty -arg(identifier arg, expr_ty annotation, int lineno, int col_offset, int - end_lineno, int end_col_offset, PyArena *arena) +arg(identifier arg, expr_ty annotation, string type_comment, int lineno, int + col_offset, int end_lineno, int end_col_offset, PyArena *arena) { arg_ty p; if (!arg) { @@ -2532,6 +2603,7 @@ arg(identifier arg, expr_ty annotation, int lineno, int col_offset, int return NULL; p->arg = arg; p->annotation = annotation; + p->type_comment = type_comment; p->lineno = lineno; p->col_offset = col_offset; p->end_lineno = end_lineno; @@ -2590,6 +2662,18 
@@ withitem(expr_ty context_expr, expr_ty optional_vars, PyArena *arena) return p; } +type_ignore_ty +TypeIgnore(int lineno, PyArena *arena) +{ + type_ignore_ty p; + p = (type_ignore_ty)PyArena_Malloc(arena, sizeof(*p)); + if (!p) + return NULL; + p->kind = TypeIgnore_kind; + p->v.TypeIgnore.lineno = lineno; + return p; +} + PyObject* ast2obj_mod(void* _o) @@ -2609,6 +2693,11 @@ ast2obj_mod(void* _o) if (_PyObject_SetAttrId(result, &PyId_body, value) == -1) goto failed; Py_DECREF(value); + value = ast2obj_list(o->v.Module.type_ignores, ast2obj_type_ignore); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_type_ignores, value) == -1) + goto failed; + Py_DECREF(value); break; case Interactive_kind: result = PyType_GenericNew(Interactive_type, NULL, NULL); @@ -2628,6 +2717,20 @@ ast2obj_mod(void* _o) goto failed; Py_DECREF(value); break; + case FunctionType_kind: + result = PyType_GenericNew(FunctionType_type, NULL, NULL); + if (!result) goto failed; + value = ast2obj_list(o->v.FunctionType.argtypes, ast2obj_expr); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_argtypes, value) == -1) + goto failed; + Py_DECREF(value); + value = ast2obj_expr(o->v.FunctionType.returns); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_returns, value) == -1) + goto failed; + Py_DECREF(value); + break; case Suite_kind: result = PyType_GenericNew(Suite_type, NULL, NULL); if (!result) goto failed; @@ -2683,6 +2786,11 @@ ast2obj_stmt(void* _o) if (_PyObject_SetAttrId(result, &PyId_returns, value) == -1) goto failed; Py_DECREF(value); + value = ast2obj_string(o->v.FunctionDef.type_comment); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_type_comment, value) == -1) + goto failed; + Py_DECREF(value); break; case AsyncFunctionDef_kind: result = PyType_GenericNew(AsyncFunctionDef_type, NULL, NULL); @@ -2713,6 +2821,11 @@ ast2obj_stmt(void* _o) if (_PyObject_SetAttrId(result, &PyId_returns, value) == -1) goto failed; Py_DECREF(value); + value = ast2obj_string(o->v.AsyncFunctionDef.type_comment); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_type_comment, value) == -1) + goto failed; + Py_DECREF(value); break; case ClassDef_kind: result = PyType_GenericNew(ClassDef_type, NULL, NULL); @@ -2774,6 +2887,11 @@ ast2obj_stmt(void* _o) if (_PyObject_SetAttrId(result, &PyId_value, value) == -1) goto failed; Py_DECREF(value); + value = ast2obj_string(o->v.Assign.type_comment); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_type_comment, value) == -1) + goto failed; + Py_DECREF(value); break; case AugAssign_kind: result = PyType_GenericNew(AugAssign_type, NULL, NULL); @@ -2841,6 +2959,11 @@ ast2obj_stmt(void* _o) if (_PyObject_SetAttrId(result, &PyId_orelse, value) == -1) goto failed; Py_DECREF(value); + value = ast2obj_string(o->v.For.type_comment); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_type_comment, value) == -1) + goto failed; + Py_DECREF(value); break; case AsyncFor_kind: result = PyType_GenericNew(AsyncFor_type, NULL, NULL); @@ -2865,6 +2988,11 @@ ast2obj_stmt(void* _o) if (_PyObject_SetAttrId(result, &PyId_orelse, value) == -1) goto failed; Py_DECREF(value); + value = ast2obj_string(o->v.AsyncFor.type_comment); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_type_comment, value) == -1) + goto failed; + Py_DECREF(value); break; case While_kind: result = PyType_GenericNew(While_type, NULL, NULL); @@ -2917,6 +3045,11 @@ ast2obj_stmt(void* _o) if 
(_PyObject_SetAttrId(result, &PyId_body, value) == -1) goto failed; Py_DECREF(value); + value = ast2obj_string(o->v.With.type_comment); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_type_comment, value) == -1) + goto failed; + Py_DECREF(value); break; case AsyncWith_kind: result = PyType_GenericNew(AsyncWith_type, NULL, NULL); @@ -2931,6 +3064,11 @@ ast2obj_stmt(void* _o) if (_PyObject_SetAttrId(result, &PyId_body, value) == -1) goto failed; Py_DECREF(value); + value = ast2obj_string(o->v.AsyncWith.type_comment); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_type_comment, value) == -1) + goto failed; + Py_DECREF(value); break; case Raise_kind: result = PyType_GenericNew(Raise_type, NULL, NULL); @@ -3870,6 +4008,11 @@ ast2obj_arg(void* _o) if (_PyObject_SetAttrId(result, &PyId_annotation, value) == -1) goto failed; Py_DECREF(value); + value = ast2obj_string(o->type_comment); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_type_comment, value) == -1) + goto failed; + Py_DECREF(value); value = ast2obj_int(o->lineno); if (!value) goto failed; if (_PyObject_SetAttrId(result, &PyId_lineno, value) < 0) @@ -3981,6 +4124,33 @@ ast2obj_withitem(void* _o) return NULL; } +PyObject* +ast2obj_type_ignore(void* _o) +{ + type_ignore_ty o = (type_ignore_ty)_o; + PyObject *result = NULL, *value = NULL; + if (!o) { + Py_RETURN_NONE; + } + + switch (o->kind) { + case TypeIgnore_kind: + result = PyType_GenericNew(TypeIgnore_type, NULL, NULL); + if (!result) goto failed; + value = ast2obj_int(o->v.TypeIgnore.lineno); + if (!value) goto failed; + if (_PyObject_SetAttrId(result, &PyId_lineno, value) == -1) + goto failed; + Py_DECREF(value); + break; + } + return result; +failed: + Py_XDECREF(value); + Py_XDECREF(result); + return NULL; +} + int obj2ast_mod(PyObject* obj, mod_ty* out, PyArena* arena) @@ -3999,6 +4169,7 @@ obj2ast_mod(PyObject* obj, mod_ty* out, PyArena* arena) } if (isinstance) { asdl_seq* body; + asdl_seq* type_ignores; if (_PyObject_LookupAttrId(obj, &PyId_body, &tmp) < 0) { return 1; @@ -4030,7 +4201,37 @@ obj2ast_mod(PyObject* obj, mod_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = Module(body, arena); + if (_PyObject_LookupAttrId(obj, &PyId_type_ignores, &tmp) < 0) { + return 1; + } + if (tmp == NULL) { + PyErr_SetString(PyExc_TypeError, "required field \"type_ignores\" missing from Module"); + return 1; + } + else { + int res; + Py_ssize_t len; + Py_ssize_t i; + if (!PyList_Check(tmp)) { + PyErr_Format(PyExc_TypeError, "Module field \"type_ignores\" must be a list, not a %.200s", tmp->ob_type->tp_name); + goto failed; + } + len = PyList_GET_SIZE(tmp); + type_ignores = _Py_asdl_seq_new(len, arena); + if (type_ignores == NULL) goto failed; + for (i = 0; i < len; i++) { + type_ignore_ty val; + res = obj2ast_type_ignore(PyList_GET_ITEM(tmp, i), &val, arena); + if (res != 0) goto failed; + if (len != PyList_GET_SIZE(tmp)) { + PyErr_SetString(PyExc_RuntimeError, "Module field \"type_ignores\" changed size during iteration"); + goto failed; + } + asdl_seq_SET(type_ignores, i, val); + } + Py_CLEAR(tmp); + } + *out = Module(body, type_ignores, arena); if (*out == NULL) goto failed; return 0; } @@ -4099,6 +4300,61 @@ obj2ast_mod(PyObject* obj, mod_ty* out, PyArena* arena) if (*out == NULL) goto failed; return 0; } + isinstance = PyObject_IsInstance(obj, (PyObject*)FunctionType_type); + if (isinstance == -1) { + return 1; + } + if (isinstance) { + asdl_seq* argtypes; + expr_ty returns; + + if (_PyObject_LookupAttrId(obj, &PyId_argtypes, 
&tmp) < 0) { + return 1; + } + if (tmp == NULL) { + PyErr_SetString(PyExc_TypeError, "required field \"argtypes\" missing from FunctionType"); + return 1; + } + else { + int res; + Py_ssize_t len; + Py_ssize_t i; + if (!PyList_Check(tmp)) { + PyErr_Format(PyExc_TypeError, "FunctionType field \"argtypes\" must be a list, not a %.200s", tmp->ob_type->tp_name); + goto failed; + } + len = PyList_GET_SIZE(tmp); + argtypes = _Py_asdl_seq_new(len, arena); + if (argtypes == NULL) goto failed; + for (i = 0; i < len; i++) { + expr_ty val; + res = obj2ast_expr(PyList_GET_ITEM(tmp, i), &val, arena); + if (res != 0) goto failed; + if (len != PyList_GET_SIZE(tmp)) { + PyErr_SetString(PyExc_RuntimeError, "FunctionType field \"argtypes\" changed size during iteration"); + goto failed; + } + asdl_seq_SET(argtypes, i, val); + } + Py_CLEAR(tmp); + } + if (_PyObject_LookupAttrId(obj, &PyId_returns, &tmp) < 0) { + return 1; + } + if (tmp == NULL) { + PyErr_SetString(PyExc_TypeError, "required field \"returns\" missing from FunctionType"); + return 1; + } + else { + int res; + res = obj2ast_expr(tmp, &returns, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = FunctionType(argtypes, returns, arena); + if (*out == NULL) goto failed; + return 0; + } isinstance = PyObject_IsInstance(obj, (PyObject*)Suite_type); if (isinstance == -1) { return 1; @@ -4224,6 +4480,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) asdl_seq* body; asdl_seq* decorator_list; expr_ty returns; + string type_comment; if (_PyObject_LookupAttrId(obj, &PyId_name, &tmp) < 0) { return 1; @@ -4324,8 +4581,22 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = FunctionDef(name, args, body, decorator_list, returns, lineno, - col_offset, end_lineno, end_col_offset, arena); + if (_PyObject_LookupAttrId(obj, &PyId_type_comment, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + type_comment = NULL; + } + else { + int res; + res = obj2ast_string(tmp, &type_comment, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = FunctionDef(name, args, body, decorator_list, returns, + type_comment, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4339,6 +4610,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) asdl_seq* body; asdl_seq* decorator_list; expr_ty returns; + string type_comment; if (_PyObject_LookupAttrId(obj, &PyId_name, &tmp) < 0) { return 1; @@ -4439,9 +4711,22 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } + if (_PyObject_LookupAttrId(obj, &PyId_type_comment, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + type_comment = NULL; + } + else { + int res; + res = obj2ast_string(tmp, &type_comment, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } *out = AsyncFunctionDef(name, args, body, decorator_list, returns, - lineno, col_offset, end_lineno, end_col_offset, - arena); + type_comment, lineno, col_offset, end_lineno, + end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4668,6 +4953,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (isinstance) { asdl_seq* targets; expr_ty value; + string type_comment; if (_PyObject_LookupAttrId(obj, &PyId_targets, &tmp) < 0) { return 1; @@ -4712,8 +4998,21 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = 
Assign(targets, value, lineno, col_offset, end_lineno, - end_col_offset, arena); + if (_PyObject_LookupAttrId(obj, &PyId_type_comment, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + type_comment = NULL; + } + else { + int res; + res = obj2ast_string(tmp, &type_comment, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = Assign(targets, value, type_comment, lineno, col_offset, + end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4846,6 +5145,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) expr_ty iter; asdl_seq* body; asdl_seq* orelse; + string type_comment; if (_PyObject_LookupAttrId(obj, &PyId_target, &tmp) < 0) { return 1; @@ -4933,8 +5233,21 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = For(target, iter, body, orelse, lineno, col_offset, end_lineno, - end_col_offset, arena); + if (_PyObject_LookupAttrId(obj, &PyId_type_comment, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + type_comment = NULL; + } + else { + int res; + res = obj2ast_string(tmp, &type_comment, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = For(target, iter, body, orelse, type_comment, lineno, + col_offset, end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -4947,6 +5260,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) expr_ty iter; asdl_seq* body; asdl_seq* orelse; + string type_comment; if (_PyObject_LookupAttrId(obj, &PyId_target, &tmp) < 0) { return 1; @@ -5034,8 +5348,21 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = AsyncFor(target, iter, body, orelse, lineno, col_offset, - end_lineno, end_col_offset, arena); + if (_PyObject_LookupAttrId(obj, &PyId_type_comment, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + type_comment = NULL; + } + else { + int res; + res = obj2ast_string(tmp, &type_comment, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = AsyncFor(target, iter, body, orelse, type_comment, lineno, + col_offset, end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -5220,6 +5547,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (isinstance) { asdl_seq* items; asdl_seq* body; + string type_comment; if (_PyObject_LookupAttrId(obj, &PyId_items, &tmp) < 0) { return 1; @@ -5281,7 +5609,20 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = With(items, body, lineno, col_offset, end_lineno, + if (_PyObject_LookupAttrId(obj, &PyId_type_comment, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + type_comment = NULL; + } + else { + int res; + res = obj2ast_string(tmp, &type_comment, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = With(items, body, type_comment, lineno, col_offset, end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; @@ -5293,6 +5634,7 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) if (isinstance) { asdl_seq* items; asdl_seq* body; + string type_comment; if (_PyObject_LookupAttrId(obj, &PyId_items, &tmp) < 0) { return 1; @@ -5354,8 +5696,21 @@ obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) } Py_CLEAR(tmp); } - *out = AsyncWith(items, body, lineno, col_offset, end_lineno, - end_col_offset, arena); + if (_PyObject_LookupAttrId(obj, &PyId_type_comment, &tmp) < 0) { + return 1; + } + if 
(tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + type_comment = NULL; + } + else { + int res; + res = obj2ast_string(tmp, &type_comment, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = AsyncWith(items, body, type_comment, lineno, col_offset, + end_lineno, end_col_offset, arena); if (*out == NULL) goto failed; return 0; } @@ -8073,6 +8428,7 @@ obj2ast_arg(PyObject* obj, arg_ty* out, PyArena* arena) PyObject* tmp = NULL; identifier arg; expr_ty annotation; + string type_comment; int lineno; int col_offset; int end_lineno; @@ -8104,6 +8460,19 @@ obj2ast_arg(PyObject* obj, arg_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } + if (_PyObject_LookupAttrId(obj, &PyId_type_comment, &tmp) < 0) { + return 1; + } + if (tmp == NULL || tmp == Py_None) { + Py_CLEAR(tmp); + type_comment = NULL; + } + else { + int res; + res = obj2ast_string(tmp, &type_comment, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } if (_PyObject_LookupAttrId(obj, &PyId_lineno, &tmp) < 0) { return 1; } @@ -8156,8 +8525,8 @@ obj2ast_arg(PyObject* obj, arg_ty* out, PyArena* arena) if (res != 0) goto failed; Py_CLEAR(tmp); } - *out = arg(arg, annotation, lineno, col_offset, end_lineno, end_col_offset, - arena); + *out = arg(arg, annotation, type_comment, lineno, col_offset, end_lineno, + end_col_offset, arena); return 0; failed: Py_XDECREF(tmp); @@ -8284,6 +8653,48 @@ obj2ast_withitem(PyObject* obj, withitem_ty* out, PyArena* arena) return 1; } +int +obj2ast_type_ignore(PyObject* obj, type_ignore_ty* out, PyArena* arena) +{ + int isinstance; + + PyObject *tmp = NULL; + + if (obj == Py_None) { + *out = NULL; + return 0; + } + isinstance = PyObject_IsInstance(obj, (PyObject*)TypeIgnore_type); + if (isinstance == -1) { + return 1; + } + if (isinstance) { + int lineno; + + if (_PyObject_LookupAttrId(obj, &PyId_lineno, &tmp) < 0) { + return 1; + } + if (tmp == NULL) { + PyErr_SetString(PyExc_TypeError, "required field \"lineno\" missing from TypeIgnore"); + return 1; + } + else { + int res; + res = obj2ast_int(tmp, &lineno, arena); + if (res != 0) goto failed; + Py_CLEAR(tmp); + } + *out = TypeIgnore(lineno, arena); + if (*out == NULL) goto failed; + return 0; + } + + PyErr_Format(PyExc_TypeError, "expected some sort of type_ignore, but got %R", obj); + failed: + Py_XDECREF(tmp); + return 1; +} + static struct PyModuleDef _astmodule = { PyModuleDef_HEAD_INIT, "_ast" @@ -8299,6 +8710,8 @@ PyInit__ast(void) if (PyDict_SetItemString(d, "AST", (PyObject*)&AST_type) < 0) return NULL; if (PyModule_AddIntMacro(m, PyCF_ONLY_AST) < 0) return NULL; + if (PyModule_AddIntMacro(m, PyCF_TYPE_COMMENTS) < 0) + return NULL; if (PyDict_SetItemString(d, "mod", (PyObject*)mod_type) < 0) return NULL; if (PyDict_SetItemString(d, "Module", (PyObject*)Module_type) < 0) return NULL; @@ -8306,6 +8719,8 @@ PyInit__ast(void) 0) return NULL; if (PyDict_SetItemString(d, "Expression", (PyObject*)Expression_type) < 0) return NULL; + if (PyDict_SetItemString(d, "FunctionType", (PyObject*)FunctionType_type) < + 0) return NULL; if (PyDict_SetItemString(d, "Suite", (PyObject*)Suite_type) < 0) return NULL; if (PyDict_SetItemString(d, "stmt", (PyObject*)stmt_type) < 0) return NULL; @@ -8486,6 +8901,10 @@ PyInit__ast(void) NULL; if (PyDict_SetItemString(d, "withitem", (PyObject*)withitem_type) < 0) return NULL; + if (PyDict_SetItemString(d, "type_ignore", (PyObject*)type_ignore_type) < + 0) return NULL; + if (PyDict_SetItemString(d, "TypeIgnore", (PyObject*)TypeIgnore_type) < 0) + return NULL; return m; } @@ -8498,18 
+8917,19 @@ PyObject* PyAST_mod2obj(mod_ty t) } /* mode is 0 for "exec", 1 for "eval" and 2 for "single" input */ +/* and 3 for "func_type" */ mod_ty PyAST_obj2mod(PyObject* ast, PyArena* arena, int mode) { mod_ty res; PyObject *req_type[3]; - char *req_name[] = {"Module", "Expression", "Interactive"}; + char *req_name[] = {"Module", "Expression", "Interactive", "FunctionType"}; int isinstance; req_type[0] = (PyObject*)Module_type; req_type[1] = (PyObject*)Expression_type; req_type[2] = (PyObject*)Interactive_type; - assert(0 <= mode && mode <= 2); + assert(0 <= mode && mode <= 3); if (!init_types()) return NULL; diff --git a/Python/ast.c b/Python/ast.c index e10e63f539c3..c422e9651ced 100644 --- a/Python/ast.c +++ b/Python/ast.c @@ -698,6 +698,13 @@ ast_error(struct compiling *c, const node *n, const char *errmsg, ...) small_stmt elements is returned. */ +static string +new_type_comment(const char *s) +{ + return PyUnicode_DecodeUTF8(s, strlen(s), NULL); +} +#define NEW_TYPE_COMMENT(n) new_type_comment(STR(n)) + static int num_stmts(const node *n) { @@ -725,11 +732,17 @@ num_stmts(const node *n) case simple_stmt: return NCH(n) / 2; /* Divide by 2 to remove count of semi-colons */ case suite: + case func_body_suite: + /* func_body_suite: simple_stmt | NEWLINE [TYPE_COMMENT NEWLINE] INDENT stmt+ DEDENT */ + /* suite: simple_stmt | NEWLINE INDENT stmt+ DEDENT */ if (NCH(n) == 1) return num_stmts(CHILD(n, 0)); else { + i = 2; l = 0; - for (i = 2; i < (NCH(n) - 1); i++) + if (TYPE(CHILD(n, 1)) == TYPE_COMMENT) + i += 2; + for (; i < (NCH(n) - 1); i++) l += num_stmts(CHILD(n, i)); return l; } @@ -753,10 +766,13 @@ PyAST_FromNodeObject(const node *n, PyCompilerFlags *flags, { int i, j, k, num; asdl_seq *stmts = NULL; + asdl_seq *type_ignores = NULL; stmt_ty s; node *ch; struct compiling c; mod_ty res = NULL; + asdl_seq *argtypes = NULL; + expr_ty ret, arg; c.c_arena = arena; /* borrowed reference */ @@ -795,7 +811,23 @@ PyAST_FromNodeObject(const node *n, PyCompilerFlags *flags, } } } - res = Module(stmts, arena); + + /* Type ignores are stored under the ENDMARKER in file_input. 
*/ + ch = CHILD(n, NCH(n) - 1); + REQ(ch, ENDMARKER); + num = NCH(ch); + type_ignores = _Py_asdl_seq_new(num, arena); + if (!type_ignores) + goto out; + + for (i = 0; i < num; i++) { + type_ignore_ty ti = TypeIgnore(LINENO(CHILD(ch, i)), arena); + if (!ti) + goto out; + asdl_seq_SET(type_ignores, i, ti); + } + + res = Module(stmts, type_ignores, arena); break; case eval_input: { expr_ty testlist_ast; @@ -847,6 +879,46 @@ PyAST_FromNodeObject(const node *n, PyCompilerFlags *flags, res = Interactive(stmts, arena); } break; + case func_type_input: + n = CHILD(n, 0); + REQ(n, func_type); + + if (TYPE(CHILD(n, 1)) == typelist) { + ch = CHILD(n, 1); + /* this is overly permissive -- we don't pay any attention to + * stars on the args -- just parse them into an ordered list */ + num = 0; + for (i = 0; i < NCH(ch); i++) { + if (TYPE(CHILD(ch, i)) == test) { + num++; + } + } + + argtypes = _Py_asdl_seq_new(num, arena); + if (!argtypes) + goto out; + + j = 0; + for (i = 0; i < NCH(ch); i++) { + if (TYPE(CHILD(ch, i)) == test) { + arg = ast_for_expr(&c, CHILD(ch, i)); + if (!arg) + goto out; + asdl_seq_SET(argtypes, j++, arg); + } + } + } + else { + argtypes = _Py_asdl_seq_new(0, arena); + if (!argtypes) + goto out; + } + + ret = ast_for_expr(&c, CHILD(n, NCH(n) - 1)); + if (!ret) + goto out; + res = FunctionType(argtypes, ret, arena); + break; default: PyErr_Format(PyExc_SystemError, "invalid node %d for PyAST_FromNode", TYPE(n)); @@ -1269,7 +1341,7 @@ ast_for_arg(struct compiling *c, const node *n) return NULL; } - ret = arg(name, annotation, LINENO(n), n->n_col_offset, + ret = arg(name, annotation, NULL, LINENO(n), n->n_col_offset, n->n_end_lineno, n->n_end_col_offset, c->c_arena); if (!ret) return NULL; @@ -1328,13 +1400,22 @@ handle_keywordonly_args(struct compiling *c, const node *n, int start, goto error; if (forbidden_name(c, argname, ch, 0)) goto error; - arg = arg(argname, annotation, LINENO(ch), ch->n_col_offset, + arg = arg(argname, annotation, NULL, LINENO(ch), ch->n_col_offset, ch->n_end_lineno, ch->n_end_col_offset, c->c_arena); if (!arg) goto error; asdl_seq_SET(kwonlyargs, j++, arg); - i += 2; /* the name and the comma */ + i += 1; /* the name */ + if (TYPE(CHILD(n, i)) == COMMA) + i += 1; /* the comma, if present */ + break; + case TYPE_COMMENT: + /* arg will be equal to the last argument processed */ + arg->type_comment = NEW_TYPE_COMMENT(ch); + if (!arg->type_comment) + goto error; + i += 1; break; case DOUBLESTAR: return i; @@ -1464,19 +1545,29 @@ ast_for_arguments(struct compiling *c, const node *n) if (!arg) return NULL; asdl_seq_SET(posargs, k++, arg); - i += 2; /* the name and the comma */ + i += 1; /* the name */ + if (i < NCH(n) && TYPE(CHILD(n, i)) == COMMA) + i += 1; /* the comma, if present */ break; case STAR: if (i+1 >= NCH(n) || - (i+2 == NCH(n) && TYPE(CHILD(n, i+1)) == COMMA)) { + (i+2 == NCH(n) && (TYPE(CHILD(n, i+1)) == COMMA + || TYPE(CHILD(n, i+1)) == TYPE_COMMENT))) { ast_error(c, CHILD(n, i), - "named arguments must follow bare *"); + "named arguments must follow bare *"); return NULL; } ch = CHILD(n, i+1); /* tfpdef or COMMA */ if (TYPE(ch) == COMMA) { int res = 0; i += 2; /* now follows keyword only arguments */ + + if (i < NCH(n) && TYPE(CHILD(n, i)) == TYPE_COMMENT) { + ast_error(c, CHILD(n, i), + "bare * has associated type comment"); + return NULL; + } + res = handle_keywordonly_args(c, n, i, kwonlyargs, kwdefaults); if (res == -1) return NULL; @@ -1487,7 +1578,17 @@ ast_for_arguments(struct compiling *c, const node *n) if (!vararg) return NULL; - i += 
3; + i += 2; /* the star and the name */ + if (i < NCH(n) && TYPE(CHILD(n, i)) == COMMA) + i += 1; /* the comma, if present */ + + if (i < NCH(n) && TYPE(CHILD(n, i)) == TYPE_COMMENT) { + vararg->type_comment = NEW_TYPE_COMMENT(CHILD(n, i)); + if (!vararg->type_comment) + return NULL; + i += 1; + } + if (i < NCH(n) && (TYPE(CHILD(n, i)) == tfpdef || TYPE(CHILD(n, i)) == vfpdef)) { int res = 0; @@ -1504,7 +1605,21 @@ ast_for_arguments(struct compiling *c, const node *n) kwarg = ast_for_arg(c, ch); if (!kwarg) return NULL; - i += 3; + i += 2; /* the double star and the name */ + if (TYPE(CHILD(n, i)) == COMMA) + i += 1; /* the comma, if present */ + break; + case TYPE_COMMENT: + assert(i); + + if (kwarg) + arg = kwarg; + + /* arg will be equal to the last argument processed */ + arg->type_comment = NEW_TYPE_COMMENT(ch); + if (!arg->type_comment) + return NULL; + i += 1; break; default: PyErr_Format(PyExc_SystemError, @@ -1613,7 +1728,7 @@ static stmt_ty ast_for_funcdef_impl(struct compiling *c, const node *n0, asdl_seq *decorator_seq, bool is_async) { - /* funcdef: 'def' NAME parameters ['->' test] ':' suite */ + /* funcdef: 'def' NAME parameters ['->' test] ':' [TYPE_COMMENT] suite */ const node * const n = is_async ? CHILD(n0, 1) : n0; identifier name; arguments_ty args; @@ -1621,6 +1736,8 @@ ast_for_funcdef_impl(struct compiling *c, const node *n0, expr_ty returns = NULL; int name_i = 1; int end_lineno, end_col_offset; + node *tc; + string type_comment = NULL; REQ(n, funcdef); @@ -1638,16 +1755,37 @@ ast_for_funcdef_impl(struct compiling *c, const node *n0, return NULL; name_i += 2; } + if (TYPE(CHILD(n, name_i + 3)) == TYPE_COMMENT) { + type_comment = NEW_TYPE_COMMENT(CHILD(n, name_i + 3)); + if (!type_comment) + return NULL; + name_i += 1; + } body = ast_for_suite(c, CHILD(n, name_i + 3)); if (!body) return NULL; get_last_end_pos(body, &end_lineno, &end_col_offset); + if (NCH(CHILD(n, name_i + 3)) > 1) { + /* Check if the suite has a type comment in it. */ + tc = CHILD(CHILD(n, name_i + 3), 1); + + if (TYPE(tc) == TYPE_COMMENT) { + if (type_comment != NULL) { + ast_error(c, n, "Cannot have two type comments on def"); + return NULL; + } + type_comment = NEW_TYPE_COMMENT(tc); + if (!type_comment) + return NULL; + } + } + if (is_async) - return AsyncFunctionDef(name, args, body, decorator_seq, returns, + return AsyncFunctionDef(name, args, body, decorator_seq, returns, type_comment, LINENO(n0), n0->n_col_offset, end_lineno, end_col_offset, c->c_arena); else - return FunctionDef(name, args, body, decorator_seq, returns, + return FunctionDef(name, args, body, decorator_seq, returns, type_comment, LINENO(n), n->n_col_offset, end_lineno, end_col_offset, c->c_arena); } @@ -2295,7 +2433,7 @@ ast_for_atom(struct compiling *c, const node *n) /* It's a dictionary comprehension. 
*/ if (is_dict) { ast_error(c, n, "dict unpacking cannot be used in " - "dict comprehension"); + "dict comprehension"); return NULL; } res = ast_for_dictcomp(c, ch); @@ -2870,13 +3008,13 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func, if (nkeywords) { if (ndoublestars) { ast_error(c, chch, - "positional argument follows " - "keyword argument unpacking"); + "positional argument follows " + "keyword argument unpacking"); } else { ast_error(c, chch, - "positional argument follows " - "keyword argument"); + "positional argument follows " + "keyword argument"); } return NULL; } @@ -2890,8 +3028,8 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func, expr_ty starred; if (ndoublestars) { ast_error(c, chch, - "iterable argument unpacking follows " - "keyword argument unpacking"); + "iterable argument unpacking follows " + "keyword argument unpacking"); return NULL; } e = ast_for_expr(c, CHILD(ch, 1)); @@ -2929,13 +3067,13 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func, if (nkeywords) { if (ndoublestars) { ast_error(c, chch, - "positional argument follows " - "keyword argument unpacking"); + "positional argument follows " + "keyword argument unpacking"); } else { ast_error(c, chch, - "positional argument follows " - "keyword argument"); + "positional argument follows " + "keyword argument"); } return NULL; } @@ -2996,7 +3134,7 @@ ast_for_call(struct compiling *c, const node *n, expr_ty func, tmp = ((keyword_ty)asdl_seq_GET(keywords, k))->arg; if (tmp && !PyUnicode_Compare(tmp, key)) { ast_error(c, chch, - "keyword argument repeated"); + "keyword argument repeated"); return NULL; } } @@ -3045,15 +3183,16 @@ ast_for_expr_stmt(struct compiling *c, const node *n) { REQ(n, expr_stmt); /* expr_stmt: testlist_star_expr (annassign | augassign (yield_expr|testlist) | - ('=' (yield_expr|testlist_star_expr))*) - annassign: ':' test ['=' test] - testlist_star_expr: (test|star_expr) (',' test|star_expr)* [','] - augassign: '+=' | '-=' | '*=' | '@=' | '/=' | '%=' | '&=' | '|=' | '^=' - | '<<=' | '>>=' | '**=' | '//=' + [('=' (yield_expr|testlist_star_expr))+ [TYPE_COMMENT]] ) + annassign: ':' test ['=' (yield_expr|testlist)] + testlist_star_expr: (test|star_expr) (',' (test|star_expr))* [','] + augassign: ('+=' | '-=' | '*=' | '@=' | '/=' | '%=' | '&=' | '|=' | '^=' | + '<<=' | '>>=' | '**=' | '//=') test: ... 
here starts the operator precedence dance */ + int num = NCH(n); - if (NCH(n) == 1) { + if (num == 1) { expr_ty e = ast_for_testlist(c, CHILD(n, 0)); if (!e) return NULL; @@ -3178,17 +3317,22 @@ ast_for_expr_stmt(struct compiling *c, const node *n) } } else { - int i; + int i, nch_minus_type, has_type_comment; asdl_seq *targets; node *value; expr_ty expression; + string type_comment; /* a normal assignment */ REQ(CHILD(n, 1), EQUAL); - targets = _Py_asdl_seq_new(NCH(n) / 2, c->c_arena); + + has_type_comment = TYPE(CHILD(n, num - 1)) == TYPE_COMMENT; + nch_minus_type = num - has_type_comment; + + targets = _Py_asdl_seq_new(nch_minus_type / 2, c->c_arena); if (!targets) return NULL; - for (i = 0; i < NCH(n) - 2; i += 2) { + for (i = 0; i < nch_minus_type - 2; i += 2) { expr_ty e; node *ch = CHILD(n, i); if (TYPE(ch) == yield_expr) { @@ -3205,14 +3349,21 @@ ast_for_expr_stmt(struct compiling *c, const node *n) asdl_seq_SET(targets, i / 2, e); } - value = CHILD(n, NCH(n) - 1); + value = CHILD(n, nch_minus_type - 1); if (TYPE(value) == testlist_star_expr) expression = ast_for_testlist(c, value); else expression = ast_for_expr(c, value); if (!expression) return NULL; - return Assign(targets, expression, LINENO(n), n->n_col_offset, + if (has_type_comment) { + type_comment = NEW_TYPE_COMMENT(CHILD(n, nch_minus_type)); + if (!type_comment) + return NULL; + } + else + type_comment = NULL; + return Assign(targets, expression, type_comment, LINENO(n), n->n_col_offset, n->n_end_lineno, n->n_end_col_offset, c->c_arena); } } @@ -3520,8 +3671,9 @@ ast_for_import_stmt(struct compiling *c, const node *n) n = CHILD(n, idx); n_children = NCH(n); if (n_children % 2 == 0) { - ast_error(c, n, "trailing comma not allowed without" - " surrounding parentheses"); + ast_error(c, n, + "trailing comma not allowed without" + " surrounding parentheses"); return NULL; } break; @@ -3639,13 +3791,15 @@ ast_for_assert_stmt(struct compiling *c, const node *n) static asdl_seq * ast_for_suite(struct compiling *c, const node *n) { - /* suite: simple_stmt | NEWLINE INDENT stmt+ DEDENT */ + /* suite: simple_stmt | NEWLINE [TYPE_COMMENT NEWLINE] INDENT stmt+ DEDENT */ asdl_seq *seq; stmt_ty s; int i, total, num, end, pos = 0; node *ch; - REQ(n, suite); + if (TYPE(n) != func_body_suite) { + REQ(n, suite); + } total = num_stmts(n); seq = _Py_asdl_seq_new(total, c->c_arena); @@ -3669,7 +3823,13 @@ ast_for_suite(struct compiling *c, const node *n) } } else { - for (i = 2; i < (NCH(n) - 1); i++) { + i = 2; + if (TYPE(CHILD(n, 1)) == TYPE_COMMENT) { + i += 2; + REQ(CHILD(n, 2), NEWLINE); + } + + for (; i < (NCH(n) - 1); i++) { ch = CHILD(n, i); REQ(ch, stmt); num = num_stmts(ch); @@ -3903,11 +4063,15 @@ ast_for_for_stmt(struct compiling *c, const node *n0, bool is_async) expr_ty target, first; const node *node_target; int end_lineno, end_col_offset; - /* for_stmt: 'for' exprlist 'in' testlist ':' suite ['else' ':' suite] */ + int has_type_comment; + string type_comment; + /* for_stmt: 'for' exprlist 'in' testlist ':' [TYPE_COMMENT] suite ['else' ':' suite] */ REQ(n, for_stmt); - if (NCH(n) == 9) { - seq = ast_for_suite(c, CHILD(n, 8)); + has_type_comment = TYPE(CHILD(n, 5)) == TYPE_COMMENT; + + if (NCH(n) == 9 + has_type_comment) { + seq = ast_for_suite(c, CHILD(n, 8 + has_type_comment)); if (!seq) return NULL; } @@ -3929,7 +4093,7 @@ ast_for_for_stmt(struct compiling *c, const node *n0, bool is_async) expression = ast_for_testlist(c, CHILD(n, 3)); if (!expression) return NULL; - suite_seq = ast_for_suite(c, CHILD(n, 5)); + suite_seq = 
ast_for_suite(c, CHILD(n, 5 + has_type_comment)); if (!suite_seq) return NULL; @@ -3938,12 +4102,21 @@ ast_for_for_stmt(struct compiling *c, const node *n0, bool is_async) } else { get_last_end_pos(suite_seq, &end_lineno, &end_col_offset); } + + if (has_type_comment) { + type_comment = NEW_TYPE_COMMENT(CHILD(n, 5)); + if (!type_comment) + return NULL; + } + else + type_comment = NULL; + if (is_async) - return AsyncFor(target, expression, suite_seq, seq, + return AsyncFor(target, expression, suite_seq, seq, type_comment, LINENO(n0), n0->n_col_offset, end_lineno, end_col_offset, c->c_arena); else - return For(target, expression, suite_seq, seq, + return For(target, expression, suite_seq, seq, type_comment, LINENO(n), n->n_col_offset, end_lineno, end_col_offset, c->c_arena); } @@ -4111,21 +4284,25 @@ ast_for_with_item(struct compiling *c, const node *n) return withitem(context_expr, optional_vars, c->c_arena); } -/* with_stmt: 'with' with_item (',' with_item)* ':' suite */ +/* with_stmt: 'with' with_item (',' with_item)* ':' [TYPE_COMMENT] suite */ static stmt_ty ast_for_with_stmt(struct compiling *c, const node *n0, bool is_async) { const node * const n = is_async ? CHILD(n0, 1) : n0; - int i, n_items, end_lineno, end_col_offset; + int i, n_items, nch_minus_type, has_type_comment, end_lineno, end_col_offset; asdl_seq *items, *body; + string type_comment; REQ(n, with_stmt); - n_items = (NCH(n) - 2) / 2; + has_type_comment = TYPE(CHILD(n, NCH(n) - 2)) == TYPE_COMMENT; + nch_minus_type = NCH(n) - has_type_comment; + + n_items = (nch_minus_type - 2) / 2; items = _Py_asdl_seq_new(n_items, c->c_arena); if (!items) return NULL; - for (i = 1; i < NCH(n) - 2; i += 2) { + for (i = 1; i < nch_minus_type - 2; i += 2) { withitem_ty item = ast_for_with_item(c, CHILD(n, i)); if (!item) return NULL; @@ -4137,11 +4314,19 @@ ast_for_with_stmt(struct compiling *c, const node *n0, bool is_async) return NULL; get_last_end_pos(body, &end_lineno, &end_col_offset); + if (has_type_comment) { + type_comment = NEW_TYPE_COMMENT(CHILD(n, NCH(n) - 2)); + if (!type_comment) + return NULL; + } + else + type_comment = NULL; + if (is_async) - return AsyncWith(items, body, LINENO(n0), n0->n_col_offset, + return AsyncWith(items, body, type_comment, LINENO(n0), n0->n_col_offset, end_lineno, end_col_offset, c->c_arena); else - return With(items, body, LINENO(n), n->n_col_offset, + return With(items, body, type_comment, LINENO(n), n->n_col_offset, end_lineno, end_col_offset, c->c_arena); } @@ -4768,8 +4953,9 @@ fstring_find_expr(const char **str, const char *end, int raw, int recurse_lvl, if (ch == '\\') { /* Error: can't include a backslash character, inside parens or strings or not. */ - ast_error(c, n, "f-string expression part " - "cannot include a backslash"); + ast_error(c, n, + "f-string expression part " + "cannot include a backslash"); return -1; } if (quote_char) { @@ -4893,8 +5079,9 @@ fstring_find_expr(const char **str, const char *end, int raw, int recurse_lvl, /* Validate the conversion. 
*/ if (!(conversion == 's' || conversion == 'r' || conversion == 'a')) { - ast_error(c, n, "f-string: invalid conversion character: " - "expected 's', 'r', or 'a'"); + ast_error(c, n, + "f-string: invalid conversion character: " + "expected 's', 'r', or 'a'"); return -1; } } @@ -5446,7 +5633,8 @@ parsestr(struct compiling *c, const node *n, int *bytesmode, int *rawmode, const char *ch; for (ch = s; *ch; ch++) { if (Py_CHARMASK(*ch) >= 0x80) { - ast_error(c, n, "bytes can only contain ASCII " + ast_error(c, n, + "bytes can only contain ASCII " "literal characters."); return -1; } diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c index 332142fc6ffc..f9b901f7e59f 100644 --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -765,13 +765,13 @@ builtin_compile_impl(PyObject *module, PyObject *source, PyObject *filename, int compile_mode = -1; int is_ast; PyCompilerFlags cf; - int start[] = {Py_file_input, Py_eval_input, Py_single_input}; + int start[] = {Py_file_input, Py_eval_input, Py_single_input, Py_func_type_input}; PyObject *result; cf.cf_flags = flags | PyCF_SOURCE_IS_UTF8; if (flags & - ~(PyCF_MASK | PyCF_MASK_OBSOLETE | PyCF_DONT_IMPLY_DEDENT | PyCF_ONLY_AST)) + ~(PyCF_MASK | PyCF_MASK_OBSOLETE | PyCF_DONT_IMPLY_DEDENT | PyCF_ONLY_AST | PyCF_TYPE_COMMENTS)) { PyErr_SetString(PyExc_ValueError, "compile(): unrecognised flags"); @@ -795,9 +795,21 @@ builtin_compile_impl(PyObject *module, PyObject *source, PyObject *filename, compile_mode = 1; else if (strcmp(mode, "single") == 0) compile_mode = 2; + else if (strcmp(mode, "func_type") == 0) { + if (!(flags & PyCF_ONLY_AST)) { + PyErr_SetString(PyExc_ValueError, + "compile() mode 'func_type' requires flag PyCF_ONLY_AST"); + goto error; + } + compile_mode = 3; + } else { - PyErr_SetString(PyExc_ValueError, - "compile() mode must be 'exec', 'eval' or 'single'"); + const char *msg; + if (flags & PyCF_ONLY_AST) + msg = "compile() mode must be 'exec', 'eval', 'single' or 'func_type'"; + else + msg = "compile() mode must be 'exec', 'eval' or 'single'"; + PyErr_SetString(PyExc_ValueError, msg); goto error; } diff --git a/Python/graminit.c b/Python/graminit.c index 225d32793906..6e0f19891baa 100644 --- a/Python/graminit.c +++ b/Python/graminit.c @@ -135,30 +135,35 @@ static arc arcs_7_3[2] = { static arc arcs_7_4[1] = { {26, 6}, }; -static arc arcs_7_5[1] = { +static arc arcs_7_5[2] = { {28, 7}, + {29, 8}, }; static arc arcs_7_6[1] = { {27, 5}, }; static arc arcs_7_7[1] = { - {0, 7}, + {29, 8}, +}; +static arc arcs_7_8[1] = { + {0, 8}, }; -static state states_7[8] = { +static state states_7[9] = { {1, arcs_7_0}, {1, arcs_7_1}, {1, arcs_7_2}, {2, arcs_7_3}, {1, arcs_7_4}, - {1, arcs_7_5}, + {2, arcs_7_5}, {1, arcs_7_6}, {1, arcs_7_7}, + {1, arcs_7_8}, }; static arc arcs_8_0[1] = { {13, 1}, }; static arc arcs_8_1[2] = { - {29, 2}, + {30, 2}, {15, 3}, }; static arc arcs_8_2[1] = { @@ -174,107 +179,144 @@ static state states_8[4] = { {1, arcs_8_3}, }; static arc arcs_9_0[3] = { - {30, 1}, - {33, 2}, - {34, 3}, + {31, 1}, + {34, 2}, + {35, 3}, }; -static arc arcs_9_1[3] = { - {31, 4}, - {32, 5}, +static arc arcs_9_1[4] = { + {32, 4}, + {33, 5}, + {28, 6}, {0, 1}, }; -static arc arcs_9_2[3] = { - {30, 6}, - {32, 7}, +static arc arcs_9_2[4] = { + {31, 7}, + {33, 8}, + {28, 6}, {0, 2}, }; static arc arcs_9_3[1] = { - {30, 8}, + {31, 9}, }; static arc arcs_9_4[1] = { - {26, 9}, + {26, 10}, }; -static arc arcs_9_5[4] = { - {30, 10}, - {33, 11}, - {34, 3}, +static arc arcs_9_5[5] = { + {28, 11}, + {31, 12}, + {34, 13}, + {35, 3}, {0, 5}, }; -static 
arc arcs_9_6[2] = { - {32, 7}, +static arc arcs_9_6[1] = { {0, 6}, }; static arc arcs_9_7[3] = { - {30, 12}, - {34, 3}, + {33, 8}, + {28, 6}, {0, 7}, }; -static arc arcs_9_8[2] = { - {32, 13}, +static arc arcs_9_8[4] = { + {28, 14}, + {31, 15}, + {35, 3}, {0, 8}, }; -static arc arcs_9_9[2] = { - {32, 5}, +static arc arcs_9_9[3] = { + {33, 16}, + {28, 6}, {0, 9}, }; static arc arcs_9_10[3] = { - {32, 5}, - {31, 4}, + {33, 5}, + {28, 6}, {0, 10}, }; -static arc arcs_9_11[3] = { - {30, 14}, - {32, 15}, +static arc arcs_9_11[4] = { + {31, 12}, + {34, 13}, + {35, 3}, {0, 11}, }; -static arc arcs_9_12[3] = { - {32, 7}, - {31, 16}, +static arc arcs_9_12[4] = { + {33, 5}, + {32, 4}, + {28, 6}, {0, 12}, }; -static arc arcs_9_13[1] = { +static arc arcs_9_13[4] = { + {31, 17}, + {33, 18}, + {28, 6}, {0, 13}, }; -static arc arcs_9_14[2] = { - {32, 15}, +static arc arcs_9_14[3] = { + {31, 15}, + {35, 3}, {0, 14}, }; -static arc arcs_9_15[3] = { - {30, 17}, - {34, 3}, +static arc arcs_9_15[4] = { + {33, 8}, + {32, 19}, + {28, 6}, {0, 15}, }; -static arc arcs_9_16[1] = { - {26, 6}, +static arc arcs_9_16[2] = { + {28, 6}, + {0, 16}, }; static arc arcs_9_17[3] = { - {32, 15}, - {31, 18}, + {33, 18}, + {28, 6}, {0, 17}, }; -static arc arcs_9_18[1] = { - {26, 14}, +static arc arcs_9_18[4] = { + {28, 20}, + {31, 21}, + {35, 3}, + {0, 18}, +}; +static arc arcs_9_19[1] = { + {26, 7}, }; -static state states_9[19] = { +static arc arcs_9_20[3] = { + {31, 21}, + {35, 3}, + {0, 20}, +}; +static arc arcs_9_21[4] = { + {33, 18}, + {32, 22}, + {28, 6}, + {0, 21}, +}; +static arc arcs_9_22[1] = { + {26, 17}, +}; +static state states_9[23] = { {3, arcs_9_0}, - {3, arcs_9_1}, - {3, arcs_9_2}, + {4, arcs_9_1}, + {4, arcs_9_2}, {1, arcs_9_3}, {1, arcs_9_4}, - {4, arcs_9_5}, - {2, arcs_9_6}, + {5, arcs_9_5}, + {1, arcs_9_6}, {3, arcs_9_7}, - {2, arcs_9_8}, - {2, arcs_9_9}, + {4, arcs_9_8}, + {3, arcs_9_9}, {3, arcs_9_10}, - {3, arcs_9_11}, - {3, arcs_9_12}, - {1, arcs_9_13}, - {2, arcs_9_14}, - {3, arcs_9_15}, - {1, arcs_9_16}, + {4, arcs_9_11}, + {4, arcs_9_12}, + {4, arcs_9_13}, + {3, arcs_9_14}, + {4, arcs_9_15}, + {2, arcs_9_16}, {3, arcs_9_17}, - {1, arcs_9_18}, + {4, arcs_9_18}, + {1, arcs_9_19}, + {3, arcs_9_20}, + {4, arcs_9_21}, + {1, arcs_9_22}, }; static arc arcs_10_0[1] = { {23, 1}, @@ -296,82 +338,82 @@ static state states_10[4] = { {1, arcs_10_3}, }; static arc arcs_11_0[3] = { - {36, 1}, - {33, 2}, - {34, 3}, + {37, 1}, + {34, 2}, + {35, 3}, }; static arc arcs_11_1[3] = { - {31, 4}, - {32, 5}, + {32, 4}, + {33, 5}, {0, 1}, }; static arc arcs_11_2[3] = { - {36, 6}, - {32, 7}, + {37, 6}, + {33, 7}, {0, 2}, }; static arc arcs_11_3[1] = { - {36, 8}, + {37, 8}, }; static arc arcs_11_4[1] = { {26, 9}, }; static arc arcs_11_5[4] = { - {36, 10}, - {33, 11}, - {34, 3}, + {37, 10}, + {34, 11}, + {35, 3}, {0, 5}, }; static arc arcs_11_6[2] = { - {32, 7}, + {33, 7}, {0, 6}, }; static arc arcs_11_7[3] = { - {36, 12}, - {34, 3}, + {37, 12}, + {35, 3}, {0, 7}, }; static arc arcs_11_8[2] = { - {32, 13}, + {33, 13}, {0, 8}, }; static arc arcs_11_9[2] = { - {32, 5}, + {33, 5}, {0, 9}, }; static arc arcs_11_10[3] = { - {32, 5}, - {31, 4}, + {33, 5}, + {32, 4}, {0, 10}, }; static arc arcs_11_11[3] = { - {36, 14}, - {32, 15}, + {37, 14}, + {33, 15}, {0, 11}, }; static arc arcs_11_12[3] = { - {32, 7}, - {31, 16}, + {33, 7}, + {32, 16}, {0, 12}, }; static arc arcs_11_13[1] = { {0, 13}, }; static arc arcs_11_14[2] = { - {32, 15}, + {33, 15}, {0, 14}, }; static arc arcs_11_15[3] = { - {36, 17}, - {34, 3}, + {37, 17}, + {35, 3}, {0, 
15}, }; static arc arcs_11_16[1] = { {26, 6}, }; static arc arcs_11_17[3] = { - {32, 15}, - {31, 18}, + {33, 15}, + {32, 18}, {0, 17}, }; static arc arcs_11_18[1] = { @@ -420,14 +462,14 @@ static state states_13[2] = { {1, arcs_13_1}, }; static arc arcs_14_0[1] = { - {37, 1}, + {38, 1}, }; static arc arcs_14_1[2] = { - {38, 2}, + {39, 2}, {2, 3}, }; static arc arcs_14_2[2] = { - {37, 1}, + {38, 1}, {2, 3}, }; static arc arcs_14_3[1] = { @@ -440,7 +482,6 @@ static state states_14[4] = { {1, arcs_14_3}, }; static arc arcs_15_0[8] = { - {39, 1}, {40, 1}, {41, 1}, {42, 1}, @@ -448,6 +489,7 @@ static arc arcs_15_0[8] = { {44, 1}, {45, 1}, {46, 1}, + {47, 1}, }; static arc arcs_15_1[1] = { {0, 1}, @@ -457,27 +499,28 @@ static state states_15[2] = { {1, arcs_15_1}, }; static arc arcs_16_0[1] = { - {47, 1}, + {48, 1}, }; static arc arcs_16_1[4] = { - {48, 2}, - {49, 3}, - {31, 4}, + {49, 2}, + {50, 3}, + {32, 4}, {0, 1}, }; static arc arcs_16_2[1] = { {0, 2}, }; static arc arcs_16_3[2] = { - {50, 2}, + {51, 2}, {9, 2}, }; static arc arcs_16_4[2] = { - {50, 5}, - {47, 5}, + {51, 5}, + {48, 5}, }; -static arc arcs_16_5[2] = { - {31, 4}, +static arc arcs_16_5[3] = { + {32, 4}, + {28, 2}, {0, 5}, }; static state states_16[6] = { @@ -486,7 +529,7 @@ static state states_16[6] = { {1, arcs_16_2}, {2, arcs_16_3}, {2, arcs_16_4}, - {2, arcs_16_5}, + {3, arcs_16_5}, }; static arc arcs_17_0[1] = { {27, 1}, @@ -495,11 +538,11 @@ static arc arcs_17_1[1] = { {26, 2}, }; static arc arcs_17_2[2] = { - {31, 3}, + {32, 3}, {0, 2}, }; static arc arcs_17_3[2] = { - {50, 4}, + {51, 4}, {9, 4}, }; static arc arcs_17_4[1] = { @@ -514,15 +557,15 @@ static state states_17[5] = { }; static arc arcs_18_0[2] = { {26, 1}, - {51, 1}, + {52, 1}, }; static arc arcs_18_1[2] = { - {32, 2}, + {33, 2}, {0, 1}, }; static arc arcs_18_2[3] = { {26, 1}, - {51, 1}, + {52, 1}, {0, 2}, }; static state states_18[3] = { @@ -531,7 +574,6 @@ static state states_18[3] = { {3, arcs_18_2}, }; static arc arcs_19_0[13] = { - {52, 1}, {53, 1}, {54, 1}, {55, 1}, @@ -544,6 +586,7 @@ static arc arcs_19_0[13] = { {62, 1}, {63, 1}, {64, 1}, + {65, 1}, }; static arc arcs_19_1[1] = { {0, 1}, @@ -553,10 +596,10 @@ static state states_19[2] = { {1, arcs_19_1}, }; static arc arcs_20_0[1] = { - {65, 1}, + {66, 1}, }; static arc arcs_20_1[1] = { - {66, 2}, + {67, 2}, }; static arc arcs_20_2[1] = { {0, 2}, @@ -567,7 +610,7 @@ static state states_20[3] = { {1, arcs_20_2}, }; static arc arcs_21_0[1] = { - {67, 1}, + {68, 1}, }; static arc arcs_21_1[1] = { {0, 1}, @@ -577,11 +620,11 @@ static state states_21[2] = { {1, arcs_21_1}, }; static arc arcs_22_0[5] = { - {68, 1}, {69, 1}, {70, 1}, {71, 1}, {72, 1}, + {73, 1}, }; static arc arcs_22_1[1] = { {0, 1}, @@ -591,7 +634,7 @@ static state states_22[2] = { {1, arcs_22_1}, }; static arc arcs_23_0[1] = { - {73, 1}, + {74, 1}, }; static arc arcs_23_1[1] = { {0, 1}, @@ -601,7 +644,7 @@ static state states_23[2] = { {1, arcs_23_1}, }; static arc arcs_24_0[1] = { - {74, 1}, + {75, 1}, }; static arc arcs_24_1[1] = { {0, 1}, @@ -611,10 +654,10 @@ static state states_24[2] = { {1, arcs_24_1}, }; static arc arcs_25_0[1] = { - {75, 1}, + {76, 1}, }; static arc arcs_25_1[2] = { - {47, 2}, + {48, 2}, {0, 1}, }; static arc arcs_25_2[1] = { @@ -626,7 +669,7 @@ static state states_25[3] = { {1, arcs_25_2}, }; static arc arcs_26_0[1] = { - {50, 1}, + {51, 1}, }; static arc arcs_26_1[1] = { {0, 1}, @@ -636,14 +679,14 @@ static state states_26[2] = { {1, arcs_26_1}, }; static arc arcs_27_0[1] = { - {76, 1}, + {77, 1}, }; static arc 
arcs_27_1[2] = { {26, 2}, {0, 1}, }; static arc arcs_27_2[2] = { - {77, 3}, + {78, 3}, {0, 2}, }; static arc arcs_27_3[1] = { @@ -660,8 +703,8 @@ static state states_27[5] = { {1, arcs_27_4}, }; static arc arcs_28_0[2] = { - {78, 1}, {79, 1}, + {80, 1}, }; static arc arcs_28_1[1] = { {0, 1}, @@ -671,10 +714,10 @@ static state states_28[2] = { {1, arcs_28_1}, }; static arc arcs_29_0[1] = { - {80, 1}, + {81, 1}, }; static arc arcs_29_1[1] = { - {81, 2}, + {82, 2}, }; static arc arcs_29_2[1] = { {0, 2}, @@ -685,32 +728,32 @@ static state states_29[3] = { {1, arcs_29_2}, }; static arc arcs_30_0[1] = { - {77, 1}, + {78, 1}, }; static arc arcs_30_1[3] = { - {82, 2}, {83, 2}, + {84, 2}, {12, 3}, }; static arc arcs_30_2[4] = { - {82, 2}, {83, 2}, + {84, 2}, {12, 3}, - {80, 4}, + {81, 4}, }; static arc arcs_30_3[1] = { - {80, 4}, + {81, 4}, }; static arc arcs_30_4[3] = { - {33, 5}, + {34, 5}, {13, 6}, - {84, 5}, + {85, 5}, }; static arc arcs_30_5[1] = { {0, 5}, }; static arc arcs_30_6[1] = { - {84, 7}, + {85, 7}, }; static arc arcs_30_7[1] = { {15, 5}, @@ -729,7 +772,7 @@ static arc arcs_31_0[1] = { {23, 1}, }; static arc arcs_31_1[2] = { - {86, 2}, + {87, 2}, {0, 1}, }; static arc arcs_31_2[1] = { @@ -748,7 +791,7 @@ static arc arcs_32_0[1] = { {12, 1}, }; static arc arcs_32_1[2] = { - {86, 2}, + {87, 2}, {0, 1}, }; static arc arcs_32_2[1] = { @@ -764,14 +807,14 @@ static state states_32[4] = { {1, arcs_32_3}, }; static arc arcs_33_0[1] = { - {85, 1}, + {86, 1}, }; static arc arcs_33_1[2] = { - {32, 2}, + {33, 2}, {0, 1}, }; static arc arcs_33_2[2] = { - {85, 1}, + {86, 1}, {0, 2}, }; static state states_33[3] = { @@ -780,10 +823,10 @@ static state states_33[3] = { {2, arcs_33_2}, }; static arc arcs_34_0[1] = { - {87, 1}, + {88, 1}, }; static arc arcs_34_1[2] = { - {32, 0}, + {33, 0}, {0, 1}, }; static state states_34[2] = { @@ -794,7 +837,7 @@ static arc arcs_35_0[1] = { {23, 1}, }; static arc arcs_35_1[2] = { - {82, 0}, + {83, 0}, {0, 1}, }; static state states_35[2] = { @@ -802,13 +845,13 @@ static state states_35[2] = { {2, arcs_35_1}, }; static arc arcs_36_0[1] = { - {88, 1}, + {89, 1}, }; static arc arcs_36_1[1] = { {23, 2}, }; static arc arcs_36_2[2] = { - {32, 1}, + {33, 1}, {0, 2}, }; static state states_36[3] = { @@ -817,13 +860,13 @@ static state states_36[3] = { {2, arcs_36_2}, }; static arc arcs_37_0[1] = { - {89, 1}, + {90, 1}, }; static arc arcs_37_1[1] = { {23, 2}, }; static arc arcs_37_2[2] = { - {32, 1}, + {33, 1}, {0, 2}, }; static state states_37[3] = { @@ -832,13 +875,13 @@ static state states_37[3] = { {2, arcs_37_2}, }; static arc arcs_38_0[1] = { - {90, 1}, + {91, 1}, }; static arc arcs_38_1[1] = { {26, 2}, }; static arc arcs_38_2[2] = { - {32, 3}, + {33, 3}, {0, 2}, }; static arc arcs_38_3[1] = { @@ -855,15 +898,15 @@ static state states_38[5] = { {1, arcs_38_4}, }; static arc arcs_39_0[9] = { - {91, 1}, {92, 1}, {93, 1}, {94, 1}, {95, 1}, + {96, 1}, {19, 1}, {18, 1}, {17, 1}, - {96, 1}, + {97, 1}, }; static arc arcs_39_1[1] = { {0, 1}, @@ -877,8 +920,8 @@ static arc arcs_40_0[1] = { }; static arc arcs_40_1[3] = { {19, 2}, - {95, 2}, - {93, 2}, + {96, 2}, + {94, 2}, }; static arc arcs_40_2[1] = { {0, 2}, @@ -889,27 +932,27 @@ static state states_40[3] = { {1, arcs_40_2}, }; static arc arcs_41_0[1] = { - {97, 1}, + {98, 1}, }; static arc arcs_41_1[1] = { - {98, 2}, + {99, 2}, }; static arc arcs_41_2[1] = { {27, 3}, }; static arc arcs_41_3[1] = { - {28, 4}, + {100, 4}, }; static arc arcs_41_4[3] = { - {99, 1}, - {100, 5}, + {101, 1}, + {102, 5}, {0, 4}, }; static arc 
arcs_41_5[1] = { {27, 6}, }; static arc arcs_41_6[1] = { - {28, 7}, + {100, 7}, }; static arc arcs_41_7[1] = { {0, 7}, @@ -925,7 +968,7 @@ static state states_41[8] = { {1, arcs_41_7}, }; static arc arcs_42_0[1] = { - {101, 1}, + {103, 1}, }; static arc arcs_42_1[1] = { {26, 2}, @@ -934,17 +977,17 @@ static arc arcs_42_2[1] = { {27, 3}, }; static arc arcs_42_3[1] = { - {28, 4}, + {100, 4}, }; static arc arcs_42_4[2] = { - {100, 5}, + {102, 5}, {0, 4}, }; static arc arcs_42_5[1] = { {27, 6}, }; static arc arcs_42_6[1] = { - {28, 7}, + {100, 7}, }; static arc arcs_42_7[1] = { {0, 7}, @@ -960,13 +1003,13 @@ static state states_42[8] = { {1, arcs_42_7}, }; static arc arcs_43_0[1] = { - {102, 1}, + {104, 1}, }; static arc arcs_43_1[1] = { - {66, 2}, + {67, 2}, }; static arc arcs_43_2[1] = { - {103, 3}, + {105, 3}, }; static arc arcs_43_3[1] = { {9, 4}, @@ -974,46 +1017,51 @@ static arc arcs_43_3[1] = { static arc arcs_43_4[1] = { {27, 5}, }; -static arc arcs_43_5[1] = { +static arc arcs_43_5[2] = { {28, 6}, + {100, 7}, }; -static arc arcs_43_6[2] = { +static arc arcs_43_6[1] = { {100, 7}, - {0, 6}, }; -static arc arcs_43_7[1] = { - {27, 8}, +static arc arcs_43_7[2] = { + {102, 8}, + {0, 7}, }; static arc arcs_43_8[1] = { - {28, 9}, + {27, 9}, }; static arc arcs_43_9[1] = { - {0, 9}, + {100, 10}, +}; +static arc arcs_43_10[1] = { + {0, 10}, }; -static state states_43[10] = { +static state states_43[11] = { {1, arcs_43_0}, {1, arcs_43_1}, {1, arcs_43_2}, {1, arcs_43_3}, {1, arcs_43_4}, - {1, arcs_43_5}, - {2, arcs_43_6}, - {1, arcs_43_7}, + {2, arcs_43_5}, + {1, arcs_43_6}, + {2, arcs_43_7}, {1, arcs_43_8}, {1, arcs_43_9}, + {1, arcs_43_10}, }; static arc arcs_44_0[1] = { - {104, 1}, + {106, 1}, }; static arc arcs_44_1[1] = { {27, 2}, }; static arc arcs_44_2[1] = { - {28, 3}, + {100, 3}, }; static arc arcs_44_3[2] = { - {105, 4}, - {106, 5}, + {107, 4}, + {108, 5}, }; static arc arcs_44_4[1] = { {27, 6}, @@ -1022,15 +1070,15 @@ static arc arcs_44_5[1] = { {27, 7}, }; static arc arcs_44_6[1] = { - {28, 8}, + {100, 8}, }; static arc arcs_44_7[1] = { - {28, 9}, + {100, 9}, }; static arc arcs_44_8[4] = { - {105, 4}, - {100, 10}, - {106, 5}, + {107, 4}, + {102, 10}, + {108, 5}, {0, 8}, }; static arc arcs_44_9[1] = { @@ -1040,10 +1088,10 @@ static arc arcs_44_10[1] = { {27, 11}, }; static arc arcs_44_11[1] = { - {28, 12}, + {100, 12}, }; static arc arcs_44_12[2] = { - {106, 5}, + {108, 5}, {0, 12}, }; static state states_44[13] = { @@ -1062,37 +1110,42 @@ static state states_44[13] = { {2, arcs_44_12}, }; static arc arcs_45_0[1] = { - {107, 1}, + {109, 1}, }; static arc arcs_45_1[1] = { - {108, 2}, + {110, 2}, }; static arc arcs_45_2[2] = { - {32, 1}, + {33, 1}, {27, 3}, }; -static arc arcs_45_3[1] = { +static arc arcs_45_3[2] = { {28, 4}, + {100, 5}, }; static arc arcs_45_4[1] = { - {0, 4}, + {100, 5}, }; -static state states_45[5] = { +static arc arcs_45_5[1] = { + {0, 5}, +}; +static state states_45[6] = { {1, arcs_45_0}, {1, arcs_45_1}, {2, arcs_45_2}, - {1, arcs_45_3}, + {2, arcs_45_3}, {1, arcs_45_4}, + {1, arcs_45_5}, }; static arc arcs_46_0[1] = { {26, 1}, }; static arc arcs_46_1[2] = { - {86, 2}, + {87, 2}, {0, 1}, }; static arc arcs_46_2[1] = { - {109, 3}, + {111, 3}, }; static arc arcs_46_3[1] = { {0, 3}, @@ -1104,14 +1157,14 @@ static state states_46[4] = { {1, arcs_46_3}, }; static arc arcs_47_0[1] = { - {110, 1}, + {112, 1}, }; static arc arcs_47_1[2] = { {26, 2}, {0, 1}, }; static arc arcs_47_2[2] = { - {86, 3}, + {87, 3}, {0, 2}, }; static arc arcs_47_3[1] = { @@ -1135,14 +1188,14 @@ 
static arc arcs_48_1[1] = { {0, 1}, }; static arc arcs_48_2[1] = { - {111, 3}, + {113, 3}, }; static arc arcs_48_3[1] = { {6, 4}, }; static arc arcs_48_4[2] = { {6, 4}, - {112, 1}, + {114, 1}, }; static state states_48[5] = { {2, arcs_48_0}, @@ -1155,7 +1208,7 @@ static arc arcs_49_0[1] = { {26, 1}, }; static arc arcs_49_1[2] = { - {113, 2}, + {115, 2}, {0, 1}, }; static arc arcs_49_2[1] = { @@ -1171,21 +1224,21 @@ static state states_49[4] = { {1, arcs_49_3}, }; static arc arcs_50_0[2] = { - {114, 1}, - {115, 2}, + {116, 1}, + {117, 2}, }; static arc arcs_50_1[2] = { - {97, 3}, + {98, 3}, {0, 1}, }; static arc arcs_50_2[1] = { {0, 2}, }; static arc arcs_50_3[1] = { - {114, 4}, + {116, 4}, }; static arc arcs_50_4[1] = { - {100, 5}, + {102, 5}, }; static arc arcs_50_5[1] = { {26, 2}, @@ -1199,8 +1252,8 @@ static state states_50[6] = { {1, arcs_50_5}, }; static arc arcs_51_0[2] = { - {114, 1}, - {117, 1}, + {116, 1}, + {119, 1}, }; static arc arcs_51_1[1] = { {0, 1}, @@ -1210,10 +1263,10 @@ static state states_51[2] = { {1, arcs_51_1}, }; static arc arcs_52_0[1] = { - {118, 1}, + {120, 1}, }; static arc arcs_52_1[2] = { - {35, 2}, + {36, 2}, {27, 3}, }; static arc arcs_52_2[1] = { @@ -1233,17 +1286,17 @@ static state states_52[5] = { {1, arcs_52_4}, }; static arc arcs_53_0[1] = { - {118, 1}, + {120, 1}, }; static arc arcs_53_1[2] = { - {35, 2}, + {36, 2}, {27, 3}, }; static arc arcs_53_2[1] = { {27, 3}, }; static arc arcs_53_3[1] = { - {116, 4}, + {118, 4}, }; static arc arcs_53_4[1] = { {0, 4}, @@ -1256,10 +1309,10 @@ static state states_53[5] = { {1, arcs_53_4}, }; static arc arcs_54_0[1] = { - {119, 1}, + {121, 1}, }; static arc arcs_54_1[2] = { - {120, 0}, + {122, 0}, {0, 1}, }; static state states_54[2] = { @@ -1267,10 +1320,10 @@ static state states_54[2] = { {2, arcs_54_1}, }; static arc arcs_55_0[1] = { - {121, 1}, + {123, 1}, }; static arc arcs_55_1[2] = { - {122, 0}, + {124, 0}, {0, 1}, }; static state states_55[2] = { @@ -1278,11 +1331,11 @@ static state states_55[2] = { {2, arcs_55_1}, }; static arc arcs_56_0[2] = { - {123, 1}, - {124, 2}, + {125, 1}, + {126, 2}, }; static arc arcs_56_1[1] = { - {121, 2}, + {123, 2}, }; static arc arcs_56_2[1] = { {0, 2}, @@ -1293,10 +1346,10 @@ static state states_56[3] = { {1, arcs_56_2}, }; static arc arcs_57_0[1] = { - {109, 1}, + {111, 1}, }; static arc arcs_57_1[2] = { - {125, 0}, + {127, 0}, {0, 1}, }; static state states_57[2] = { @@ -1304,25 +1357,25 @@ static state states_57[2] = { {2, arcs_57_1}, }; static arc arcs_58_0[10] = { - {126, 1}, - {127, 1}, {128, 1}, {129, 1}, {130, 1}, {131, 1}, {132, 1}, - {103, 1}, - {123, 2}, - {133, 3}, + {133, 1}, + {134, 1}, + {105, 1}, + {125, 2}, + {135, 3}, }; static arc arcs_58_1[1] = { {0, 1}, }; static arc arcs_58_2[1] = { - {103, 1}, + {105, 1}, }; static arc arcs_58_3[2] = { - {123, 1}, + {125, 1}, {0, 3}, }; static state states_58[4] = { @@ -1332,10 +1385,10 @@ static state states_58[4] = { {2, arcs_58_3}, }; static arc arcs_59_0[1] = { - {33, 1}, + {34, 1}, }; static arc arcs_59_1[1] = { - {109, 2}, + {111, 2}, }; static arc arcs_59_2[1] = { {0, 2}, @@ -1346,10 +1399,10 @@ static state states_59[3] = { {1, arcs_59_2}, }; static arc arcs_60_0[1] = { - {134, 1}, + {136, 1}, }; static arc arcs_60_1[2] = { - {135, 0}, + {137, 0}, {0, 1}, }; static state states_60[2] = { @@ -1357,10 +1410,10 @@ static state states_60[2] = { {2, arcs_60_1}, }; static arc arcs_61_0[1] = { - {136, 1}, + {138, 1}, }; static arc arcs_61_1[2] = { - {137, 0}, + {139, 0}, {0, 1}, }; static state states_61[2] = { @@ 
-1368,10 +1421,10 @@ static state states_61[2] = { {2, arcs_61_1}, }; static arc arcs_62_0[1] = { - {138, 1}, + {140, 1}, }; static arc arcs_62_1[2] = { - {139, 0}, + {141, 0}, {0, 1}, }; static state states_62[2] = { @@ -1379,11 +1432,11 @@ static state states_62[2] = { {2, arcs_62_1}, }; static arc arcs_63_0[1] = { - {140, 1}, + {142, 1}, }; static arc arcs_63_1[3] = { - {141, 0}, - {142, 0}, + {143, 0}, + {144, 0}, {0, 1}, }; static state states_63[2] = { @@ -1391,11 +1444,11 @@ static state states_63[2] = { {3, arcs_63_1}, }; static arc arcs_64_0[1] = { - {143, 1}, + {145, 1}, }; static arc arcs_64_1[3] = { - {144, 0}, - {145, 0}, + {146, 0}, + {147, 0}, {0, 1}, }; static state states_64[2] = { @@ -1403,14 +1456,14 @@ static state states_64[2] = { {3, arcs_64_1}, }; static arc arcs_65_0[1] = { - {146, 1}, + {148, 1}, }; static arc arcs_65_1[6] = { - {33, 0}, + {34, 0}, {11, 0}, - {147, 0}, - {148, 0}, {149, 0}, + {150, 0}, + {151, 0}, {0, 1}, }; static state states_65[2] = { @@ -1418,13 +1471,13 @@ static state states_65[2] = { {6, arcs_65_1}, }; static arc arcs_66_0[4] = { - {144, 1}, - {145, 1}, - {150, 1}, - {151, 2}, + {146, 1}, + {147, 1}, + {152, 1}, + {153, 2}, }; static arc arcs_66_1[1] = { - {146, 2}, + {148, 2}, }; static arc arcs_66_2[1] = { {0, 2}, @@ -1435,14 +1488,14 @@ static state states_66[3] = { {1, arcs_66_2}, }; static arc arcs_67_0[1] = { - {152, 1}, + {154, 1}, }; static arc arcs_67_1[2] = { - {34, 2}, + {35, 2}, {0, 1}, }; static arc arcs_67_2[1] = { - {146, 3}, + {148, 3}, }; static arc arcs_67_3[1] = { {0, 3}, @@ -1454,14 +1507,14 @@ static state states_67[4] = { {1, arcs_67_3}, }; static arc arcs_68_0[2] = { - {153, 1}, - {154, 2}, + {155, 1}, + {156, 2}, }; static arc arcs_68_1[1] = { - {154, 2}, + {156, 2}, }; static arc arcs_68_2[2] = { - {155, 2}, + {157, 2}, {0, 2}, }; static state states_68[3] = { @@ -1471,44 +1524,44 @@ static state states_68[3] = { }; static arc arcs_69_0[10] = { {13, 1}, - {157, 2}, - {159, 3}, + {159, 2}, + {161, 3}, {23, 4}, - {162, 4}, - {163, 5}, - {83, 4}, {164, 4}, - {165, 4}, + {165, 5}, + {84, 4}, {166, 4}, + {167, 4}, + {168, 4}, }; static arc arcs_69_1[3] = { - {50, 6}, - {156, 6}, + {51, 6}, + {158, 6}, {15, 4}, }; static arc arcs_69_2[2] = { - {156, 7}, - {158, 4}, + {158, 7}, + {160, 4}, }; static arc arcs_69_3[2] = { - {160, 8}, - {161, 4}, + {162, 8}, + {163, 4}, }; static arc arcs_69_4[1] = { {0, 4}, }; static arc arcs_69_5[2] = { - {163, 5}, + {165, 5}, {0, 5}, }; static arc arcs_69_6[1] = { {15, 4}, }; static arc arcs_69_7[1] = { - {158, 4}, + {160, 4}, }; static arc arcs_69_8[1] = { - {161, 4}, + {163, 4}, }; static state states_69[9] = { {10, arcs_69_0}, @@ -1522,24 +1575,24 @@ static state states_69[9] = { {1, arcs_69_8}, }; static arc arcs_70_0[2] = { - {98, 1}, - {51, 1}, + {99, 1}, + {52, 1}, }; static arc arcs_70_1[3] = { - {167, 2}, - {32, 3}, + {169, 2}, + {33, 3}, {0, 1}, }; static arc arcs_70_2[1] = { {0, 2}, }; static arc arcs_70_3[3] = { - {98, 4}, - {51, 4}, + {99, 4}, + {52, 4}, {0, 3}, }; static arc arcs_70_4[2] = { - {32, 3}, + {33, 3}, {0, 4}, }; static state states_70[5] = { @@ -1551,15 +1604,15 @@ static state states_70[5] = { }; static arc arcs_71_0[3] = { {13, 1}, - {157, 2}, - {82, 3}, + {159, 2}, + {83, 3}, }; static arc arcs_71_1[2] = { {14, 4}, {15, 5}, }; static arc arcs_71_2[1] = { - {168, 6}, + {170, 6}, }; static arc arcs_71_3[1] = { {23, 5}, @@ -1571,7 +1624,7 @@ static arc arcs_71_5[1] = { {0, 5}, }; static arc arcs_71_6[1] = { - {158, 5}, + {160, 5}, }; static state states_71[7] = { 
{3, arcs_71_0}, @@ -1583,14 +1636,14 @@ static state states_71[7] = { {1, arcs_71_6}, }; static arc arcs_72_0[1] = { - {169, 1}, + {171, 1}, }; static arc arcs_72_1[2] = { - {32, 2}, + {33, 2}, {0, 1}, }; static arc arcs_72_2[2] = { - {169, 1}, + {171, 1}, {0, 2}, }; static state states_72[3] = { @@ -1608,11 +1661,11 @@ static arc arcs_73_1[2] = { }; static arc arcs_73_2[3] = { {26, 3}, - {170, 4}, + {172, 4}, {0, 2}, }; static arc arcs_73_3[2] = { - {170, 4}, + {172, 4}, {0, 3}, }; static arc arcs_73_4[1] = { @@ -1641,16 +1694,16 @@ static state states_74[3] = { {1, arcs_74_2}, }; static arc arcs_75_0[2] = { - {109, 1}, - {51, 1}, + {111, 1}, + {52, 1}, }; static arc arcs_75_1[2] = { - {32, 2}, + {33, 2}, {0, 1}, }; static arc arcs_75_2[3] = { - {109, 1}, - {51, 1}, + {111, 1}, + {52, 1}, {0, 2}, }; static state states_75[3] = { @@ -1662,7 +1715,7 @@ static arc arcs_76_0[1] = { {26, 1}, }; static arc arcs_76_1[2] = { - {32, 2}, + {33, 2}, {0, 1}, }; static arc arcs_76_2[2] = { @@ -1676,21 +1729,21 @@ static state states_76[3] = { }; static arc arcs_77_0[3] = { {26, 1}, - {34, 2}, - {51, 3}, + {35, 2}, + {52, 3}, }; static arc arcs_77_1[4] = { {27, 4}, - {167, 5}, - {32, 6}, + {169, 5}, + {33, 6}, {0, 1}, }; static arc arcs_77_2[1] = { - {109, 7}, + {111, 7}, }; static arc arcs_77_3[3] = { - {167, 5}, - {32, 6}, + {169, 5}, + {33, 6}, {0, 3}, }; static arc arcs_77_4[1] = { @@ -1701,34 +1754,34 @@ static arc arcs_77_5[1] = { }; static arc arcs_77_6[3] = { {26, 8}, - {51, 8}, + {52, 8}, {0, 6}, }; static arc arcs_77_7[3] = { - {167, 5}, - {32, 9}, + {169, 5}, + {33, 9}, {0, 7}, }; static arc arcs_77_8[2] = { - {32, 6}, + {33, 6}, {0, 8}, }; static arc arcs_77_9[3] = { {26, 10}, - {34, 11}, + {35, 11}, {0, 9}, }; static arc arcs_77_10[1] = { {27, 12}, }; static arc arcs_77_11[1] = { - {109, 13}, + {111, 13}, }; static arc arcs_77_12[1] = { {26, 13}, }; static arc arcs_77_13[2] = { - {32, 9}, + {33, 9}, {0, 13}, }; static state states_77[14] = { @@ -1748,7 +1801,7 @@ static state states_77[14] = { {2, arcs_77_13}, }; static arc arcs_78_0[1] = { - {171, 1}, + {173, 1}, }; static arc arcs_78_1[1] = { {23, 2}, @@ -1762,7 +1815,7 @@ static arc arcs_78_3[2] = { {15, 6}, }; static arc arcs_78_4[1] = { - {28, 7}, + {100, 7}, }; static arc arcs_78_5[1] = { {15, 6}, @@ -1784,14 +1837,14 @@ static state states_78[8] = { {1, arcs_78_7}, }; static arc arcs_79_0[1] = { - {172, 1}, + {174, 1}, }; static arc arcs_79_1[2] = { - {32, 2}, + {33, 2}, {0, 1}, }; static arc arcs_79_2[2] = { - {172, 1}, + {174, 1}, {0, 2}, }; static state states_79[3] = { @@ -1801,13 +1854,13 @@ static state states_79[3] = { }; static arc arcs_80_0[3] = { {26, 1}, + {35, 2}, {34, 2}, - {33, 2}, }; static arc arcs_80_1[4] = { - {167, 3}, - {113, 2}, - {31, 2}, + {169, 3}, + {115, 2}, + {32, 2}, {0, 1}, }; static arc arcs_80_2[1] = { @@ -1823,8 +1876,8 @@ static state states_80[4] = { {1, arcs_80_3}, }; static arc arcs_81_0[2] = { - {167, 1}, - {174, 1}, + {169, 1}, + {176, 1}, }; static arc arcs_81_1[1] = { {0, 1}, @@ -1834,19 +1887,19 @@ static state states_81[2] = { {1, arcs_81_1}, }; static arc arcs_82_0[1] = { - {102, 1}, + {104, 1}, }; static arc arcs_82_1[1] = { - {66, 2}, + {67, 2}, }; static arc arcs_82_2[1] = { - {103, 3}, + {105, 3}, }; static arc arcs_82_3[1] = { - {114, 4}, + {116, 4}, }; static arc arcs_82_4[2] = { - {173, 5}, + {175, 5}, {0, 4}, }; static arc arcs_82_5[1] = { @@ -1862,10 +1915,10 @@ static state states_82[6] = { }; static arc arcs_83_0[2] = { {21, 1}, - {175, 2}, + {177, 2}, }; static arc arcs_83_1[1] 
= { - {175, 2}, + {177, 2}, }; static arc arcs_83_2[1] = { {0, 2}, @@ -1876,13 +1929,13 @@ static state states_83[3] = { {1, arcs_83_2}, }; static arc arcs_84_0[1] = { - {97, 1}, + {98, 1}, }; static arc arcs_84_1[1] = { - {116, 2}, + {118, 2}, }; static arc arcs_84_2[2] = { - {173, 3}, + {175, 3}, {0, 2}, }; static arc arcs_84_3[1] = { @@ -1905,10 +1958,10 @@ static state states_85[2] = { {1, arcs_85_1}, }; static arc arcs_86_0[1] = { - {177, 1}, + {179, 1}, }; static arc arcs_86_1[2] = { - {178, 2}, + {180, 2}, {0, 1}, }; static arc arcs_86_2[1] = { @@ -1920,8 +1973,8 @@ static state states_86[3] = { {1, arcs_86_2}, }; static arc arcs_87_0[2] = { - {77, 1}, - {47, 2}, + {78, 1}, + {48, 2}, }; static arc arcs_87_1[1] = { {26, 2}, @@ -1934,13 +1987,148 @@ static state states_87[3] = { {1, arcs_87_1}, {1, arcs_87_2}, }; -static dfa dfas[88] = { +static arc arcs_88_0[2] = { + {3, 1}, + {2, 2}, +}; +static arc arcs_88_1[1] = { + {0, 1}, +}; +static arc arcs_88_2[2] = { + {28, 3}, + {113, 4}, +}; +static arc arcs_88_3[1] = { + {2, 5}, +}; +static arc arcs_88_4[1] = { + {6, 6}, +}; +static arc arcs_88_5[1] = { + {113, 4}, +}; +static arc arcs_88_6[2] = { + {6, 6}, + {114, 1}, +}; +static state states_88[7] = { + {2, arcs_88_0}, + {1, arcs_88_1}, + {2, arcs_88_2}, + {1, arcs_88_3}, + {1, arcs_88_4}, + {1, arcs_88_5}, + {2, arcs_88_6}, +}; +static arc arcs_89_0[1] = { + {182, 1}, +}; +static arc arcs_89_1[2] = { + {2, 1}, + {7, 2}, +}; +static arc arcs_89_2[1] = { + {0, 2}, +}; +static state states_89[3] = { + {1, arcs_89_0}, + {2, arcs_89_1}, + {1, arcs_89_2}, +}; +static arc arcs_90_0[1] = { + {13, 1}, +}; +static arc arcs_90_1[2] = { + {183, 2}, + {15, 3}, +}; +static arc arcs_90_2[1] = { + {15, 3}, +}; +static arc arcs_90_3[1] = { + {25, 4}, +}; +static arc arcs_90_4[1] = { + {26, 5}, +}; +static arc arcs_90_5[1] = { + {0, 5}, +}; +static state states_90[6] = { + {1, arcs_90_0}, + {2, arcs_90_1}, + {1, arcs_90_2}, + {1, arcs_90_3}, + {1, arcs_90_4}, + {1, arcs_90_5}, +}; +static arc arcs_91_0[3] = { + {26, 1}, + {34, 2}, + {35, 3}, +}; +static arc arcs_91_1[2] = { + {33, 4}, + {0, 1}, +}; +static arc arcs_91_2[3] = { + {26, 5}, + {33, 6}, + {0, 2}, +}; +static arc arcs_91_3[1] = { + {26, 7}, +}; +static arc arcs_91_4[4] = { + {26, 1}, + {34, 8}, + {35, 3}, + {0, 4}, +}; +static arc arcs_91_5[2] = { + {33, 6}, + {0, 5}, +}; +static arc arcs_91_6[2] = { + {26, 5}, + {35, 3}, +}; +static arc arcs_91_7[1] = { + {0, 7}, +}; +static arc arcs_91_8[3] = { + {26, 9}, + {33, 10}, + {0, 8}, +}; +static arc arcs_91_9[2] = { + {33, 10}, + {0, 9}, +}; +static arc arcs_91_10[2] = { + {26, 9}, + {35, 3}, +}; +static state states_91[11] = { + {3, arcs_91_0}, + {2, arcs_91_1}, + {3, arcs_91_2}, + {1, arcs_91_3}, + {4, arcs_91_4}, + {2, arcs_91_5}, + {2, arcs_91_6}, + {1, arcs_91_7}, + {3, arcs_91_8}, + {2, arcs_91_9}, + {2, arcs_91_10}, +}; +static dfa dfas[92] = { {256, "single_input", 0, 3, states_0, - "\004\050\340\000\002\000\000\000\012\076\011\007\142\011\100\010\000\000\103\242\174\010\002"}, + "\004\050\340\000\004\000\000\000\024\174\022\016\204\045\000\041\000\000\014\211\362\041\010"}, {257, "file_input", 0, 2, states_1, - "\204\050\340\000\002\000\000\000\012\076\011\007\142\011\100\010\000\000\103\242\174\010\002"}, + "\204\050\340\000\004\000\000\000\024\174\022\016\204\045\000\041\000\000\014\211\362\041\010"}, {258, "eval_input", 0, 3, states_2, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + 
"\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {259, "decorator", 0, 7, states_3, "\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {260, "decorators", 0, 2, states_4, @@ -1949,54 +2137,54 @@ static dfa dfas[88] = { "\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {262, "async_funcdef", 0, 3, states_6, "\000\000\040\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, - {263, "funcdef", 0, 8, states_7, + {263, "funcdef", 0, 9, states_7, "\000\000\100\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {264, "parameters", 0, 4, states_8, "\000\040\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, - {265, "typedargslist", 0, 19, states_9, - "\000\000\200\000\006\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + {265, "typedargslist", 0, 23, states_9, + "\000\000\200\000\014\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {266, "tfpdef", 0, 4, states_10, "\000\000\200\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {267, "varargslist", 0, 19, states_11, - "\000\000\200\000\006\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\200\000\014\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {268, "vfpdef", 0, 2, states_12, "\000\000\200\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {269, "stmt", 0, 2, states_13, - "\000\050\340\000\002\000\000\000\012\076\011\007\142\011\100\010\000\000\103\242\174\010\002"}, + "\000\050\340\000\004\000\000\000\024\174\022\016\204\045\000\041\000\000\014\211\362\041\010"}, {270, "simple_stmt", 0, 4, states_14, - "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\100\010\000\000\103\242\174\000\002"}, + "\000\040\200\000\004\000\000\000\024\174\022\016\000\000\000\041\000\000\014\211\362\001\010"}, {271, "small_stmt", 0, 2, states_15, - "\000\040\200\000\002\000\000\000\012\076\011\007\000\000\100\010\000\000\103\242\174\000\002"}, + "\000\040\200\000\004\000\000\000\024\174\022\016\000\000\000\041\000\000\014\211\362\001\010"}, {272, "expr_stmt", 0, 6, states_16, - "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\004\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {273, "annassign", 0, 5, states_17, "\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {274, "testlist_star_expr", 0, 3, states_18, - "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\004\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {275, "augassign", 0, 2, states_19, - "\000\000\000\000\000\000\360\377\001\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\340\377\003\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {276, "del_stmt", 0, 3, states_20, - "\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {277, "pass_stmt", 0, 2, states_21, - "\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + 
"\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {278, "flow_stmt", 0, 2, states_22, - "\000\000\000\000\000\000\000\000\000\036\000\000\000\000\000\000\000\000\000\000\000\000\002"}, + "\000\000\000\000\000\000\000\000\000\074\000\000\000\000\000\000\000\000\000\000\000\000\010"}, {279, "break_stmt", 0, 2, states_23, - "\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000\000\000\000"}, - {280, "continue_stmt", 0, 2, states_24, "\000\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000\000\000\000\000"}, - {281, "return_stmt", 0, 3, states_25, + {280, "continue_stmt", 0, 2, states_24, "\000\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + {281, "return_stmt", 0, 3, states_25, + "\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {282, "yield_stmt", 0, 2, states_26, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\002"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\010"}, {283, "raise_stmt", 0, 5, states_27, - "\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {284, "import_stmt", 0, 2, states_28, - "\000\000\000\000\000\000\000\000\000\040\001\000\000\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\100\002\000\000\000\000\000\000\000\000\000\000\000\000"}, {285, "import_name", 0, 3, states_29, - "\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000\000\000"}, {286, "import_from", 0, 8, states_30, - "\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {287, "import_as_name", 0, 4, states_31, "\000\000\200\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {288, "dotted_as_name", 0, 4, states_32, @@ -2008,111 +2196,119 @@ static dfa dfas[88] = { {291, "dotted_name", 0, 2, states_35, "\000\000\200\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {292, "global_stmt", 0, 3, states_36, - "\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000\000\000\000"}, - {293, "nonlocal_stmt", 0, 3, states_37, "\000\000\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000\000"}, - {294, "assert_stmt", 0, 5, states_38, + {293, "nonlocal_stmt", 0, 3, states_37, "\000\000\000\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000\000\000"}, + {294, "assert_stmt", 0, 5, states_38, + "\000\000\000\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000"}, {295, "compound_stmt", 0, 2, states_39, - "\000\010\140\000\000\000\000\000\000\000\000\000\142\011\000\000\000\000\000\000\000\010\000"}, + "\000\010\140\000\000\000\000\000\000\000\000\000\204\045\000\000\000\000\000\000\000\040\000"}, {296, "async_stmt", 0, 3, states_40, "\000\000\040\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {297, "if_stmt", 0, 8, states_41, - 
"\000\000\000\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000\000"}, {298, "while_stmt", 0, 8, states_42, - "\000\000\000\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000\000"}, - {299, "for_stmt", 0, 10, states_43, - "\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000"}, - {300, "try_stmt", 0, 13, states_44, + "\000\000\000\000\000\000\000\000\000\000\000\000\200\000\000\000\000\000\000\000\000\000\000"}, + {299, "for_stmt", 0, 11, states_43, "\000\000\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000\000"}, - {301, "with_stmt", 0, 5, states_45, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\000"}, + {300, "try_stmt", 0, 13, states_44, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000"}, + {301, "with_stmt", 0, 6, states_45, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\040\000\000\000\000\000\000\000\000\000"}, {302, "with_item", 0, 4, states_46, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {303, "except_clause", 0, 5, states_47, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000"}, {304, "suite", 0, 5, states_48, - "\004\040\200\000\002\000\000\000\012\076\011\007\000\000\100\010\000\000\103\242\174\000\002"}, + "\004\040\200\000\004\000\000\000\024\174\022\016\000\000\000\041\000\000\014\211\362\001\010"}, {305, "namedexpr_test", 0, 4, states_49, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {306, "test", 0, 6, states_50, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {307, "test_nocond", 0, 2, states_51, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {308, "lambdef", 0, 5, states_52, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000"}, {309, "lambdef_nocond", 0, 5, states_53, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000"}, {310, "or_test", 0, 2, states_54, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\040\000\000\014\211\362\001\000"}, {311, "and_test", 0, 2, states_55, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\040\000\000\014\211\362\001\000"}, {312, "not_test", 0, 3, states_56, - 
"\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\040\000\000\014\211\362\001\000"}, {313, "comparison", 0, 2, states_57, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\014\211\362\001\000"}, {314, "comp_op", 0, 4, states_58, - "\000\000\000\000\000\000\000\000\000\000\000\000\200\000\000\310\077\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\002\000\040\377\000\000\000\000\000\000"}, {315, "star_expr", 0, 3, states_59, - "\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\004\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {316, "expr", 0, 2, states_60, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\014\211\362\001\000"}, {317, "xor_expr", 0, 2, states_61, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\014\211\362\001\000"}, {318, "and_expr", 0, 2, states_62, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\014\211\362\001\000"}, {319, "shift_expr", 0, 2, states_63, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\014\211\362\001\000"}, {320, "arith_expr", 0, 2, states_64, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\014\211\362\001\000"}, {321, "term", 0, 2, states_65, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\014\211\362\001\000"}, {322, "factor", 0, 3, states_66, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\014\211\362\001\000"}, {323, "power", 0, 4, states_67, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\210\362\001\000"}, {324, "atom_expr", 0, 3, states_68, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\210\362\001\000"}, {325, "atom", 0, 9, states_69, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\240\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\000\200\362\001\000"}, {326, "testlist_comp", 0, 5, states_70, - "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\004\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {327, "trailer", 0, 7, states_71, - 
"\000\040\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\040\000\000\000"}, + "\000\040\000\000\000\000\000\000\000\000\010\000\000\000\000\000\000\000\000\200\000\000\000"}, {328, "subscriptlist", 0, 3, states_72, - "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\010\000\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {329, "subscript", 0, 5, states_73, - "\000\040\200\010\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\010\000\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {330, "sliceop", 0, 3, states_74, "\000\000\000\010\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {331, "exprlist", 0, 3, states_75, - "\000\040\200\000\002\000\000\000\000\000\010\000\000\000\000\000\000\000\103\242\174\000\000"}, + "\000\040\200\000\004\000\000\000\000\000\020\000\000\000\000\000\000\000\014\211\362\001\000"}, {332, "testlist", 0, 3, states_76, - "\000\040\200\000\000\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\000\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {333, "dictorsetmaker", 0, 14, states_77, - "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\014\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {334, "classdef", 0, 8, states_78, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\010\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\040\000"}, {335, "arglist", 0, 3, states_79, - "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\014\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {336, "argument", 0, 4, states_80, - "\000\040\200\000\006\000\000\000\000\000\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\014\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, {337, "comp_iter", 0, 2, states_81, - "\000\000\040\000\000\000\000\000\000\000\000\000\102\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\040\000\000\000\000\000\000\000\000\000\004\001\000\000\000\000\000\000\000\000\000"}, {338, "sync_comp_for", 0, 6, states_82, - "\000\000\000\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000\000"}, {339, "comp_for", 0, 3, states_83, - "\000\000\040\000\000\000\000\000\000\000\000\000\100\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\040\000\000\000\000\000\000\000\000\000\000\001\000\000\000\000\000\000\000\000\000"}, {340, "comp_if", 0, 4, states_84, - "\000\000\000\000\000\000\000\000\000\000\000\000\002\000\000\000\000\000\000\000\000\000\000"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\004\000\000\000\000\000\000\000\000\000\000"}, {341, "encoding_decl", 0, 2, states_85, "\000\000\200\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, {342, "yield_expr", 0, 3, states_86, - "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\002"}, + "\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\010"}, {343, "yield_arg", 0, 3, states_87, - 
"\000\040\200\000\002\000\000\000\000\040\010\000\000\000\100\010\000\000\103\242\174\000\000"}, + "\000\040\200\000\004\000\000\000\000\100\020\000\000\000\000\041\000\000\014\211\362\001\000"}, + {344, "func_body_suite", 0, 7, states_88, + "\004\040\200\000\004\000\000\000\024\174\022\016\000\000\000\041\000\000\014\211\362\001\010"}, + {345, "func_type_input", 0, 3, states_89, + "\000\040\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + {346, "func_type", 0, 6, states_90, + "\000\040\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"}, + {347, "typelist", 0, 11, states_91, + "\000\040\200\000\014\000\000\000\000\000\020\000\000\000\000\041\000\000\014\211\362\001\000"}, }; -static label labels[179] = { +static label labels[184] = { {0, "EMPTY"}, {256, 0}, {4, 0}, @@ -2141,7 +2337,8 @@ static label labels[179] = { {51, 0}, {306, 0}, {11, 0}, - {304, 0}, + {56, 0}, + {344, 0}, {265, 0}, {266, 0}, {22, 0}, @@ -2212,6 +2409,7 @@ static label labels[179] = { {296, 0}, {1, "if"}, {305, 0}, + {304, 0}, {1, "elif"}, {1, "else"}, {1, "while"}, @@ -2292,10 +2490,13 @@ static label labels[179] = { {341, 0}, {1, "yield"}, {343, 0}, + {345, 0}, + {346, 0}, + {347, 0}, }; grammar _PyParser_Grammar = { - 88, + 92, dfas, - {179, labels}, + {184, labels}, 256 }; diff --git a/Python/pythonrun.c b/Python/pythonrun.c index 9b6371d9c0da..c7a622c83d37 100644 --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -158,6 +158,8 @@ static int PARSER_FLAGS(PyCompilerFlags *flags) parser_flags |= PyPARSE_IGNORE_COOKIE; if (flags->cf_flags & CO_FUTURE_BARRY_AS_BDFL) parser_flags |= PyPARSE_BARRY_AS_BDFL; + if (flags->cf_flags & PyCF_TYPE_COMMENTS) + parser_flags |= PyPARSE_TYPE_COMMENTS; return parser_flags; } From webhook-mailer at python.org Thu Jan 31 18:03:55 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Thu, 31 Jan 2019 23:03:55 -0000 Subject: [Python-checkins] Consistently move the misses update to just before the user function call (GH-11715) Message-ID: https://github.com/python/cpython/commit/ffdf1c30ab6940a5efe6f33e61678021d9fd14b6 commit: ffdf1c30ab6940a5efe6f33e61678021d9fd14b6 branch: master author: Raymond Hettinger committer: GitHub date: 2019-01-31T15:03:38-08:00 summary: Consistently move the misses update to just before the user function call (GH-11715) files: M Lib/functools.py M Modules/_functoolsmodule.c diff --git a/Lib/functools.py b/Lib/functools.py index 6233c30c203e..fe47600caa1a 100644 --- a/Lib/functools.py +++ b/Lib/functools.py @@ -541,10 +541,10 @@ def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): if maxsize == 0: def wrapper(*args, **kwds): - # No caching -- just a statistics update after a successful call + # No caching -- just a statistics update nonlocal misses - result = user_function(*args, **kwds) misses += 1 + result = user_function(*args, **kwds) return result elif maxsize is None: @@ -557,9 +557,9 @@ def wrapper(*args, **kwds): if result is not sentinel: hits += 1 return result + misses += 1 result = user_function(*args, **kwds) cache[key] = result - misses += 1 return result else: diff --git a/Modules/_functoolsmodule.c b/Modules/_functoolsmodule.c index 141210204ca5..d72aaff2b13c 100644 --- a/Modules/_functoolsmodule.c +++ b/Modules/_functoolsmodule.c @@ -796,10 +796,12 @@ lru_cache_make_key(PyObject *args, PyObject *kwds, int typed) static PyObject * uncached_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds) { - PyObject *result = 
PyObject_Call(self->func, args, kwds); + PyObject *result; + + self->misses++; + result = PyObject_Call(self->func, args, kwds); if (!result) return NULL; - self->misses++; return result; } @@ -827,6 +829,7 @@ infinite_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwd Py_DECREF(key); return NULL; } + self->misses++; result = PyObject_Call(self->func, args, kwds); if (!result) { Py_DECREF(key); @@ -838,7 +841,6 @@ infinite_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwd return NULL; } Py_DECREF(key); - self->misses++; return result; } From webhook-mailer at python.org Thu Jan 31 18:35:05 2019 From: webhook-mailer at python.org (Raymond Hettinger) Date: Thu, 31 Jan 2019 23:35:05 -0000 Subject: [Python-checkins] Consistently move the misses update to just before the user function call (GH-11715) (GH-11716) Message-ID: https://github.com/python/cpython/commit/533a9b459b29e8c2254b6bf6d82fd1cfa83be8cd commit: 533a9b459b29e8c2254b6bf6d82fd1cfa83be8cd branch: 3.7 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Raymond Hettinger date: 2019-01-31T15:35:00-08:00 summary: Consistently move the misses update to just before the user function call (GH-11715) (GH-11716) files: M Lib/functools.py M Modules/_functoolsmodule.c diff --git a/Lib/functools.py b/Lib/functools.py index 592f156fe426..b734899b56de 100644 --- a/Lib/functools.py +++ b/Lib/functools.py @@ -500,10 +500,10 @@ def _lru_cache_wrapper(user_function, maxsize, typed, _CacheInfo): if maxsize == 0: def wrapper(*args, **kwds): - # No caching -- just a statistics update after a successful call + # No caching -- just a statistics update nonlocal misses - result = user_function(*args, **kwds) misses += 1 + result = user_function(*args, **kwds) return result elif maxsize is None: @@ -516,9 +516,9 @@ def wrapper(*args, **kwds): if result is not sentinel: hits += 1 return result + misses += 1 result = user_function(*args, **kwds) cache[key] = result - misses += 1 return result else: diff --git a/Modules/_functoolsmodule.c b/Modules/_functoolsmodule.c index 4aca63e38c8b..90c698ee41e4 100644 --- a/Modules/_functoolsmodule.c +++ b/Modules/_functoolsmodule.c @@ -796,10 +796,12 @@ lru_cache_make_key(PyObject *args, PyObject *kwds, int typed) static PyObject * uncached_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwds) { - PyObject *result = PyObject_Call(self->func, args, kwds); + PyObject *result; + + self->misses++; + result = PyObject_Call(self->func, args, kwds); if (!result) return NULL; - self->misses++; return result; } @@ -827,6 +829,7 @@ infinite_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwd Py_DECREF(key); return NULL; } + self->misses++; result = PyObject_Call(self->func, args, kwds); if (!result) { Py_DECREF(key); @@ -838,7 +841,6 @@ infinite_lru_cache_wrapper(lru_cache_object *self, PyObject *args, PyObject *kwd return NULL; } Py_DECREF(key); - self->misses++; return result; }
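For context (not part of either patch), a minimal sketch of the user-visible effect of
moving the misses update to just before the user function call: with the new ordering,
an uncached call that raises an exception is still reported as a miss by cache_info(),
and the pure-Python and C implementations of functools.lru_cache now agree on this.
The function name used below is only an illustration.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def parse(value):
        # Any uncached call counts as a miss, even if it fails before
        # a result can be stored in the cache.
        return int(value)

    try:
        parse("not a number")      # uncached call that raises ValueError
    except ValueError:
        pass

    print(parse.cache_info())      # misses == 1, currsize == 0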