From python-checkins at python.org Thu Jan 1 01:12:48 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 01 Jan 2015 00:12:48 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_update_for_cop?= =?utf-8?q?yright_for_2015?= Message-ID: <20150101001232.11567.23068@psf.io> https://hg.python.org/cpython/rev/7d97f4d5fd36 changeset: 93999:7d97f4d5fd36 branch: 2.7 parent: 93984:8f92ab37dd3a user: Benjamin Peterson date: Wed Dec 31 18:09:36 2014 -0600 summary: update for copyright for 2015 files: Doc/copyright.rst | 2 +- Doc/license.rst | 2 +- LICENSE | 4 ++-- PC/python_nt.rc | 2 +- Python/getcopyright.c | 2 +- README | 2 +- 6 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Doc/copyright.rst b/Doc/copyright.rst --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Python and this documentation is: -Copyright ?? 2001-2014 Python Software Foundation. All rights reserved. +Copyright ?? 2001-2015 Python Software Foundation. All rights reserved. Copyright ?? 2000 BeOpen.com. All rights reserved. diff --git a/Doc/license.rst b/Doc/license.rst --- a/Doc/license.rst +++ b/Doc/license.rst @@ -84,7 +84,7 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ?? 2001-2014 Python Software Foundation; All Rights + copyright, i.e., "Copyright ?? 2001-2015 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. diff --git a/LICENSE b/LICENSE --- a/LICENSE +++ b/LICENSE @@ -74,8 +74,8 @@ distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014 Python Software Foundation; All Rights Reserved" are retained -in Python alone or in any derivative version prepared by Licensee. +2011, 2012, 2013, 2014, 2015 Python Software Foundation; All Rights Reserved" +are retained in Python alone or in any derivative version prepared by Licensee. 3. In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make diff --git a/PC/python_nt.rc b/PC/python_nt.rc --- a/PC/python_nt.rc +++ b/PC/python_nt.rc @@ -61,7 +61,7 @@ VALUE "FileDescription", "Python Core\0" VALUE "FileVersion", PYTHON_VERSION VALUE "InternalName", "Python DLL\0" - VALUE "LegalCopyright", "Copyright ? 2001-2014 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC.\0" + VALUE "LegalCopyright", "Copyright ? 2001-2015 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 
1991-1995 SMC.\0" VALUE "OriginalFilename", PYTHON_DLL_NAME "\0" VALUE "ProductName", "Python\0" VALUE "ProductVersion", PYTHON_VERSION diff --git a/Python/getcopyright.c b/Python/getcopyright.c --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static char cprt[] = "\ -Copyright (c) 2001-2014 Python Software Foundation.\n\ +Copyright (c) 2001-2015 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README b/README --- a/README +++ b/README @@ -2,7 +2,7 @@ ============================ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014 Python Software Foundation. All rights reserved. +2012, 2013, 2014, 2015 Python Software Foundation. All rights reserved. Copyright (c) 2000 BeOpen.com. All rights reserved. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 1 01:12:49 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 01 Jan 2015 00:12:49 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E2=29=3A_update_for_cop?= =?utf-8?q?yright_for_2015?= Message-ID: <20150101001231.72575.63690@psf.io> https://hg.python.org/cpython/rev/1806a00fd6b8 changeset: 93995:1806a00fd6b8 branch: 3.2 parent: 93985:223d0927e27d user: Benjamin Peterson date: Wed Dec 31 18:09:36 2014 -0600 summary: update for copyright for 2015 files: Doc/copyright.rst | 2 +- Doc/license.rst | 2 +- LICENSE | 4 ++-- PC/python_nt.rc | 2 +- Python/getcopyright.c | 2 +- README | 2 +- 6 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Doc/copyright.rst b/Doc/copyright.rst --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Python and this documentation is: -Copyright ?? 2001-2013 Python Software Foundation. All rights reserved. +Copyright ?? 2001-2015 Python Software Foundation. All rights reserved. Copyright ?? 2000 BeOpen.com. All rights reserved. diff --git a/Doc/license.rst b/Doc/license.rst --- a/Doc/license.rst +++ b/Doc/license.rst @@ -154,7 +154,7 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ?? 2001-2014 Python Software Foundation; All Rights + copyright, i.e., "Copyright ?? 2001-2015 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. diff --git a/LICENSE b/LICENSE --- a/LICENSE +++ b/LICENSE @@ -112,8 +112,8 @@ distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014 Python Software Foundation; All Rights Reserved" are retained -in Python alone or in any derivative version prepared by Licensee. +2011, 2012, 2013, 2014, 2015 Python Software Foundation; All Rights Reserved" +are retained in Python alone or in any derivative version prepared by Licensee. 3. 
In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make diff --git a/PC/python_nt.rc b/PC/python_nt.rc --- a/PC/python_nt.rc +++ b/PC/python_nt.rc @@ -61,7 +61,7 @@ VALUE "FileDescription", "Python Core\0" VALUE "FileVersion", PYTHON_VERSION VALUE "InternalName", "Python DLL\0" - VALUE "LegalCopyright", "Copyright ? 2001-2014 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC.\0" + VALUE "LegalCopyright", "Copyright ? 2001-2015 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC.\0" VALUE "OriginalFilename", PYTHON_DLL_NAME "\0" VALUE "ProductName", "Python\0" VALUE "ProductVersion", PYTHON_VERSION diff --git a/Python/getcopyright.c b/Python/getcopyright.c --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static char cprt[] = "\ -Copyright (c) 2001-2014 Python Software Foundation.\n\ +Copyright (c) 2001-2015 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README b/README --- a/README +++ b/README @@ -2,7 +2,7 @@ ============================ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014 Python Software Foundation. All rights reserved. +2012, 2013, 2014, 2015 Python Software Foundation. All rights reserved. Python 3.x is a new version of the language, which is incompatible with the 2.x line of releases. The language is mostly the same, but many details, especially -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 1 01:12:49 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 01 Jan 2015 00:12:49 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4zIC0+IDMuNCk6?= =?utf-8?q?_merge_3=2E3?= Message-ID: <20150101001232.72555.99543@psf.io> https://hg.python.org/cpython/rev/9c6bbd4fea86 changeset: 93997:9c6bbd4fea86 branch: 3.4 parent: 93993:29689050ec78 parent: 93996:3b202cc79a38 user: Benjamin Peterson date: Wed Dec 31 18:11:22 2014 -0600 summary: merge 3.3 files: Doc/copyright.rst | 2 +- Doc/license.rst | 2 +- LICENSE | 4 ++-- PC/python_nt.rc | 2 +- Python/getcopyright.c | 2 +- README | 2 +- 6 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Doc/copyright.rst b/Doc/copyright.rst --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Python and this documentation is: -Copyright ?? 2001-2014 Python Software Foundation. All rights reserved. +Copyright ?? 2001-2015 Python Software Foundation. All rights reserved. Copyright ?? 2000 BeOpen.com. All rights reserved. diff --git a/Doc/license.rst b/Doc/license.rst --- a/Doc/license.rst +++ b/Doc/license.rst @@ -84,7 +84,7 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ?? 2001-2014 Python Software Foundation; All + copyright, i.e., "Copyright ?? 2001-2015 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. 
diff --git a/LICENSE b/LICENSE --- a/LICENSE +++ b/LICENSE @@ -74,8 +74,8 @@ distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014 Python Software Foundation; All Rights Reserved" are -retained in Python alone or in any derivative version prepared by Licensee. +2011, 2012, 2013, 2014, 2015 Python Software Foundation; All Rights Reserved" +are retained in Python alone or in any derivative version prepared by Licensee. 3. In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make diff --git a/PC/python_nt.rc b/PC/python_nt.rc --- a/PC/python_nt.rc +++ b/PC/python_nt.rc @@ -61,7 +61,7 @@ VALUE "FileDescription", "Python Core\0" VALUE "FileVersion", PYTHON_VERSION VALUE "InternalName", "Python DLL\0" - VALUE "LegalCopyright", "Copyright ? 2001-2014 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC.\0" + VALUE "LegalCopyright", "Copyright ? 2001-2015 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC.\0" VALUE "OriginalFilename", PYTHON_DLL_NAME "\0" VALUE "ProductName", "Python\0" VALUE "ProductVersion", PYTHON_VERSION diff --git a/Python/getcopyright.c b/Python/getcopyright.c --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static const char cprt[] = "\ -Copyright (c) 2001-2014 Python Software Foundation.\n\ +Copyright (c) 2001-2015 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README b/README --- a/README +++ b/README @@ -2,7 +2,7 @@ ============================ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014 Python Software Foundation. All rights reserved. +2012, 2013, 2014, 2015 Python Software Foundation. All rights reserved. Python 3.x is a new version of the language, which is incompatible with the 2.x line of releases. The language is mostly the same, but many details, especially -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 1 01:12:48 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 01 Jan 2015 00:12:48 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4yIC0+IDMuMyk6?= =?utf-8?q?_merge_3=2E2?= Message-ID: <20150101001232.72581.27067@psf.io> https://hg.python.org/cpython/rev/3b202cc79a38 changeset: 93996:3b202cc79a38 branch: 3.3 parent: 93986:e15d93926e47 parent: 93995:1806a00fd6b8 user: Benjamin Peterson date: Wed Dec 31 18:10:13 2014 -0600 summary: merge 3.2 files: Doc/copyright.rst | 2 +- Doc/license.rst | 2 +- LICENSE | 4 ++-- PC/python_nt.rc | 2 +- Python/getcopyright.c | 2 +- README | 2 +- 6 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Doc/copyright.rst b/Doc/copyright.rst --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Python and this documentation is: -Copyright ?? 2001-2014 Python Software Foundation. All rights reserved. +Copyright ?? 2001-2015 Python Software Foundation. All rights reserved. Copyright ?? 2000 BeOpen.com. All rights reserved. 
diff --git a/Doc/license.rst b/Doc/license.rst --- a/Doc/license.rst +++ b/Doc/license.rst @@ -84,7 +84,7 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ?? 2001-2014 Python Software Foundation; All Rights + copyright, i.e., "Copyright ?? 2001-2015 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. diff --git a/LICENSE b/LICENSE --- a/LICENSE +++ b/LICENSE @@ -74,8 +74,8 @@ distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014 Python Software Foundation; All Rights Reserved" are retained -in Python alone or in any derivative version prepared by Licensee. +2011, 2012, 2013, 2014, 2015 Python Software Foundation; All Rights Reserved" +are retained in Python alone or in any derivative version prepared by Licensee. 3. In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make diff --git a/PC/python_nt.rc b/PC/python_nt.rc --- a/PC/python_nt.rc +++ b/PC/python_nt.rc @@ -61,7 +61,7 @@ VALUE "FileDescription", "Python Core\0" VALUE "FileVersion", PYTHON_VERSION VALUE "InternalName", "Python DLL\0" - VALUE "LegalCopyright", "Copyright ? 2001-2014 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC.\0" + VALUE "LegalCopyright", "Copyright ? 2001-2015 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC.\0" VALUE "OriginalFilename", PYTHON_DLL_NAME "\0" VALUE "ProductName", "Python\0" VALUE "ProductVersion", PYTHON_VERSION diff --git a/Python/getcopyright.c b/Python/getcopyright.c --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static const char cprt[] = "\ -Copyright (c) 2001-2014 Python Software Foundation.\n\ +Copyright (c) 2001-2015 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README b/README --- a/README +++ b/README @@ -2,7 +2,7 @@ ============================ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014 Python Software Foundation. All rights reserved. +2012, 2013, 2014, 2015 Python Software Foundation. All rights reserved. Python 3.x is a new version of the language, which is incompatible with the 2.x line of releases. 
The language is mostly the same, but many details, especially -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 1 01:12:48 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 01 Jan 2015 00:12:48 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?b?KTogbWVyZ2UgMy40?= Message-ID: <20150101001232.125892.62097@psf.io> https://hg.python.org/cpython/rev/3dba1fed51a0 changeset: 93998:3dba1fed51a0 parent: 93994:01437956ea67 parent: 93997:9c6bbd4fea86 user: Benjamin Peterson date: Wed Dec 31 18:11:34 2014 -0600 summary: merge 3.4 files: Doc/copyright.rst | 2 +- Doc/license.rst | 2 +- LICENSE | 4 ++-- PC/python_nt.rc | 2 +- Python/getcopyright.c | 2 +- README | 2 +- 6 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Doc/copyright.rst b/Doc/copyright.rst --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Python and this documentation is: -Copyright ?? 2001-2014 Python Software Foundation. All rights reserved. +Copyright ?? 2001-2015 Python Software Foundation. All rights reserved. Copyright ?? 2000 BeOpen.com. All rights reserved. diff --git a/Doc/license.rst b/Doc/license.rst --- a/Doc/license.rst +++ b/Doc/license.rst @@ -84,7 +84,7 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ?? 2001-2014 Python Software Foundation; All + copyright, i.e., "Copyright ?? 2001-2015 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. diff --git a/LICENSE b/LICENSE --- a/LICENSE +++ b/LICENSE @@ -74,8 +74,8 @@ distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014 Python Software Foundation; All Rights Reserved" are -retained in Python alone or in any derivative version prepared by Licensee. +2011, 2012, 2013, 2014, 2015 Python Software Foundation; All Rights Reserved" +are retained in Python alone or in any derivative version prepared by Licensee. 3. In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make diff --git a/PC/python_nt.rc b/PC/python_nt.rc --- a/PC/python_nt.rc +++ b/PC/python_nt.rc @@ -61,7 +61,7 @@ VALUE "FileDescription", "Python Core\0" VALUE "FileVersion", PYTHON_VERSION VALUE "InternalName", "Python DLL\0" - VALUE "LegalCopyright", "Copyright ? 2001-2014 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC.\0" + VALUE "LegalCopyright", "Copyright ? 2001-2015 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 
1991-1995 SMC.\0" VALUE "OriginalFilename", PYTHON_DLL_NAME "\0" VALUE "ProductName", "Python\0" VALUE "ProductVersion", PYTHON_VERSION diff --git a/Python/getcopyright.c b/Python/getcopyright.c --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static const char cprt[] = "\ -Copyright (c) 2001-2014 Python Software Foundation.\n\ +Copyright (c) 2001-2015 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README b/README --- a/README +++ b/README @@ -2,7 +2,7 @@ ==================================== Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013, 2014 Python Software Foundation. All rights reserved. +2012, 2013, 2014, 2015 Python Software Foundation. All rights reserved. Python 3.x is a new version of the language, which is incompatible with the 2.x line of releases. The language is mostly the same, but many details, especially -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 1 01:31:34 2015 From: python-checkins at python.org (ned.deily) Date: Thu, 01 Jan 2015 00:31:34 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Update_copyrig?= =?utf-8?q?ht_dates_in_OS_X_installer=2E?= Message-ID: <20150101003131.8749.5306@psf.io> https://hg.python.org/cpython/rev/4e067cf7e299 changeset: 94000:4e067cf7e299 branch: 2.7 user: Ned Deily date: Wed Dec 31 16:30:09 2014 -0800 summary: Update copyright dates in OS X installer. files: Mac/BuildScript/resources/License.rtf | 14 +++++--------- Mac/IDLE/Info.plist.in | 2 +- Mac/PythonLauncher/Info.plist.in | 2 +- Mac/Resources/app/Info.plist.in | 6 +++--- Mac/Resources/framework/Info.plist.in | 4 ++-- 5 files changed, 12 insertions(+), 16 deletions(-) diff --git a/Mac/BuildScript/resources/License.rtf b/Mac/BuildScript/resources/License.rtf --- a/Mac/BuildScript/resources/License.rtf +++ b/Mac/BuildScript/resources/License.rtf @@ -54,7 +54,7 @@ \b0 \ 1. This LICENSE AGREEMENT is between the Python Software Foundation ("PSF"), and the Individual or Organization ("Licensee") accessing and otherwise using this software ("Python") in source or binary form and its associated documentation.\ \ -2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee.\ +2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee.\ \ 3. 
In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make the derivative work available to others as provided herein, then Licensee hereby agrees to include in any such work a brief summary of the changes made to Python.\ \ @@ -124,22 +124,18 @@ STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.\ \ \ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural -\b \cf0 \ul \ulc0 LICENSES AND ACKNOWLEDGEMENTS FOR INCORPORATED SOFTWARE\ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural +\b \ul LICENSES AND ACKNOWLEDGEMENTS FOR INCORPORATED SOFTWARE\ -\b0 \cf0 \ulnone \ +\b0 \ulnone \ This installer incorporates portions of the following third-party software:\ \ \f2 $THIRD_PARTY_LIBS\ \ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural -\f0 \cf0 For licenses and acknowledgements for these and other third-party software incorporated in this Python distribution, please refer to the on-line documentation {\field{\*\fldinst{HYPERLINK "https://docs.python.org/$VERSION/license.html#licenses-and-acknowledgements-for-incorporated-software"}}{\fldrslt here}}.\ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural -\cf0 \ +\f0 For licenses and acknowledgements for these and other third-party software incorporated in this Python distribution, please refer to the on-line documentation {\field{\*\fldinst{HYPERLINK "https://docs.python.org/$VERSION/license.html#licenses-and-acknowledgements-for-incorporated-software"}}{\fldrslt here}}.\ +\ \ \ \ diff --git a/Mac/IDLE/Info.plist.in b/Mac/IDLE/Info.plist.in --- a/Mac/IDLE/Info.plist.in +++ b/Mac/IDLE/Info.plist.in @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %VERSION%, ? 2001-2014 Python Software Foundation + %VERSION%, ? 2001-2015 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable PythonLauncher CFBundleGetInfoString - %VERSION%, ? 2001-2014 Python Software Foundation + %VERSION%, ? 2001-2015 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -20,7 +20,7 @@ CFBundleExecutable Python CFBundleGetInfoString - %version%, (c) 2004-2014 Python Software Foundation. + %version%, (c) 2001-2015 Python Software Foundation. CFBundleHelpBookFolder Documentation @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2004-2014 Python Software Foundation. + %version%, (c) 2001-2015 Python Software Foundation. 
CFBundleName Python CFBundlePackageType @@ -55,7 +55,7 @@ NSAppleScriptEnabled NSHumanReadableCopyright - (c) 2014 Python Software Foundation. + (c) 2001-2015 Python Software Foundation. NSHighResolutionCapable diff --git a/Mac/Resources/framework/Info.plist.in b/Mac/Resources/framework/Info.plist.in --- a/Mac/Resources/framework/Info.plist.in +++ b/Mac/Resources/framework/Info.plist.in @@ -17,9 +17,9 @@ CFBundlePackageType FMWK CFBundleShortVersionString - %VERSION%, (c) 2004-2014 Python Software Foundation. + %VERSION%, (c) 2001-2015 Python Software Foundation. CFBundleLongVersionString - %VERSION%, (c) 2004-2014 Python Software Foundation. + %VERSION%, (c) 2001-2015 Python Software Foundation. CFBundleSignature ???? CFBundleVersion -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 1 01:31:34 2015 From: python-checkins at python.org (ned.deily) Date: Thu, 01 Jan 2015 00:31:34 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_Update_copyrig?= =?utf-8?q?ht_dates_in_OS_X_installer=2E?= Message-ID: <20150101003131.8743.91385@psf.io> https://hg.python.org/cpython/rev/50d581f69a73 changeset: 94001:50d581f69a73 branch: 3.4 parent: 93997:9c6bbd4fea86 user: Ned Deily date: Wed Dec 31 16:30:26 2014 -0800 summary: Update copyright dates in OS X installer. files: Mac/BuildScript/resources/License.rtf | 14 +++++--------- Mac/IDLE/IDLE.app/Contents/Info.plist | 2 +- Mac/PythonLauncher/Info.plist.in | 2 +- Mac/Resources/app/Info.plist.in | 6 +++--- Mac/Resources/framework/Info.plist.in | 4 ++-- 5 files changed, 12 insertions(+), 16 deletions(-) diff --git a/Mac/BuildScript/resources/License.rtf b/Mac/BuildScript/resources/License.rtf --- a/Mac/BuildScript/resources/License.rtf +++ b/Mac/BuildScript/resources/License.rtf @@ -54,7 +54,7 @@ \b0 \ 1. This LICENSE AGREEMENT is between the Python Software Foundation ("PSF"), and the Individual or Organization ("Licensee") accessing and otherwise using this software ("Python") in source or binary form and its associated documentation.\ \ -2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee.\ +2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee.\ \ 3. 
In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make the derivative work available to others as provided herein, then Licensee hereby agrees to include in any such work a brief summary of the changes made to Python.\ \ @@ -124,22 +124,18 @@ STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.\ \ \ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural -\b \cf0 \ul \ulc0 LICENSES AND ACKNOWLEDGEMENTS FOR INCORPORATED SOFTWARE\ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural +\b \ul LICENSES AND ACKNOWLEDGEMENTS FOR INCORPORATED SOFTWARE\ -\b0 \cf0 \ulnone \ +\b0 \ulnone \ This installer incorporates portions of the following third-party software:\ \ \f2 $THIRD_PARTY_LIBS\ \ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural -\f0 \cf0 For licenses and acknowledgements for these and other third-party software incorporated in this Python distribution, please refer to the on-line documentation {\field{\*\fldinst{HYPERLINK "https://docs.python.org/$VERSION/license.html#licenses-and-acknowledgements-for-incorporated-software"}}{\fldrslt here}}.\ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural -\cf0 \ +\f0 For licenses and acknowledgements for these and other third-party software incorporated in this Python distribution, please refer to the on-line documentation {\field{\*\fldinst{HYPERLINK "https://docs.python.org/$VERSION/license.html#licenses-and-acknowledgements-for-incorporated-software"}}{\fldrslt here}}.\ +\ \ \ \ diff --git a/Mac/IDLE/IDLE.app/Contents/Info.plist b/Mac/IDLE/IDLE.app/Contents/Info.plist --- a/Mac/IDLE/IDLE.app/Contents/Info.plist +++ b/Mac/IDLE/IDLE.app/Contents/Info.plist @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %version%, ? 2001-2014 Python Software Foundation + %version%, ? 2001-2015 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable PythonLauncher CFBundleGetInfoString - %VERSION%, ? 2001-2014 Python Software Foundation + %VERSION%, ? 2001-2015 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -20,7 +20,7 @@ CFBundleExecutable Python CFBundleGetInfoString - %version%, (c) 2004-2014 Python Software Foundation. + %version%, (c) 2001-2015 Python Software Foundation. CFBundleHelpBookFolder Documentation @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2004-2014 Python Software Foundation. + %version%, (c) 2001-2015 Python Software Foundation. 
CFBundleName Python CFBundlePackageType @@ -55,7 +55,7 @@ NSAppleScriptEnabled NSHumanReadableCopyright - (c) 2014 Python Software Foundation. + (c) 2001-2015 Python Software Foundation. NSHighResolutionCapable diff --git a/Mac/Resources/framework/Info.plist.in b/Mac/Resources/framework/Info.plist.in --- a/Mac/Resources/framework/Info.plist.in +++ b/Mac/Resources/framework/Info.plist.in @@ -17,9 +17,9 @@ CFBundlePackageType FMWK CFBundleShortVersionString - %VERSION%, (c) 2004-2014 Python Software Foundation. + %VERSION%, (c) 2001-2015 Python Software Foundation. CFBundleLongVersionString - %VERSION%, (c) 2004-2014 Python Software Foundation. + %VERSION%, (c) 2001-2015 Python Software Foundation. CFBundleSignature ???? CFBundleVersion -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 1 01:31:34 2015 From: python-checkins at python.org (ned.deily) Date: Thu, 01 Jan 2015 00:31:34 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Update_copyright_dates_in_OS_X_installer=2E?= Message-ID: <20150101003131.11563.90564@psf.io> https://hg.python.org/cpython/rev/4d1ae7eec0d4 changeset: 94002:4d1ae7eec0d4 parent: 93998:3dba1fed51a0 parent: 94001:50d581f69a73 user: Ned Deily date: Wed Dec 31 16:31:06 2014 -0800 summary: Update copyright dates in OS X installer. files: Mac/BuildScript/resources/License.rtf | 14 +++++--------- Mac/IDLE/IDLE.app/Contents/Info.plist | 2 +- Mac/PythonLauncher/Info.plist.in | 2 +- Mac/Resources/app/Info.plist.in | 6 +++--- Mac/Resources/framework/Info.plist.in | 4 ++-- 5 files changed, 12 insertions(+), 16 deletions(-) diff --git a/Mac/BuildScript/resources/License.rtf b/Mac/BuildScript/resources/License.rtf --- a/Mac/BuildScript/resources/License.rtf +++ b/Mac/BuildScript/resources/License.rtf @@ -54,7 +54,7 @@ \b0 \ 1. This LICENSE AGREEMENT is between the Python Software Foundation ("PSF"), and the Individual or Organization ("Licensee") accessing and otherwise using this software ("Python") in source or binary form and its associated documentation.\ \ -2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee.\ +2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee.\ \ 3. 
In the event Licensee prepares a derivative work that is based on or incorporates Python or any part thereof, and wants to make the derivative work available to others as provided herein, then Licensee hereby agrees to include in any such work a brief summary of the changes made to Python.\ \ @@ -124,22 +124,18 @@ STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.\ \ \ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural -\b \cf0 \ul \ulc0 LICENSES AND ACKNOWLEDGEMENTS FOR INCORPORATED SOFTWARE\ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural +\b \ul LICENSES AND ACKNOWLEDGEMENTS FOR INCORPORATED SOFTWARE\ -\b0 \cf0 \ulnone \ +\b0 \ulnone \ This installer incorporates portions of the following third-party software:\ \ \f2 $THIRD_PARTY_LIBS\ \ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural -\f0 \cf0 For licenses and acknowledgements for these and other third-party software incorporated in this Python distribution, please refer to the on-line documentation {\field{\*\fldinst{HYPERLINK "https://docs.python.org/$VERSION/license.html#licenses-and-acknowledgements-for-incorporated-software"}}{\fldrslt here}}.\ -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\pardirnatural -\cf0 \ +\f0 For licenses and acknowledgements for these and other third-party software incorporated in this Python distribution, please refer to the on-line documentation {\field{\*\fldinst{HYPERLINK "https://docs.python.org/$VERSION/license.html#licenses-and-acknowledgements-for-incorporated-software"}}{\fldrslt here}}.\ +\ \ \ \ diff --git a/Mac/IDLE/IDLE.app/Contents/Info.plist b/Mac/IDLE/IDLE.app/Contents/Info.plist --- a/Mac/IDLE/IDLE.app/Contents/Info.plist +++ b/Mac/IDLE/IDLE.app/Contents/Info.plist @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %version%, ? 2001-2014 Python Software Foundation + %version%, ? 2001-2015 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable Python Launcher CFBundleGetInfoString - %VERSION%, ? 2001-2014 Python Software Foundation + %VERSION%, ? 2001-2015 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -20,7 +20,7 @@ CFBundleExecutable Python CFBundleGetInfoString - %version%, (c) 2004-2014 Python Software Foundation. + %version%, (c) 2001-2015 Python Software Foundation. CFBundleHelpBookFolder Documentation @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2004-2014 Python Software Foundation. + %version%, (c) 2001-2015 Python Software Foundation. 
CFBundleName Python CFBundlePackageType @@ -55,7 +55,7 @@ NSAppleScriptEnabled NSHumanReadableCopyright - (c) 2014 Python Software Foundation. + (c) 2001-2015 Python Software Foundation. NSHighResolutionCapable diff --git a/Mac/Resources/framework/Info.plist.in b/Mac/Resources/framework/Info.plist.in --- a/Mac/Resources/framework/Info.plist.in +++ b/Mac/Resources/framework/Info.plist.in @@ -17,9 +17,9 @@ CFBundlePackageType FMWK CFBundleShortVersionString - %VERSION%, (c) 2004-2014 Python Software Foundation. + %VERSION%, (c) 2001-2015 Python Software Foundation. CFBundleLongVersionString - %VERSION%, (c) 2004-2014 Python Software Foundation. + %VERSION%, (c) 2001-2015 Python Software Foundation. CFBundleSignature ???? CFBundleVersion -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Thu Jan 1 09:25:48 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 01 Jan 2015 09:25:48 +0100 Subject: [Python-checkins] Daily reference leaks (4d1ae7eec0d4): sum=3 Message-ID: results for 4d1ae7eec0d4 on branch "default" -------------------------------------------- test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflog5qHHxE', '-x'] From python-checkins at python.org Thu Jan 1 14:28:35 2015 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 01 Jan 2015 13:28:35 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2323132=3A_Improve_?= =?utf-8?q?performance_and_introspection_support_of_comparison?= Message-ID: <20150101132833.72575.35086@psf.io> https://hg.python.org/cpython/rev/4e85df8b3ea6 changeset: 94003:4e85df8b3ea6 user: Serhiy Storchaka date: Thu Jan 01 15:23:12 2015 +0200 summary: Issue #23132: Improve performance and introspection support of comparison methods created by functool.total_ordering. files: Lib/functools.py | 87 ++++++++++++++----------- Lib/test/test_functools.py | 18 +++++ Misc/NEWS | 3 + 3 files changed, 69 insertions(+), 39 deletions(-) diff --git a/Lib/functools.py b/Lib/functools.py --- a/Lib/functools.py +++ b/Lib/functools.py @@ -113,76 +113,85 @@ # underlying user provided method. 
Using this scheme, the __gt__ derived # from a user provided __lt__ becomes: # -# lambda self, other: _not_op_and_not_eq(self.__lt__, self, other)) +# 'def __gt__(self, other):' + _not_op_and_not_eq % '__lt__' -def _not_op(op, other): - # "not a < b" handles "a >= b" - # "not a <= b" handles "a > b" - # "not a >= b" handles "a < b" - # "not a > b" handles "a <= b" - op_result = op(other) +# "not a < b" handles "a >= b" +# "not a <= b" handles "a > b" +# "not a >= b" handles "a < b" +# "not a > b" handles "a <= b" +_not_op = ''' + op_result = self.%s(other) if op_result is NotImplemented: return NotImplemented return not op_result +''' -def _op_or_eq(op, self, other): - # "a < b or a == b" handles "a <= b" - # "a > b or a == b" handles "a >= b" - op_result = op(other) +# "a > b or a == b" handles "a >= b" +# "a < b or a == b" handles "a <= b" +_op_or_eq = ''' + op_result = self.%s(other) if op_result is NotImplemented: return NotImplemented return op_result or self == other +''' -def _not_op_and_not_eq(op, self, other): - # "not (a < b or a == b)" handles "a > b" - # "not a < b and a != b" is equivalent - # "not (a > b or a == b)" handles "a < b" - # "not a > b and a != b" is equivalent - op_result = op(other) +# "not (a < b or a == b)" handles "a > b" +# "not a < b and a != b" is equivalent +# "not (a > b or a == b)" handles "a < b" +# "not a > b and a != b" is equivalent +_not_op_and_not_eq = ''' + op_result = self.%s(other) if op_result is NotImplemented: return NotImplemented return not op_result and self != other +''' -def _not_op_or_eq(op, self, other): - # "not a <= b or a == b" handles "a >= b" - # "not a >= b or a == b" handles "a <= b" - op_result = op(other) +# "not a <= b or a == b" handles "a >= b" +# "not a >= b or a == b" handles "a <= b" +_not_op_or_eq = ''' + op_result = self.%s(other) if op_result is NotImplemented: return NotImplemented return not op_result or self == other +''' -def _op_and_not_eq(op, self, other): - # "a <= b and not a == b" handles "a < b" - # "a >= b and not a == b" handles "a > b" - op_result = op(other) +# "a <= b and not a == b" handles "a < b" +# "a >= b and not a == b" handles "a > b" +_op_and_not_eq = ''' + op_result = self.%s(other) if op_result is NotImplemented: return NotImplemented return op_result and self != other +''' def total_ordering(cls): """Class decorator that fills in missing ordering methods""" convert = { - '__lt__': [('__gt__', lambda self, other: _not_op_and_not_eq(self.__lt__, self, other)), - ('__le__', lambda self, other: _op_or_eq(self.__lt__, self, other)), - ('__ge__', lambda self, other: _not_op(self.__lt__, other))], - '__le__': [('__ge__', lambda self, other: _not_op_or_eq(self.__le__, self, other)), - ('__lt__', lambda self, other: _op_and_not_eq(self.__le__, self, other)), - ('__gt__', lambda self, other: _not_op(self.__le__, other))], - '__gt__': [('__lt__', lambda self, other: _not_op_and_not_eq(self.__gt__, self, other)), - ('__ge__', lambda self, other: _op_or_eq(self.__gt__, self, other)), - ('__le__', lambda self, other: _not_op(self.__gt__, other))], - '__ge__': [('__le__', lambda self, other: _not_op_or_eq(self.__ge__, self, other)), - ('__gt__', lambda self, other: _op_and_not_eq(self.__ge__, self, other)), - ('__lt__', lambda self, other: _not_op(self.__ge__, other))] + '__lt__': {'__gt__': _not_op_and_not_eq, + '__le__': _op_or_eq, + '__ge__': _not_op}, + '__le__': {'__ge__': _not_op_or_eq, + '__lt__': _op_and_not_eq, + '__gt__': _not_op}, + '__gt__': {'__lt__': _not_op_and_not_eq, + '__ge__': _op_or_eq, 
+ '__le__': _not_op}, + '__ge__': {'__le__': _not_op_or_eq, + '__gt__': _op_and_not_eq, + '__lt__': _not_op} } # Find user-defined comparisons (not those inherited from object). roots = [op for op in convert if getattr(cls, op, None) is not getattr(object, op, None)] if not roots: raise ValueError('must define at least one ordering operation: < > <= >=') root = max(roots) # prefer __lt__ to __le__ to __gt__ to __ge__ - for opname, opfunc in convert[root]: + for opname, opfunc in convert[root].items(): if opname not in roots: - opfunc.__name__ = opname + namespace = {} + exec('def %s(self, other):%s' % (opname, opfunc % root), namespace) + opfunc = namespace[opname] + opfunc.__qualname__ = '%s.%s' % (cls.__qualname__, opname) + opfunc.__module__ = cls.__module__ opfunc.__doc__ = getattr(int, opname).__doc__ setattr(cls, opname, opfunc) return cls diff --git a/Lib/test/test_functools.py b/Lib/test/test_functools.py --- a/Lib/test/test_functools.py +++ b/Lib/test/test_functools.py @@ -880,6 +880,24 @@ with self.assertRaises(TypeError): a <= b + def test_pickle(self): + for proto in range(4, pickle.HIGHEST_PROTOCOL + 1): + for name in '__lt__', '__gt__', '__le__', '__ge__': + with self.subTest(method=name, proto=proto): + method = getattr(Orderable_LT, name) + method_copy = pickle.loads(pickle.dumps(method, proto)) + self.assertIs(method_copy, method) + + at functools.total_ordering +class Orderable_LT: + def __init__(self, value): + self.value = value + def __lt__(self, other): + return self.value < other.value + def __eq__(self, other): + return self.value == other.value + + class TestLRU(unittest.TestCase): def test_lru(self): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -196,6 +196,9 @@ Library ------- +- Issue #23132: Improve performance and introspection support of comparison + methods created by functool.total_ordering. + - Issue #19776: Add a expanduser() method on Path objects. - Issue #23112: Fix SimpleHTTPServer to correctly carry the query string and -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Fri Jan 2 08:57:55 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 02 Jan 2015 08:57:55 +0100 Subject: [Python-checkins] Daily reference leaks (4e85df8b3ea6): sum=3 Message-ID: results for 4e85df8b3ea6 on branch "default" -------------------------------------------- test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogYEDrZ1', '-x'] From python-checkins at python.org Fri Jan 2 19:09:57 2015 From: python-checkins at python.org (benjamin.peterson) Date: Fri, 02 Jan 2015 18:09:57 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_conjugate_=27do=27_correctly_?= =?utf-8?b?KGNsb3NlcyAjMjMxNDkp?= Message-ID: <20150102180950.125904.7267@psf.io> https://hg.python.org/peps/rev/0c72bd524aed changeset: 5652:0c72bd524aed user: Benjamin Peterson date: Fri Jan 02 12:09:41 2015 -0600 summary: conjugate 'do' correctly (closes #23149) files: pep-0008.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0008.txt b/pep-0008.txt --- a/pep-0008.txt +++ b/pep-0008.txt @@ -394,7 +394,7 @@ ============= In Python, single-quoted strings and double-quoted strings are the -same. This PEP do not make a recommendation for this. Pick a rule +same. This PEP does not make a recommendation for this. Pick a rule and stick to it. 
When a string contains single or double quote characters, however, use the other one to avoid backslashes in the string. It improves readability. -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Fri Jan 2 19:10:59 2015 From: python-checkins at python.org (benjamin.peterson) Date: Fri, 02 Jan 2015 18:10:59 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_remove_trailing_ws?= Message-ID: <20150102181043.11571.97062@psf.io> https://hg.python.org/peps/rev/aaffbaf67933 changeset: 5653:aaffbaf67933 user: Benjamin Peterson date: Fri Jan 02 12:10:41 2015 -0600 summary: remove trailing ws files: pep-0008.txt | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/pep-0008.txt b/pep-0008.txt --- a/pep-0008.txt +++ b/pep-0008.txt @@ -967,13 +967,13 @@ - Use ``is not`` operator rather than ``not ... is``. While both expressions are functionally identical, the former is more readable and preferred. - + Yes:: - + if foo is not None: - + No:: - + if not foo is None: - When implementing ordering operations with rich comparisons, it is -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Sat Jan 3 03:11:10 2015 From: python-checkins at python.org (steve.dower) Date: Sat, 03 Jan 2015 02:11:10 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Changes_=25s_to_=25ls_in_w?= =?utf-8?q?printf_in_launcher=2Ec_for_C99_compatibility=2E?= Message-ID: <20150103021105.8747.34424@psf.io> https://hg.python.org/cpython/rev/0a095aa9b5d3 changeset: 94004:0a095aa9b5d3 user: Steve Dower date: Fri Jan 02 18:07:46 2015 -0800 summary: Changes %s to %ls in wprintf in launcher.c for C99 compatibility. files: PC/launcher.c | 102 +++++++++++++++++++------------------- 1 files changed, 51 insertions(+), 51 deletions(-) diff --git a/PC/launcher.c b/PC/launcher.c --- a/PC/launcher.c +++ b/PC/launcher.c @@ -87,13 +87,13 @@ if (rc == 0) { /* a Windows error */ winerror(GetLastError(), win_message, MSGSIZE); if (len >= 0) { - _snwprintf_s(&message[len], MSGSIZE - len, _TRUNCATE, L": %s", + _snwprintf_s(&message[len], MSGSIZE - len, _TRUNCATE, L": %ls", win_message); } } #if !defined(_WINDOWS) - fwprintf(stderr, L"%s\n", message); + fwprintf(stderr, L"%ls\n", message); #else MessageBox(NULL, message, TEXT("Python Launcher is sorry to say ..."), MB_OK); @@ -202,7 +202,7 @@ wchar_t *key_name = (root == HKEY_LOCAL_MACHINE) ? 
L"HKLM" : L"HKCU"; if (status != ERROR_SUCCESS) - debug(L"locate_pythons_for_key: unable to open PythonCore key in %s\n", + debug(L"locate_pythons_for_key: unable to open PythonCore key in %ls\n", key_name); else { ip = &installed_pythons[num_installed_pythons]; @@ -212,19 +212,19 @@ if (status != ERROR_NO_MORE_ITEMS) { /* unexpected error */ winerror(status, message, MSGSIZE); - debug(L"Can't enumerate registry key for version %s: %s\n", + debug(L"Can't enumerate registry key for version %ls: %ls\n", ip->version, message); } break; } else { _snwprintf_s(ip_path, IP_SIZE, _TRUNCATE, - L"%s\\%s\\InstallPath", CORE_PATH, ip->version); + L"%ls\\%ls\\InstallPath", CORE_PATH, ip->version); status = RegOpenKeyExW(root, ip_path, 0, flags, &ip_key); if (status != ERROR_SUCCESS) { winerror(status, message, MSGSIZE); // Note: 'message' already has a trailing \n - debug(L"%s\\%s: %s", key_name, ip_path, message); + debug(L"%ls\\%ls: %ls", key_name, ip_path, message); continue; } data_size = sizeof(ip->executable) - 1; @@ -233,7 +233,7 @@ RegCloseKey(ip_key); if (status != ERROR_SUCCESS) { winerror(status, message, MSGSIZE); - debug(L"%s\\%s: %s\n", key_name, ip_path, message); + debug(L"%ls\\%ls: %ls\n", key_name, ip_path, message); continue; } if (type == REG_SZ) { @@ -246,27 +246,27 @@ _snwprintf_s(&ip->executable[data_size], MAX_PATH - data_size, MAX_PATH - data_size, - L"%s%s", check, PYTHON_EXECUTABLE); + L"%ls%ls", check, PYTHON_EXECUTABLE); attrs = GetFileAttributesW(ip->executable); if (attrs == INVALID_FILE_ATTRIBUTES) { winerror(GetLastError(), message, MSGSIZE); - debug(L"locate_pythons_for_key: %s: %s", + debug(L"locate_pythons_for_key: %ls: %ls", ip->executable, message); } else if (attrs & FILE_ATTRIBUTE_DIRECTORY) { - debug(L"locate_pythons_for_key: '%s' is a \ + debug(L"locate_pythons_for_key: '%ls' is a \ directory\n", ip->executable, attrs); } else if (find_existing_python(ip->executable)) { - debug(L"locate_pythons_for_key: %s: already \ -found: %s\n", ip->executable); + debug(L"locate_pythons_for_key: %ls: already \ +found: %ls\n", ip->executable); } else { /* check the executable type. */ ok = GetBinaryTypeW(ip->executable, &attrs); if (!ok) { - debug(L"Failure getting binary type: %s\n", + debug(L"Failure getting binary type: %ls\n", ip->executable); } else { @@ -277,7 +277,7 @@ else ip->bits = 0; if (ip->bits == 0) { - debug(L"locate_pythons_for_key: %s: \ + debug(L"locate_pythons_for_key: %ls: \ invalid binary type: %X\n", ip->executable, attrs); } @@ -291,7 +291,7 @@ ip->executable[n + 1] = L'\"'; ip->executable[n + 2] = L'\0'; } - debug(L"locate_pythons_for_key: %s \ + debug(L"locate_pythons_for_key: %ls \ is a %dbit executable\n", ip->executable, ip->bits); ++num_installed_pythons; @@ -397,7 +397,7 @@ DWORD size; /* First, search the environment. */ - _snwprintf_s(configured_value, MSGSIZE, _TRUNCATE, L"py_%s", key); + _snwprintf_s(configured_value, MSGSIZE, _TRUNCATE, L"py_%ls", key); result = get_env(configured_value); if (result == NULL && appdata_ini_path[0]) { /* Not in environment: check local configuration. */ @@ -420,10 +420,10 @@ } } if (result) { - debug(L"found configured value '%s=%s' in %s\n", + debug(L"found configured value '%ls=%ls' in %ls\n", key, result, found_in ? 
found_in : L"(unknown)"); } else { - debug(L"found no configured value for '%s'\n", key); + debug(L"found no configured value for '%ls'\n", key); } return result; } @@ -449,9 +449,9 @@ } if (*wanted_ver) { result = find_python_by_version(wanted_ver); - debug(L"search for Python version '%s' found ", wanted_ver); + debug(L"search for Python version '%ls' found ", wanted_ver); if (result) { - debug(L"'%s'\n", result->executable); + debug(L"'%ls'\n", result->executable); } else { debug(L"no interpreter\n"); } @@ -467,7 +467,7 @@ result = find_python_by_version(L"3"); debug(L"search for default Python found "); if (result) { - debug(L"version %s at '%s'\n", + debug(L"version %ls at '%ls'\n", result->version, result->executable); } else { debug(L"no interpreter\n"); @@ -505,19 +505,19 @@ plen = GetModuleFileNameW(NULL, wrapped_script_path, MAX_PATH); p = wcsrchr(wrapped_script_path, L'.'); if (p == NULL) { - debug(L"GetModuleFileNameW returned value has no extension: %s\n", + debug(L"GetModuleFileNameW returned value has no extension: %ls\n", wrapped_script_path); - error(RC_NO_SCRIPT, L"Wrapper name '%s' is not valid.", wrapped_script_path); + error(RC_NO_SCRIPT, L"Wrapper name '%ls' is not valid.", wrapped_script_path); } wcsncpy_s(p, MAX_PATH - (p - wrapped_script_path) + 1, SCRIPT_SUFFIX, _TRUNCATE); attrs = GetFileAttributesW(wrapped_script_path); if (attrs == INVALID_FILE_ATTRIBUTES) { - debug(L"File '%s' non-existent\n", wrapped_script_path); - error(RC_NO_SCRIPT, L"Script file '%s' is not present.", wrapped_script_path); + debug(L"File '%ls' non-existent\n", wrapped_script_path); + error(RC_NO_SCRIPT, L"Script file '%ls' is not present.", wrapped_script_path); } - debug(L"Using wrapped script file '%s'\n", wrapped_script_path); + debug(L"Using wrapped script file '%ls'\n", wrapped_script_path); } #endif @@ -579,7 +579,7 @@ GetMessage(&msg, 0, 0, 0); #endif - debug(L"run_child: about to run '%s'\n", cmdline); + debug(L"run_child: about to run '%ls'\n", cmdline); job = CreateJobObject(NULL, NULL); ok = QueryInformationJobObject(job, JobObjectExtendedLimitInformation, &info, sizeof(info), &rc); @@ -611,7 +611,7 @@ ok = CreateProcessW(NULL, cmdline, NULL, NULL, TRUE, 0, NULL, NULL, &si, &pi); if (!ok) - error(RC_CREATE_PROCESS, L"Unable to create process using '%s'", cmdline); + error(RC_CREATE_PROCESS, L"Unable to create process using '%ls'", cmdline); AssignProcessToJobObject(job, pi.hProcess); CloseHandle(pi.hThread); WaitForSingleObjectEx(pi.hProcess, INFINITE, FALSE); @@ -648,11 +648,11 @@ child_command_size); if (no_suffix) _snwprintf_s(child_command, child_command_size, - child_command_size - 1, L"%s %s", + child_command_size - 1, L"%ls %ls", executable, cmdline); else _snwprintf_s(child_command, child_command_size, - child_command_size - 1, L"%s %s %s", + child_command_size - 1, L"%ls %ls %ls", executable, suffix, cmdline); run_child(child_command); free(child_command); @@ -791,7 +791,7 @@ add_command(wchar_t * name, wchar_t * cmdline) { if (num_commands >= MAX_COMMANDS) { - debug(L"can't add %s = '%s': no room\n", name, cmdline); + debug(L"can't add %ls = '%ls': no room\n", name, cmdline); } else { COMMAND * cp = &commands[num_commands++]; @@ -813,14 +813,14 @@ read = GetPrivateProfileStringW(L"commands", NULL, NULL, keynames, MSGSIZE, config_path); if (read == MSGSIZE - 1) { - debug(L"read_commands: %s: not enough space for names\n", config_path); + debug(L"read_commands: %ls: not enough space for names\n", config_path); } key = keynames; while (*key) { read = 
GetPrivateProfileStringW(L"commands", key, NULL, value, MSGSIZE, config_path); if (read == MSGSIZE - 1) { - debug(L"read_commands: %s: not enough space for %s\n", + debug(L"read_commands: %ls: not enough space for %ls\n", config_path, key); } cmdp = skip_whitespace(value); @@ -1097,7 +1097,7 @@ if ((read >= 4) && (buffer[3] == '\n') && (buffer[2] == '\r')) { ip = find_by_magic((buffer[1] << 8 | buffer[0]) & 0xFFFF); if (ip != NULL) { - debug(L"script file is compiled against Python %s\n", + debug(L"script file is compiled against Python %ls\n", ip->version); invoke_child(ip->executable, NULL, cmdline); } @@ -1200,7 +1200,7 @@ is_virt = parse_shebang(shebang_line, nchars, &command, &suffix, &search); if (command != NULL) { - debug(L"parse_shebang: found command: %s\n", command); + debug(L"parse_shebang: found command: %ls\n", command); if (!is_virt) { invoke_child(command, suffix, cmdline); } @@ -1212,7 +1212,7 @@ } if (wcsncmp(command, L"python", 6)) error(RC_BAD_VIRTUAL_PATH, L"Unknown virtual \ -path '%s'", command); +path '%ls'", command); command += 6; /* skip past "python" */ if (search && ((*command == L'\0') || isspace(*command))) { /* Command is eligible for path search, and there @@ -1220,9 +1220,9 @@ */ debug(L"searching PATH for python executable\n"); cmd = find_on_path(L"python"); - debug(L"Python on path: %s\n", cmd ? cmd->value : L""); + debug(L"Python on path: %ls\n", cmd ? cmd->value : L""); if (cmd) { - debug(L"located python on PATH: %s\n", cmd->value); + debug(L"located python on PATH: %ls\n", cmd->value); invoke_child(cmd->value, suffix, cmdline); /* Exit here, as we have found the command */ return; @@ -1233,14 +1233,14 @@ } if (*command && !validate_version(command)) error(RC_BAD_VIRTUAL_PATH, L"Invalid version \ -specification: '%s'.\nIn the first line of the script, 'python' needs to be \ +specification: '%ls'.\nIn the first line of the script, 'python' needs to be \ followed by a valid version specifier.\nPlease check the documentation.", command); /* TODO could call validate_version(command) */ ip = locate_python(command); if (ip == NULL) { error(RC_NO_PYTHON, L"Requested Python version \ -(%s) is not installed", command); +(%ls) is not installed", command); } else { invoke_child(ip->executable, suffix, cmdline); @@ -1347,17 +1347,17 @@ wcsncpy_s(p, MAX_PATH - plen, L"\\py.ini", _TRUNCATE); attrs = GetFileAttributesW(appdata_ini_path); if (attrs == INVALID_FILE_ATTRIBUTES) { - debug(L"File '%s' non-existent\n", appdata_ini_path); + debug(L"File '%ls' non-existent\n", appdata_ini_path); appdata_ini_path[0] = L'\0'; } else { - debug(L"Using local configuration file '%s'\n", appdata_ini_path); + debug(L"Using local configuration file '%ls'\n", appdata_ini_path); } } plen = GetModuleFileNameW(NULL, launcher_ini_path, MAX_PATH); size = GetFileVersionInfoSizeW(launcher_ini_path, &size); if (size == 0) { winerror(GetLastError(), message, MSGSIZE); - debug(L"GetFileVersionInfoSize failed: %s\n", message); + debug(L"GetFileVersionInfoSize failed: %ls\n", message); } else { version_data = malloc(size); @@ -1381,7 +1381,7 @@ } p = wcsrchr(launcher_ini_path, L'\\'); if (p == NULL) { - debug(L"GetModuleFileNameW returned value has no backslash: %s\n", + debug(L"GetModuleFileNameW returned value has no backslash: %ls\n", launcher_ini_path); launcher_ini_path[0] = L'\0'; } @@ -1390,15 +1390,15 @@ _TRUNCATE); attrs = GetFileAttributesW(launcher_ini_path); if (attrs == INVALID_FILE_ATTRIBUTES) { - debug(L"File '%s' non-existent\n", launcher_ini_path); + debug(L"File '%ls' 
non-existent\n", launcher_ini_path); launcher_ini_path[0] = L'\0'; } else { - debug(L"Using global configuration file '%s'\n", launcher_ini_path); + debug(L"Using global configuration file '%ls'\n", launcher_ini_path); } } command = skip_me(GetCommandLineW()); - debug(L"Called with command line: %s\n", command); + debug(L"Called with command line: %ls\n", command); #if defined(SCRIPT_WRAPPER) /* The launcher is being used in "script wrapper" mode. @@ -1422,7 +1422,7 @@ wcscpy_s(newcommand, newlen, wrapped_script_path); wcscat_s(newcommand, newlen, L" "); wcscat_s(newcommand, newlen, command); - debug(L"Running wrapped script with command line '%s'\n", newcommand); + debug(L"Running wrapped script with command line '%ls'\n", newcommand); read_commands(); av[0] = wrapped_script_path; av[1] = NULL; @@ -1443,7 +1443,7 @@ if (valid) { ip = locate_python(&p[1]); if (ip == NULL) - error(RC_NO_PYTHON, L"Requested Python version (%s) not \ + error(RC_NO_PYTHON, L"Requested Python version (%ls) not \ installed", &p[1]); command += wcslen(p); command = skip_whitespace(command); @@ -1476,9 +1476,9 @@ get_version_info(version_text, MAX_PATH); fwprintf(stdout, L"\ -Python Launcher for Windows Version %s\n\n", version_text); +Python Launcher for Windows Version %ls\n\n", version_text); fwprintf(stdout, L"\ -usage: %s [ launcher-arguments ] [ python-arguments ] script [ script-arguments ]\n\n", argv[0]); +usage: %ls [ launcher-arguments ] [ python-arguments ] script [ script-arguments ]\n\n", argv[0]); fputws(L"\ Launcher arguments:\n\n\ -2 : Launch the latest Python 2.x version\n\ -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sat Jan 3 03:27:39 2015 From: python-checkins at python.org (chris.angelico) Date: Sat, 03 Jan 2015 02:27:39 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Incorporate_PEP_8_text_from_I?= =?utf-8?q?an_Lee_to_clarify_annotation_policy?= Message-ID: <20150103022738.8743.36822@psf.io> https://hg.python.org/peps/rev/7eb1ddc0291c changeset: 5654:7eb1ddc0291c user: Chris Angelico date: Sat Jan 03 13:22:31 2015 +1100 summary: Incorporate PEP 8 text from Ian Lee to clarify annotation policy files: pep-0008.txt | 13 +++++++++++++ 1 files changed, 13 insertions(+), 0 deletions(-) diff --git a/pep-0008.txt b/pep-0008.txt --- a/pep-0008.txt +++ b/pep-0008.txt @@ -514,6 +514,19 @@ def complex(real, imag = 0.0): return magic(r = real, i = imag) +- Do use spaces around the ``=`` sign of an annotated function definition. + Additionally, use a single space after the ``:``, as well as a single space + on either side of the ``->`` sign representing an annotated return value. + + Yes: def munge(input: AnyStr): + Yes: def munge(sep: AnyStr = None): + Yes: def munge() -> AnyStr: + Yes: def munge(input: AnyStr, sep: AnyStr = None, limit=1000): + + No: def munge(input: AnyStr=None): + No: def munge(input:AnyStr): + No: def munge(input: AnyStr)->PosInt: + - Compound statements (multiple statements on the same line) are generally discouraged. 
-- Repository URL: https://hg.python.org/peps From python-checkins at python.org Sat Jan 3 03:27:39 2015 From: python-checkins at python.org (chris.angelico) Date: Sat, 03 Jan 2015 02:27:39 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Formatting_fixes_to_new_PEP8_?= =?utf-8?q?text?= Message-ID: <20150103022738.125880.91415@psf.io> https://hg.python.org/peps/rev/80e4f23d4a96 changeset: 5655:80e4f23d4a96 user: Chris Angelico date: Sat Jan 03 13:27:29 2015 +1100 summary: Formatting fixes to new PEP8 text files: pep-0008.txt | 18 +++++++++++------- 1 files changed, 11 insertions(+), 7 deletions(-) diff --git a/pep-0008.txt b/pep-0008.txt --- a/pep-0008.txt +++ b/pep-0008.txt @@ -518,14 +518,18 @@ Additionally, use a single space after the ``:``, as well as a single space on either side of the ``->`` sign representing an annotated return value. - Yes: def munge(input: AnyStr): - Yes: def munge(sep: AnyStr = None): - Yes: def munge() -> AnyStr: - Yes: def munge(input: AnyStr, sep: AnyStr = None, limit=1000): + Yes:: - No: def munge(input: AnyStr=None): - No: def munge(input:AnyStr): - No: def munge(input: AnyStr)->PosInt: + def munge(input: AnyStr): + def munge(sep: AnyStr = None): + def munge() -> AnyStr: + def munge(input: AnyStr, sep: AnyStr = None, limit=1000): + + No:: + + def munge(input: AnyStr=None): + def munge(input:AnyStr): + def munge(input: AnyStr)->PosInt: - Compound statements (multiple statements on the same line) are generally discouraged. -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Sat Jan 3 03:31:39 2015 From: python-checkins at python.org (nick.coghlan) Date: Sat, 03 Jan 2015 02:31:39 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_440_updates_based_on_user?= =?utf-8?q?_feedback?= Message-ID: <20150103023139.72567.22977@psf.io> https://hg.python.org/peps/rev/289dbffc16ed changeset: 5656:289dbffc16ed user: Nick Coghlan date: Sat Jan 03 12:28:31 2015 +1000 summary: PEP 440 updates based on user feedback files: pep-0440.txt | 95 ++++++++++++++++++++++++++------------- 1 files changed, 64 insertions(+), 31 deletions(-) diff --git a/pep-0440.txt b/pep-0440.txt --- a/pep-0440.txt +++ b/pep-0440.txt @@ -31,7 +31,7 @@ ======================================== The version representation and comparison scheme described in this PEP is -currently accepted on a provisional basis, as described in PEP 411. +currently accepted on a provisional basis [9]_, as described in PEP 411. This status is based on feedback received on the initial releases of pip 6.0, and setuptools 8.0, which revealed some issues in the specification that @@ -820,8 +820,7 @@ Compatible release ------------------ -A compatible release clause consists of either a version identifier without -any comparison operator or else the compatible release operator ``~=`` +A compatible release clause consists of the compatible release operator ``~=`` and a version identifier. It matches any candidate version that is expected to be compatible with the specified version. 
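As a concrete illustration, the third-party ``packaging`` project (the reference implementation bundled by pip 6.x and setuptools 8.x) evaluates these clauses as follows; the import path and API shown are assumptions about that library rather than text from the PEP:

    from packaging.specifiers import SpecifierSet

    compatible = SpecifierSet("~=2.2")        # equivalent to ">= 2.2, == 2.*"
    print("2.5" in compatible)                # True: same 2.* series
    print("3.0" in compatible)                # False: fails the == 2.* prefix match

    exclusive = SpecifierSet(">1.7")          # revised rule described further below
    print("1.7.1" in exclusive)               # True
    print("1.7.0.post1" in exclusive)         # False: post-releases are excluded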
@@ -839,11 +838,9 @@ For example, the following groups of version clauses are equivalent:: - 2.2 ~= 2.2 >= 2.2, == 2.* - 1.4.5 ~= 1.4.5 >= 1.4.5, == 1.4.* @@ -851,11 +848,9 @@ compatible release clause as ``V.N.suffix``, then the suffix is ignored when determining the required prefix match:: - 2.2.post3 ~= 2.2.post3 >= 2.2.post3, == 2.* - 1.4.5a4 ~= 1.4.5a4 >= 1.4.5a4, == 1.4.* @@ -863,11 +858,9 @@ degree of forward compatibility in a compatible release clause can be controlled by appending additional zeros to the version specifier:: - 2.2.0 ~= 2.2.0 >= 2.2.0, == 2.2.* - 1.4.5.0 ~= 1.4.5.0 >= 1.4.5.0, == 1.4.5.* @@ -989,20 +982,29 @@ Exclusive ordered comparison ---------------------------- -Exclusive ordered comparisons are similar to inclusive ordered comparisons, -except that the comparison operators are ``<`` and ``>`` and the clause -MUST be effectively interpreted as implying the prefix based version -exclusion clause ``!= V.*``. +The exclusive ordered comparisons ``>`` and ``<`` are similar to the inclusive +ordered comparisons in that they rely on the relative position of the candidate +version and the specified version given the consistent ordering defined by the +standard `Version scheme`_. However, they specifically exclude pre-releases, +post-releases, and local versions of the specified version. -The exclusive ordered comparison ``> V`` MUST NOT match a post-release -or maintenance release of the given version. Maintenance releases can be -permitted by using the clause ``> V.0``, while both post releases and -maintenance releases can be permitted by using the inclusive ordered -comparison ``>= V.post1``. +The exclusive ordered comparison ``>V`` **MUST NOT** allow a post-release +of the given version unless ``V`` itself is a post release. You may mandate +that releases are later than a particular post release, including additional +post releases, by using ``>V.postN``. For example, ``>1.7`` will allow +``1.7.1`` but not ``1.7.0.post1`` and ``>1.7.post2`` will allow ``1.7.1`` +and ``1.7.0.post3`` but not ``1.7.0``. -The exclusive ordered comparison ``< V`` MUST NOT match a pre-release of -the given version, even if acceptance of pre-releases is enabled as -described in the section below. +The exclusive ordered comparison ``>V`` **MUST NOT** match a local version of +the specified version. + +The exclusive ordered comparison ```` -ordered comparison clauses. +added to make it possible to sensibly define compatible release clauses. Support for date based version identifiers @@ -1504,6 +1505,27 @@ gets normalized to a ``_`` to enable easier parsing of the filename. +Summary of changes to \PEP 440 +============================== + +The following changes were made to this PEP based on feedback received after +the initial reference implementation was released in setuptools 8.0 and pip +6.0: + +* The exclusive ordered comparisons were updated to no longer imply a ``!=V.*`` + which was deemed to be surprising behavior which was too hard to accurately + describe. Instead the exclusive ordered comparisons will simply disallow + matching pre-releases, post-releases, and local versions of the specified + version (unless the specified version is itself a pre-release, post-release + or local version). For an extended discussion see the threads on + distutils-sig [6]_ [7]_. + +* The normalized form for release candidates was updated from 'c' to 'rc'. 
+ This change was based on user feedback received when setuptools 8.0 + started applying normalisation to the release metadata generated when + preparing packages for publication on PyPI [8]_. + + References ========== @@ -1525,6 +1547,17 @@ .. [5] Proof of Concept: PEP 440 within pip https://github.com/pypa/pip/pull/1894 +.. [6] PEP440: foo-X.Y.Z does not satisfy "foo>X.Y" + https://mail.python.org/pipermail/distutils-sig/2014-December/025451.html + +.. [7] PEP440: >1.7 vs >=1.7 + https://mail.python.org/pipermail/distutils-sig/2014-December/025507.html + +.. [8] Amend PEP 440 with Wider Feedback on Release Candidates + https://mail.python.org/pipermail/distutils-sig/2014-December/025409.html + +.. [9] Changing the status of PEP 440 to Provisional + https://mail.python.org/pipermail/distutils-sig/2014-December/025412.html Appendix A ========== -- Repository URL: https://hg.python.org/peps From solipsis at pitrou.net Sat Jan 3 09:24:44 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 03 Jan 2015 09:24:44 +0100 Subject: [Python-checkins] Daily reference leaks (0a095aa9b5d3): sum=6 Message-ID: results for 0a095aa9b5d3 on branch "default" -------------------------------------------- test_asyncio leaked [0, 3, 0] memory blocks, sum=3 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogKI_CsG', '-x'] From python-checkins at python.org Sat Jan 3 09:46:58 2015 From: python-checkins at python.org (ned.deily) Date: Sat, 03 Jan 2015 08:46:58 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_Add_missing_UR?= =?utf-8?q?L_link_to_Modernize_docs=2E?= Message-ID: <20150103084653.11569.19487@psf.io> https://hg.python.org/cpython/rev/f4c79656dfd5 changeset: 94006:f4c79656dfd5 branch: 3.4 parent: 94001:50d581f69a73 user: Ned Deily date: Sat Jan 03 00:45:55 2015 -0800 summary: Add missing URL link to Modernize docs. files: Doc/howto/pyporting.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/howto/pyporting.rst b/Doc/howto/pyporting.rst --- a/Doc/howto/pyporting.rst +++ b/Doc/howto/pyporting.rst @@ -379,7 +379,7 @@ .. _cheat sheet: http://python-future.org/compatible_idioms.html .. _coverage.py: https://pypi.python.org/pypi/coverage .. _Futurize: http://python-future.org/automatic_conversion.html -.. _Modernize: +.. _Modernize: http://python-modernize.readthedocs.org/en/latest/ .. _Porting to Python 3: http://python3porting.com/ .. _Pylint: https://pypi.python.org/pypi/pylint .. _Python 3 Q & A: http://ncoghlan-devs-python-notes.readthedocs.org/en/latest/python3/questions_and_answers.html -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sat Jan 3 09:46:58 2015 From: python-checkins at python.org (ned.deily) Date: Sat, 03 Jan 2015 08:46:58 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Add_missing_UR?= =?utf-8?q?L_link_to_Modernize_docs=2E?= Message-ID: <20150103084652.8747.70261@psf.io> https://hg.python.org/cpython/rev/7746f83d8a74 changeset: 94005:7746f83d8a74 branch: 2.7 parent: 94000:4e067cf7e299 user: Ned Deily date: Sat Jan 03 00:45:25 2015 -0800 summary: Add missing URL link to Modernize docs. files: Doc/howto/pyporting.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/howto/pyporting.rst b/Doc/howto/pyporting.rst --- a/Doc/howto/pyporting.rst +++ b/Doc/howto/pyporting.rst @@ -379,7 +379,7 @@ .. 
_cheat sheet: http://python-future.org/compatible_idioms.html .. _coverage.py: https://pypi.python.org/pypi/coverage .. _Futurize: http://python-future.org/automatic_conversion.html -.. _Modernize: +.. _Modernize: http://python-modernize.readthedocs.org/en/latest/ .. _Porting to Python 3: http://python3porting.com/ .. _Pylint: https://pypi.python.org/pypi/pylint .. _Python 3 Q & A: http://ncoghlan-devs-python-notes.readthedocs.org/en/latest/python3/questions_and_answers.html -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sat Jan 3 09:46:58 2015 From: python-checkins at python.org (ned.deily) Date: Sat, 03 Jan 2015 08:46:58 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Add_missing_URL_link_to_Modernize_docs=2E?= Message-ID: <20150103084653.72567.84035@psf.io> https://hg.python.org/cpython/rev/008b77138424 changeset: 94007:008b77138424 parent: 94004:0a095aa9b5d3 parent: 94006:f4c79656dfd5 user: Ned Deily date: Sat Jan 03 00:46:24 2015 -0800 summary: Add missing URL link to Modernize docs. files: Doc/howto/pyporting.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/howto/pyporting.rst b/Doc/howto/pyporting.rst --- a/Doc/howto/pyporting.rst +++ b/Doc/howto/pyporting.rst @@ -379,7 +379,7 @@ .. _cheat sheet: http://python-future.org/compatible_idioms.html .. _coverage.py: https://pypi.python.org/pypi/coverage .. _Futurize: http://python-future.org/automatic_conversion.html -.. _Modernize: +.. _Modernize: http://python-modernize.readthedocs.org/en/latest/ .. _Porting to Python 3: http://python3porting.com/ .. _Pylint: https://pypi.python.org/pypi/pylint .. _Python 3 Q & A: http://ncoghlan-devs-python-notes.readthedocs.org/en/latest/python3/questions_and_answers.html -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sat Jan 3 11:20:43 2015 From: python-checkins at python.org (donald.stufft) Date: Sat, 03 Jan 2015 10:20:43 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_Upgrade_the_bu?= =?utf-8?q?ndled_pip_to_6=2E0=2E6_and_the_bundled_setuptools_to_11=2E0?= Message-ID: <20150103102032.125892.85646@psf.io> https://hg.python.org/cpython/rev/26e1b78448f9 changeset: 94008:26e1b78448f9 branch: 3.4 parent: 94006:f4c79656dfd5 user: Donald Stufft date: Sat Jan 03 05:20:23 2015 -0500 summary: Upgrade the bundled pip to 6.0.6 and the bundled setuptools to 11.0 files: Lib/ensurepip/__init__.py | 4 ++-- Lib/ensurepip/_bundled/pip-6.0.2-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/pip-6.0.6-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-8.2.1-py2.py3-none-any.whl | Bin 5 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py --- a/Lib/ensurepip/__init__.py +++ b/Lib/ensurepip/__init__.py @@ -8,9 +8,9 @@ __all__ = ["version", "bootstrap"] -_SETUPTOOLS_VERSION = "8.2.1" +_SETUPTOOLS_VERSION = "11.0" -_PIP_VERSION = "6.0.2" +_PIP_VERSION = "6.0.6" # pip currently requires ssl support, so we try to provide a nicer # error message when that is missing (http://bugs.python.org/issue19744) diff --git a/Lib/ensurepip/_bundled/pip-6.0.2-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-6.0.2-py2.py3-none-any.whl deleted file mode 100644 index 312467d98be6893a4a965af2e2f53b63188676c0..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 GIT binary patch [stripped] diff --git 
a/Lib/ensurepip/_bundled/pip-6.0.6-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-6.0.6-py2.py3-none-any.whl new file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..e2be1055f375447e667e34ce2ce018ec41a12221 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl new file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..ceeaa3cd5c326495156cb90ed3934ee5b712ad2e GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-8.2.1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-8.2.1-py2.py3-none-any.whl deleted file mode 100644 index fa3a6a54053285d701c58cae779ebd9a8a992118..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 GIT binary patch [stripped] -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sat Jan 3 11:21:36 2015 From: python-checkins at python.org (donald.stufft) Date: Sat, 03 Jan 2015 10:21:36 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Update_bundled_pip_and_setuptools_to_6=2E0=2E6_and_11=2E?= =?utf-8?q?0=2E?= Message-ID: <20150103102129.22419.99748@psf.io> https://hg.python.org/cpython/rev/44cc97e8ec1b changeset: 94009:44cc97e8ec1b parent: 94007:008b77138424 parent: 94008:26e1b78448f9 user: Donald Stufft date: Sat Jan 03 05:21:23 2015 -0500 summary: Update bundled pip and setuptools to 6.0.6 and 11.0. files: Lib/ensurepip/__init__.py | 4 ++-- Lib/ensurepip/_bundled/pip-6.0.2-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/pip-6.0.6-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-8.2.1-py2.py3-none-any.whl | Bin 5 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py --- a/Lib/ensurepip/__init__.py +++ b/Lib/ensurepip/__init__.py @@ -8,9 +8,9 @@ __all__ = ["version", "bootstrap"] -_SETUPTOOLS_VERSION = "8.2.1" +_SETUPTOOLS_VERSION = "11.0" -_PIP_VERSION = "6.0.2" +_PIP_VERSION = "6.0.6" # pip currently requires ssl support, so we try to provide a nicer # error message when that is missing (http://bugs.python.org/issue19744) diff --git a/Lib/ensurepip/_bundled/pip-6.0.2-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-6.0.2-py2.py3-none-any.whl deleted file mode 100644 index 312467d98be6893a4a965af2e2f53b63188676c0..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/pip-6.0.6-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-6.0.6-py2.py3-none-any.whl new file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..e2be1055f375447e667e34ce2ce018ec41a12221 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl new file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..ceeaa3cd5c326495156cb90ed3934ee5b712ad2e GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-8.2.1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-8.2.1-py2.py3-none-any.whl deleted file mode 100644 index fa3a6a54053285d701c58cae779ebd9a8a992118..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 GIT binary patch [stripped] -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sat Jan 3 11:23:47 2015 From: python-checkins at python.org (donald.stufft) Date: Sat, 03 Jan 2015 10:23:47 +0000 Subject: [Python-checkins] 
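At the Python level, the OpenSSL build assumption behind this change can be inspected through attributes the ``ssl`` module already exposes; a minimal sketch, with illustrative output values only:

    import ssl

    # The ssl module reports the OpenSSL it was compiled against; after this
    # change CPython simply assumes that version is at least 0.9.8.
    print(ssl.OPENSSL_VERSION)                    # e.g. 'OpenSSL 1.0.1j 15 Oct 2014'
    print(ssl.OPENSSL_VERSION_INFO)               # e.g. (1, 0, 1, 10, 15)
    print(ssl.OPENSSL_VERSION_INFO >= (0, 9, 8))  # True on any supported build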
=?utf-8?q?cpython_=282=2E7=29=3A_Update_bundled?= =?utf-8?q?_pip_and_setuptools_to_6=2E0=2E6_and_11=2E0?= Message-ID: <20150103102345.72557.71261@psf.io> https://hg.python.org/cpython/rev/1346a6013127 changeset: 94010:1346a6013127 branch: 2.7 parent: 94005:7746f83d8a74 user: Donald Stufft date: Sat Jan 03 05:23:39 2015 -0500 summary: Update bundled pip and setuptools to 6.0.6 and 11.0 files: Lib/ensurepip/__init__.py | 4 ++-- Lib/ensurepip/_bundled/pip-6.0.2-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/pip-6.0.6-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-8.2.1-py2.py3-none-any.whl | Bin 5 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py --- a/Lib/ensurepip/__init__.py +++ b/Lib/ensurepip/__init__.py @@ -12,9 +12,9 @@ __all__ = ["version", "bootstrap"] -_SETUPTOOLS_VERSION = "8.2.1" +_SETUPTOOLS_VERSION = "11.0" -_PIP_VERSION = "6.0.2" +_PIP_VERSION = "6.0.6" # pip currently requires ssl support, so we try to provide a nicer # error message when that is missing (http://bugs.python.org/issue19744) diff --git a/Lib/ensurepip/_bundled/pip-6.0.2-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-6.0.2-py2.py3-none-any.whl deleted file mode 100644 index 312467d98be6893a4a965af2e2f53b63188676c0..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/pip-6.0.6-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-6.0.6-py2.py3-none-any.whl new file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..e2be1055f375447e667e34ce2ce018ec41a12221 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl new file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..ceeaa3cd5c326495156cb90ed3934ee5b712ad2e GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-8.2.1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-8.2.1-py2.py3-none-any.whl deleted file mode 100644 index fa3a6a54053285d701c58cae779ebd9a8a992118..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 GIT binary patch [stripped] -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sat Jan 3 23:17:30 2015 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 03 Jan 2015 22:17:30 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2323143=3A_Remove_c?= =?utf-8?q?ompatibility_with_OpenSSLs_older_than_0=2E9=2E8=2E?= Message-ID: <20150103221729.72575.62261@psf.io> https://hg.python.org/cpython/rev/e9f05a4a5f16 changeset: 94011:e9f05a4a5f16 parent: 94009:44cc97e8ec1b user: Antoine Pitrou date: Sat Jan 03 23:17:23 2015 +0100 summary: Issue #23143: Remove compatibility with OpenSSLs older than 0.9.8. (the last 0.9.7 release was in 2007) files: Misc/NEWS | 2 + Modules/_ssl.c | 53 -------------------------------------- 2 files changed, 2 insertions(+), 53 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -196,6 +196,8 @@ Library ------- +- Issue #23143: Remove compatibility with OpenSSLs older than 0.9.8. + - Issue #23132: Improve performance and introspection support of comparison methods created by functool.total_ordering. 
diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -162,13 +162,6 @@ #define X509_NAME_MAXLEN 256 -/* RAND_* APIs got added to OpenSSL in 0.9.5 */ -#if OPENSSL_VERSION_NUMBER >= 0x0090500fL -# define HAVE_OPENSSL_RAND 1 -#else -# undef HAVE_OPENSSL_RAND -#endif - /* SSL_CTX_clear_options() and SSL_clear_options() were first added in * OpenSSL 0.9.8m but do not appear in some 0.9.9-dev versions such the * 0.9.9 from "May 2008" that NetBSD 5.0 uses. */ @@ -182,28 +175,6 @@ * older SSL, but let's be safe */ #define PySSL_CB_MAXLEN 128 -/* SSL_get_finished got added to OpenSSL in 0.9.5 */ -#if OPENSSL_VERSION_NUMBER >= 0x0090500fL -# define HAVE_OPENSSL_FINISHED 1 -#else -# define HAVE_OPENSSL_FINISHED 0 -#endif - -/* ECDH support got added to OpenSSL in 0.9.8 */ -#if OPENSSL_VERSION_NUMBER < 0x0090800fL && !defined(OPENSSL_NO_ECDH) -# define OPENSSL_NO_ECDH -#endif - -/* compression support got added to OpenSSL in 0.9.8 */ -#if OPENSSL_VERSION_NUMBER < 0x0090800fL && !defined(OPENSSL_NO_COMP) -# define OPENSSL_NO_COMP -#endif - -/* X509_VERIFY_PARAM got added to OpenSSL in 0.9.8 */ -#if OPENSSL_VERSION_NUMBER >= 0x0090800fL -# define HAVE_OPENSSL_VERIFY_PARAM -#endif - typedef struct { PyObject_HEAD @@ -817,12 +788,7 @@ char buf[2048]; char *vptr; int len; - /* Issue #2973: ASN1_item_d2i() API changed in OpenSSL 0.9.6m */ -#if OPENSSL_VERSION_NUMBER >= 0x009060dfL const unsigned char *p; -#else - unsigned char *p; -#endif if (certificate == NULL) return peer_alt_names; @@ -1998,7 +1964,6 @@ Does the SSL shutdown handshake with the remote end, and returns\n\ the underlying socket object."); -#if HAVE_OPENSSL_FINISHED static PyObject * PySSL_tls_unique_cb(PySSLSocket *self) { @@ -2031,8 +1996,6 @@ \n\ If the TLS handshake is not yet complete, None is returned"); -#endif /* HAVE_OPENSSL_FINISHED */ - static PyGetSetDef ssl_getsetlist[] = { {"context", (getter) PySSL_get_context, (setter) PySSL_set_context, PySSL_set_context_doc}, @@ -2063,10 +2026,8 @@ {"compression", (PyCFunction)PySSL_compression, METH_NOARGS}, {"shutdown", (PyCFunction)PySSL_SSLshutdown, METH_NOARGS, PySSL_SSLshutdown_doc}, -#if HAVE_OPENSSL_FINISHED {"tls_unique_cb", (PyCFunction)PySSL_tls_unique_cb, METH_NOARGS, PySSL_tls_unique_cb_doc}, -#endif {NULL, NULL} }; @@ -2380,7 +2341,6 @@ return 0; } -#ifdef HAVE_OPENSSL_VERIFY_PARAM static PyObject * get_verify_flags(PySSLContext *self, void *c) { @@ -2418,7 +2378,6 @@ } return 0; } -#endif static PyObject * get_options(PySSLContext *self, void *c) @@ -3303,10 +3262,8 @@ (setter) set_check_hostname, NULL}, {"options", (getter) get_options, (setter) set_options, NULL}, -#ifdef HAVE_OPENSSL_VERIFY_PARAM {"verify_flags", (getter) get_verify_flags, (setter) set_verify_flags, NULL}, -#endif {"verify_mode", (getter) get_verify_mode, (setter) set_verify_mode, NULL}, {NULL}, /* sentinel */ @@ -3606,8 +3563,6 @@ }; -#ifdef HAVE_OPENSSL_RAND - /* helper routines for seeding the SSL PRNG */ static PyObject * PySSL_RAND_add(PyObject *self, PyObject *args) @@ -3745,8 +3700,6 @@ fails or if it does not provide enough data to seed PRNG."); #endif /* HAVE_RAND_EGD */ -#endif /* HAVE_OPENSSL_RAND */ - PyDoc_STRVAR(PySSL_get_default_verify_paths_doc, "get_default_verify_paths() -> tuple\n\ @@ -4132,7 +4085,6 @@ static PyMethodDef PySSL_methods[] = { {"_test_decode_cert", PySSL_test_decode_certificate, METH_VARARGS}, -#ifdef HAVE_OPENSSL_RAND {"RAND_add", PySSL_RAND_add, METH_VARARGS, PySSL_RAND_add_doc}, {"RAND_bytes", PySSL_RAND_bytes, METH_VARARGS, @@ 
-4145,7 +4097,6 @@ #endif {"RAND_status", (PyCFunction)PySSL_RAND_status, METH_NOARGS, PySSL_RAND_status_doc}, -#endif {"get_default_verify_paths", (PyCFunction)PySSL_get_default_verify_paths, METH_NOARGS, PySSL_get_default_verify_paths_doc}, #ifdef _MSC_VER @@ -4500,11 +4451,7 @@ Py_INCREF(r); PyModule_AddObject(m, "HAS_SNI", r); -#if HAVE_OPENSSL_FINISHED r = Py_True; -#else - r = Py_False; -#endif Py_INCREF(r); PyModule_AddObject(m, "HAS_TLS_UNIQUE", r); -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sat Jan 3 23:21:27 2015 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 03 Jan 2015 22:21:27 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2323143=3A_Remove_c?= =?utf-8?q?ompatibility_with_OpenSSLs_older_than_0=2E9=2E8=2E?= Message-ID: <20150103222126.8749.38486@psf.io> https://hg.python.org/cpython/rev/37c6fd09f71f changeset: 94012:37c6fd09f71f user: Antoine Pitrou date: Sat Jan 03 23:21:21 2015 +0100 summary: Issue #23143: Remove compatibility with OpenSSLs older than 0.9.8. (now the hashlib module) files: Modules/_hashopenssl.c | 12 ------------ 1 files changed, 0 insertions(+), 12 deletions(-) diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -31,10 +31,6 @@ #define HASH_OBJ_CONSTRUCTOR 0 #endif -/* Minimum OpenSSL version needed to support sha224 and higher. */ -#if defined(OPENSSL_VERSION_NUMBER) && (OPENSSL_VERSION_NUMBER >= 0x00908000) -#define _OPENSSL_SUPPORTS_SHA2 -#endif typedef struct { PyObject_HEAD @@ -56,12 +52,10 @@ DEFINE_CONSTS_FOR_NEW(md5) DEFINE_CONSTS_FOR_NEW(sha1) -#ifdef _OPENSSL_SUPPORTS_SHA2 DEFINE_CONSTS_FOR_NEW(sha224) DEFINE_CONSTS_FOR_NEW(sha256) DEFINE_CONSTS_FOR_NEW(sha384) DEFINE_CONSTS_FOR_NEW(sha512) -#endif static EVPobject * @@ -798,12 +792,10 @@ GEN_CONSTRUCTOR(md5) GEN_CONSTRUCTOR(sha1) -#ifdef _OPENSSL_SUPPORTS_SHA2 GEN_CONSTRUCTOR(sha224) GEN_CONSTRUCTOR(sha256) GEN_CONSTRUCTOR(sha384) GEN_CONSTRUCTOR(sha512) -#endif /* List of functions exported by this module */ @@ -815,12 +807,10 @@ #endif CONSTRUCTOR_METH_DEF(md5), CONSTRUCTOR_METH_DEF(sha1), -#ifdef _OPENSSL_SUPPORTS_SHA2 CONSTRUCTOR_METH_DEF(sha224), CONSTRUCTOR_METH_DEF(sha256), CONSTRUCTOR_METH_DEF(sha384), CONSTRUCTOR_METH_DEF(sha512), -#endif {NULL, NULL} /* Sentinel */ }; @@ -877,11 +867,9 @@ /* these constants are used by the convenience constructors */ INIT_CONSTRUCTOR_CONSTANTS(md5); INIT_CONSTRUCTOR_CONSTANTS(sha1); -#ifdef _OPENSSL_SUPPORTS_SHA2 INIT_CONSTRUCTOR_CONSTANTS(sha224); INIT_CONSTRUCTOR_CONSTANTS(sha256); INIT_CONSTRUCTOR_CONSTANTS(sha384); INIT_CONSTRUCTOR_CONSTANTS(sha512); -#endif return m; } -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 05:33:26 2015 From: python-checkins at python.org (zach.ware) Date: Sun, 04 Jan 2015 04:33:26 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Closes_=2323154=3A_Fix_unn?= =?utf-8?q?ecessary_recompilation_of_OpenSSL_on_Windows?= Message-ID: <20150104043326.22421.41033@psf.io> https://hg.python.org/cpython/rev/d53506fe31e1 changeset: 94013:d53506fe31e1 user: Zachary Ware date: Sat Jan 03 22:33:10 2015 -0600 summary: Closes #23154: Fix unnecessary recompilation of OpenSSL on Windows files: PCbuild/libeay.vcxproj | 6 ------ PCbuild/openssl.props | 6 ++++++ PCbuild/ssleay.vcxproj | 6 ------ 3 files changed, 6 insertions(+), 12 deletions(-) diff --git a/PCbuild/libeay.vcxproj b/PCbuild/libeay.vcxproj --- a/PCbuild/libeay.vcxproj +++ 
b/PCbuild/libeay.vcxproj @@ -42,12 +42,6 @@ - - - StaticLibrary - $(opensslDir)tmp32\libeay\ - $(opensslDir)tmp64\libeay\ - diff --git a/PCbuild/openssl.props b/PCbuild/openssl.props --- a/PCbuild/openssl.props +++ b/PCbuild/openssl.props @@ -2,6 +2,12 @@ + + StaticLibrary + $(opensslDir)tmp\$(ArchName)_$(Configuration)\$(ProjectName)\ + $(opensslDir)tmp\$(ArchName)\$(ProjectName)\ + + diff --git a/PCbuild/ssleay.vcxproj b/PCbuild/ssleay.vcxproj --- a/PCbuild/ssleay.vcxproj +++ b/PCbuild/ssleay.vcxproj @@ -42,12 +42,6 @@ - - - StaticLibrary - $(opensslDir)tmp32\ssleay\ - $(opensslDir)tmp64\ssleay\ - -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Sun Jan 4 09:22:10 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 04 Jan 2015 09:22:10 +0100 Subject: [Python-checkins] Daily reference leaks (37c6fd09f71f): sum=3 Message-ID: results for 37c6fd09f71f on branch "default" -------------------------------------------- test_collections leaked [-2, 2, 0] references, sum=0 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogYTpcfK', '-x'] From python-checkins at python.org Sun Jan 4 09:37:21 2015 From: python-checkins at python.org (gregory.p.smith) Date: Sun, 04 Jan 2015 08:37:21 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_fix_issue23157_-_time=5Fhashlib_hadn=27t_been_ported_to_?= =?utf-8?q?Python_3=2E?= Message-ID: <20150104083711.72565.42292@psf.io> https://hg.python.org/cpython/rev/b96985753613 changeset: 94015:b96985753613 parent: 94013:d53506fe31e1 parent: 94014:badb7e319ed0 user: Gregory P. Smith date: Sun Jan 04 00:36:59 2015 -0800 summary: fix issue23157 - time_hashlib hadn't been ported to Python 3. files: Lib/test/time_hashlib.py | 7 ++++--- 1 files changed, 4 insertions(+), 3 deletions(-) diff --git a/Lib/test/time_hashlib.py b/Lib/test/time_hashlib.py --- a/Lib/test/time_hashlib.py +++ b/Lib/test/time_hashlib.py @@ -1,7 +1,8 @@ # It's intended that this script be run by hand. It runs speed tests on # hashlib functions; it does not test for correctness. -import sys, time +import sys +import time import hashlib @@ -9,8 +10,8 @@ raise RuntimeError("eek, creatorFunc not overridden") def test_scaled_msg(scale, name): - iterations = 106201/scale * 20 - longStr = 'Z'*scale + iterations = 106201//scale * 20 + longStr = b'Z'*scale localCF = creatorFunc start = time.time() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 09:37:21 2015 From: python-checkins at python.org (gregory.p.smith) Date: Sun, 04 Jan 2015 08:37:21 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_fix_issue23157?= =?utf-8?q?_-_time=5Fhashlib_hadn=27t_been_ported_to_Python_3=2E?= Message-ID: <20150104083711.11579.88013@psf.io> https://hg.python.org/cpython/rev/badb7e319ed0 changeset: 94014:badb7e319ed0 branch: 3.4 parent: 94008:26e1b78448f9 user: Gregory P. Smith date: Sun Jan 04 00:36:04 2015 -0800 summary: fix issue23157 - time_hashlib hadn't been ported to Python 3. files: Lib/test/time_hashlib.py | 7 ++++--- 1 files changed, 4 insertions(+), 3 deletions(-) diff --git a/Lib/test/time_hashlib.py b/Lib/test/time_hashlib.py --- a/Lib/test/time_hashlib.py +++ b/Lib/test/time_hashlib.py @@ -1,7 +1,8 @@ # It's intended that this script be run by hand. It runs speed tests on # hashlib functions; it does not test for correctness. 
-import sys, time +import sys +import time import hashlib @@ -9,8 +10,8 @@ raise RuntimeError("eek, creatorFunc not overridden") def test_scaled_msg(scale, name): - iterations = 106201/scale * 20 - longStr = 'Z'*scale + iterations = 106201//scale * 20 + longStr = b'Z'*scale localCF = creatorFunc start = time.time() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 17:20:44 2015 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 04 Jan 2015 16:20:44 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_make_SSLv23_th?= =?utf-8?q?e_default_version_in_ftplib_=28closes_=2323111=29?= Message-ID: <20150104162021.125896.97738@psf.io> https://hg.python.org/cpython/rev/98ee845a139a changeset: 94016:98ee845a139a branch: 2.7 parent: 94010:1346a6013127 user: Benjamin Peterson date: Sun Jan 04 10:20:16 2015 -0600 summary: make SSLv23 the default version in ftplib (closes #23111) files: Doc/library/ftplib.rst | 2 +- Lib/ftplib.py | 4 ++-- Misc/NEWS | 2 ++ 3 files changed, 5 insertions(+), 3 deletions(-) diff --git a/Doc/library/ftplib.rst b/Doc/library/ftplib.rst --- a/Doc/library/ftplib.rst +++ b/Doc/library/ftplib.rst @@ -384,7 +384,7 @@ .. attribute:: FTP_TLS.ssl_version - The SSL version to use (defaults to *TLSv1*). + The SSL version to use (defaults to :attr:`ssl.PROTOCOL_SSLv23`). .. method:: FTP_TLS.auth() diff --git a/Lib/ftplib.py b/Lib/ftplib.py --- a/Lib/ftplib.py +++ b/Lib/ftplib.py @@ -638,7 +638,7 @@ '221 Goodbye.' >>> ''' - ssl_version = ssl.PROTOCOL_TLSv1 + ssl_version = ssl.PROTOCOL_SSLv23 def __init__(self, host='', user='', passwd='', acct='', keyfile=None, certfile=None, timeout=_GLOBAL_DEFAULT_TIMEOUT): @@ -656,7 +656,7 @@ '''Set up secure control connection by using TLS/SSL.''' if isinstance(self.sock, ssl.SSLSocket): raise ValueError("Already using TLS") - if self.ssl_version == ssl.PROTOCOL_TLSv1: + if self.ssl_version >= ssl.PROTOCOL_SSLv23: resp = self.voidcmd('AUTH TLS') else: resp = self.voidcmd('AUTH SSL') diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,6 +15,8 @@ Library ------- +- Issue #23111: Maximize compatibility in protocol versions of ftplib.FTP_TLS. + - Issue #23112: Fix SimpleHTTPServer to correctly carry the query string and fragment when it redirects to add a trailing slash. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 22:36:56 2015 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 04 Jan 2015 21:36:56 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_allow_a_SSLCon?= =?utf-8?q?text_to_be_given_to_ftplib=2EFTP=5FTLS?= Message-ID: <20150104213653.8765.37503@psf.io> https://hg.python.org/cpython/rev/e8342b3154d1 changeset: 94017:e8342b3154d1 branch: 2.7 user: Benjamin Peterson date: Sun Jan 04 15:36:31 2015 -0600 summary: allow a SSLContext to be given to ftplib.FTP_TLS files: Doc/library/ftplib.rst | 18 +++- Lib/ftplib.py | 22 ++++- Lib/test/test_ftplib.py | 115 +++++++++++++++++++++------ Misc/NEWS | 2 + 4 files changed, 120 insertions(+), 37 deletions(-) diff --git a/Doc/library/ftplib.rst b/Doc/library/ftplib.rst --- a/Doc/library/ftplib.rst +++ b/Doc/library/ftplib.rst @@ -55,18 +55,26 @@ *timeout* was added. -.. class:: FTP_TLS([host[, user[, passwd[, acct[, keyfile[, certfile[, timeout]]]]]]]) +.. 
class:: FTP_TLS([host[, user[, passwd[, acct[, keyfile[, certfile[, context[, timeout]]]]]]]]) - A :class:`FTP` subclass which adds TLS support to FTP as described in + A :class:`FTP` subclass which adds TLS support to FTP as described in :rfc:`4217`. Connect as usual to port 21 implicitly securing the FTP control connection before authenticating. Securing the data connection requires the user to - explicitly ask for it by calling the :meth:`prot_p` method. - *keyfile* and *certfile* are optional -- they can contain a PEM formatted - private key and certificate chain file name for the SSL connection. + explicitly ask for it by calling the :meth:`prot_p` method. *context* + is a :class:`ssl.SSLContext` object which allows bundling SSL configuration + options, certificates and private keys into a single (potentially + long-lived) structure. Please read :ref:`ssl-security` for best practices. + + *keyfile* and *certfile* are a legacy alternative to *context* -- they + can point to PEM-formatted private key and certificate chain files + (respectively) for the SSL connection. .. versionadded:: 2.7 + .. versionchanged:: 2.7.10 + The *context* parameter was added. + Here's a sample session using the :class:`FTP_TLS` class: >>> from ftplib import FTP_TLS diff --git a/Lib/ftplib.py b/Lib/ftplib.py --- a/Lib/ftplib.py +++ b/Lib/ftplib.py @@ -641,9 +641,21 @@ ssl_version = ssl.PROTOCOL_SSLv23 def __init__(self, host='', user='', passwd='', acct='', keyfile=None, - certfile=None, timeout=_GLOBAL_DEFAULT_TIMEOUT): + certfile=None, context=None, + timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): + if context is not None and keyfile is not None: + raise ValueError("context and keyfile arguments are mutually " + "exclusive") + if context is not None and certfile is not None: + raise ValueError("context and certfile arguments are mutually " + "exclusive") self.keyfile = keyfile self.certfile = certfile + if context is None: + context = ssl._create_stdlib_context(self.ssl_version, + certfile=certfile, + keyfile=keyfile) + self.context = context self._prot_p = False FTP.__init__(self, host, user, passwd, acct, timeout) @@ -660,8 +672,8 @@ resp = self.voidcmd('AUTH TLS') else: resp = self.voidcmd('AUTH SSL') - self.sock = ssl.wrap_socket(self.sock, self.keyfile, self.certfile, - ssl_version=self.ssl_version) + self.sock = self.context.wrap_socket(self.sock, + server_hostname=self.host) self.file = self.sock.makefile(mode='rb') return resp @@ -692,8 +704,8 @@ def ntransfercmd(self, cmd, rest=None): conn, size = FTP.ntransfercmd(self, cmd, rest) if self._prot_p: - conn = ssl.wrap_socket(conn, self.keyfile, self.certfile, - ssl_version=self.ssl_version) + conn = self.context.wrap_socket(conn, + server_hostname=self.host) return conn, size def retrbinary(self, cmd, callback, blocksize=8192, rest=None): diff --git a/Lib/test/test_ftplib.py b/Lib/test/test_ftplib.py --- a/Lib/test/test_ftplib.py +++ b/Lib/test/test_ftplib.py @@ -20,7 +20,7 @@ from test.test_support import HOST, HOSTv6 threading = test_support.import_module('threading') - +TIMEOUT = 3 # the dummy data returned by server over the data channel when # RETR, LIST and NLST commands are issued RETR_DATA = 'abcde12345\r\n' * 1000 @@ -223,6 +223,7 @@ self.active = False self.active_lock = threading.Lock() self.host, self.port = self.socket.getsockname()[:2] + self.handler_instance = None def start(self): assert not self.active @@ -246,8 +247,7 @@ def handle_accept(self): conn, addr = self.accept() - self.handler = self.handler(conn) - self.close() + 
self.handler_instance = self.handler(conn) def handle_connect(self): self.close() @@ -262,7 +262,8 @@ if ssl is not None: - CERTFILE = os.path.join(os.path.dirname(__file__), "keycert.pem") + CERTFILE = os.path.join(os.path.dirname(__file__), "keycert3.pem") + CAFILE = os.path.join(os.path.dirname(__file__), "pycacert.pem") class SSLConnection(object, asyncore.dispatcher): """An asyncore.dispatcher subclass supporting TLS/SSL.""" @@ -271,23 +272,25 @@ _ssl_closing = False def secure_connection(self): - self.socket = ssl.wrap_socket(self.socket, suppress_ragged_eofs=False, - certfile=CERTFILE, server_side=True, - do_handshake_on_connect=False, - ssl_version=ssl.PROTOCOL_SSLv23) + socket = ssl.wrap_socket(self.socket, suppress_ragged_eofs=False, + certfile=CERTFILE, server_side=True, + do_handshake_on_connect=False, + ssl_version=ssl.PROTOCOL_SSLv23) + self.del_channel() + self.set_socket(socket) self._ssl_accepting = True def _do_ssl_handshake(self): try: self.socket.do_handshake() - except ssl.SSLError, err: + except ssl.SSLError as err: if err.args[0] in (ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE): return elif err.args[0] == ssl.SSL_ERROR_EOF: return self.handle_close() raise - except socket.error, err: + except socket.error as err: if err.args[0] == errno.ECONNABORTED: return self.handle_close() else: @@ -297,18 +300,21 @@ self._ssl_closing = True try: self.socket = self.socket.unwrap() - except ssl.SSLError, err: + except ssl.SSLError as err: if err.args[0] in (ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE): return - except socket.error, err: + except socket.error as err: # Any "socket error" corresponds to a SSL_ERROR_SYSCALL return # from OpenSSL's SSL_shutdown(), corresponding to a # closed socket condition. See also: # http://www.mail-archive.com/openssl-users at openssl.org/msg60710.html pass self._ssl_closing = False - super(SSLConnection, self).close() + if getattr(self, '_ccc', False) is False: + super(SSLConnection, self).close() + else: + pass def handle_read_event(self): if self._ssl_accepting: @@ -329,7 +335,7 @@ def send(self, data): try: return super(SSLConnection, self).send(data) - except ssl.SSLError, err: + except ssl.SSLError as err: if err.args[0] in (ssl.SSL_ERROR_EOF, ssl.SSL_ERROR_ZERO_RETURN, ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE): @@ -339,13 +345,13 @@ def recv(self, buffer_size): try: return super(SSLConnection, self).recv(buffer_size) - except ssl.SSLError, err: + except ssl.SSLError as err: if err.args[0] in (ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE): - return '' + return b'' if err.args[0] in (ssl.SSL_ERROR_EOF, ssl.SSL_ERROR_ZERO_RETURN): self.handle_close() - return '' + return b'' raise def handle_error(self): @@ -355,6 +361,8 @@ if (isinstance(self.socket, ssl.SSLSocket) and self.socket._sslobj is not None): self._do_ssl_shutdown() + else: + super(SSLConnection, self).close() class DummyTLS_DTPHandler(SSLConnection, DummyDTPHandler): @@ -462,12 +470,12 @@ def test_rename(self): self.client.rename('a', 'b') - self.server.handler.next_response = '200' + self.server.handler_instance.next_response = '200' self.assertRaises(ftplib.error_reply, self.client.rename, 'a', 'b') def test_delete(self): self.client.delete('foo') - self.server.handler.next_response = '199' + self.server.handler_instance.next_response = '199' self.assertRaises(ftplib.error_reply, self.client.delete, 'foo') def test_size(self): @@ -515,7 +523,7 @@ def test_storbinary(self): f = StringIO.StringIO(RETR_DATA) self.client.storbinary('stor', f) - 
self.assertEqual(self.server.handler.last_received_data, RETR_DATA) + self.assertEqual(self.server.handler_instance.last_received_data, RETR_DATA) # test new callback arg flag = [] f.seek(0) @@ -527,12 +535,12 @@ for r in (30, '30'): f.seek(0) self.client.storbinary('stor', f, rest=r) - self.assertEqual(self.server.handler.rest, str(r)) + self.assertEqual(self.server.handler_instance.rest, str(r)) def test_storlines(self): f = StringIO.StringIO(RETR_DATA.replace('\r\n', '\n')) self.client.storlines('stor', f) - self.assertEqual(self.server.handler.last_received_data, RETR_DATA) + self.assertEqual(self.server.handler_instance.last_received_data, RETR_DATA) # test new callback arg flag = [] f.seek(0) @@ -551,14 +559,14 @@ def test_makeport(self): self.client.makeport() # IPv4 is in use, just make sure send_eprt has not been used - self.assertEqual(self.server.handler.last_received_cmd, 'port') + self.assertEqual(self.server.handler_instance.last_received_cmd, 'port') def test_makepasv(self): host, port = self.client.makepasv() conn = socket.create_connection((host, port), 10) conn.close() # IPv4 is in use, just make sure send_epsv has not been used - self.assertEqual(self.server.handler.last_received_cmd, 'pasv') + self.assertEqual(self.server.handler_instance.last_received_cmd, 'pasv') def test_line_too_long(self): self.assertRaises(ftplib.Error, self.client.sendcmd, @@ -600,13 +608,13 @@ def test_makeport(self): self.client.makeport() - self.assertEqual(self.server.handler.last_received_cmd, 'eprt') + self.assertEqual(self.server.handler_instance.last_received_cmd, 'eprt') def test_makepasv(self): host, port = self.client.makepasv() conn = socket.create_connection((host, port), 10) conn.close() - self.assertEqual(self.server.handler.last_received_cmd, 'epsv') + self.assertEqual(self.server.handler_instance.last_received_cmd, 'epsv') def test_transfer(self): def retr(): @@ -642,7 +650,7 @@ def setUp(self): self.server = DummyTLS_FTPServer((HOST, 0)) self.server.start() - self.client = ftplib.FTP_TLS(timeout=10) + self.client = ftplib.FTP_TLS(timeout=TIMEOUT) self.client.connect(self.server.host, self.server.port) def tearDown(self): @@ -695,6 +703,59 @@ finally: self.client.ssl_version = ssl.PROTOCOL_TLSv1 + def test_context(self): + self.client.quit() + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + self.assertRaises(ValueError, ftplib.FTP_TLS, keyfile=CERTFILE, + context=ctx) + self.assertRaises(ValueError, ftplib.FTP_TLS, certfile=CERTFILE, + context=ctx) + self.assertRaises(ValueError, ftplib.FTP_TLS, certfile=CERTFILE, + keyfile=CERTFILE, context=ctx) + + self.client = ftplib.FTP_TLS(context=ctx, timeout=TIMEOUT) + self.client.connect(self.server.host, self.server.port) + self.assertNotIsInstance(self.client.sock, ssl.SSLSocket) + self.client.auth() + self.assertIs(self.client.sock.context, ctx) + self.assertIsInstance(self.client.sock, ssl.SSLSocket) + + self.client.prot_p() + sock = self.client.transfercmd('list') + try: + self.assertIs(sock.context, ctx) + self.assertIsInstance(sock, ssl.SSLSocket) + finally: + sock.close() + + def test_check_hostname(self): + self.client.quit() + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + ctx.verify_mode = ssl.CERT_REQUIRED + ctx.check_hostname = True + ctx.load_verify_locations(CAFILE) + self.client = ftplib.FTP_TLS(context=ctx, timeout=TIMEOUT) + + # 127.0.0.1 doesn't match SAN + self.client.connect(self.server.host, self.server.port) + with self.assertRaises(ssl.CertificateError): + self.client.auth() + # exception quits connection + + 
self.client.connect(self.server.host, self.server.port) + self.client.prot_p() + with self.assertRaises(ssl.CertificateError): + self.client.transfercmd("list").close() + self.client.quit() + + self.client.connect("localhost", self.server.port) + self.client.auth() + self.client.quit() + + self.client.connect("localhost", self.server.port) + self.client.prot_p() + self.client.transfercmd("list").close() + class TestTimeouts(TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,6 +15,8 @@ Library ------- +- Backport the context argument to ftplib.FTP_TLS. + - Issue #23111: Maximize compatibility in protocol versions of ftplib.FTP_TLS. - Issue #23112: Fix SimpleHTTPServer to correctly carry the query string and -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 23:07:07 2015 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 04 Jan 2015 22:07:07 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E2=29=3A_add_some_overf?= =?utf-8?q?low_checks_before_multiplying_=28closes_=2323165=29?= Message-ID: <20150104220625.72551.67814@psf.io> https://hg.python.org/cpython/rev/1ce98e85929d changeset: 94018:1ce98e85929d branch: 3.2 parent: 93995:1806a00fd6b8 user: Benjamin Peterson date: Sun Jan 04 16:03:17 2015 -0600 summary: add some overflow checks before multiplying (closes #23165) files: Misc/NEWS | 3 +++ Python/fileutils.c | 16 +++++++++++++--- 2 files changed, 16 insertions(+), 3 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #23165: Perform overflow checks before allocating memory in the + _Py_char2wchar function. + - Issue #19529: Fix a potential crash in converting Unicode objects to wchar_t when Py_UNICODE is 4 bytes but wchar_t is 2 bytes, for example on AIX. diff --git a/Python/fileutils.c b/Python/fileutils.c --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -169,8 +169,11 @@ wchar_t *res; unsigned char *in; wchar_t *out; + size_t argsize = strlen(arg) + 1; - res = PyMem_Malloc((strlen(arg)+1)*sizeof(wchar_t)); + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + return NULL; + res = PyMem_Malloc(argsize*sizeof(wchar_t)); if (!res) return NULL; @@ -250,10 +253,15 @@ argsize = mbstowcs(NULL, arg, 0); #endif if (argsize != (size_t)-1) { - res = (wchar_t *)PyMem_Malloc((argsize+1)*sizeof(wchar_t)); + if (argsize == PY_SSIZE_T_MAX) + goto oom; + argsize += 1; + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + goto oom; + res = (wchar_t *)PyMem_Malloc(argsize*sizeof(wchar_t)); if (!res) goto oom; - count = mbstowcs(res, arg, argsize+1); + count = mbstowcs(res, arg, argsize); if (count != (size_t)-1) { wchar_t *tmp; /* Only use the result if it contains no @@ -276,6 +284,8 @@ /* Overallocate; as multi-byte characters are in the argument, the actual output could use less memory. 
*/ argsize = strlen(arg) + 1; + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + goto oom; res = (wchar_t*)PyMem_Malloc(argsize*sizeof(wchar_t)); if (!res) goto oom; -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 23:07:07 2015 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 04 Jan 2015 22:07:07 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?b?KTogbWVyZ2UgMy40ICgjMjMxNjUp?= Message-ID: <20150104220625.11585.65183@psf.io> https://hg.python.org/cpython/rev/8c4fb312e15d changeset: 94021:8c4fb312e15d parent: 94015:b96985753613 parent: 94020:d45e16b1ed86 user: Benjamin Peterson date: Sun Jan 04 16:06:14 2015 -0600 summary: merge 3.4 (#23165) files: Misc/NEWS | 3 +++ Python/fileutils.c | 16 +++++++++++++--- 2 files changed, 16 insertions(+), 3 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -193,6 +193,9 @@ exception. In versions prior to 3.5, '#' with 'c' had no effect. Now specifying it is an error. Patch by Torsten Landschoff. +- Issue #23165: Perform overflow checks before allocating memory in the + _Py_char2wchar function. + Library ------- diff --git a/Python/fileutils.c b/Python/fileutils.c --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -220,8 +220,11 @@ wchar_t *res; unsigned char *in; wchar_t *out; + size_t argsize = strlen(arg) + 1; - res = PyMem_RawMalloc((strlen(arg)+1)*sizeof(wchar_t)); + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + return NULL; + res = PyMem_RawMalloc(argsize*sizeof(wchar_t)); if (!res) return NULL; @@ -305,10 +308,15 @@ argsize = mbstowcs(NULL, arg, 0); #endif if (argsize != (size_t)-1) { - res = (wchar_t *)PyMem_RawMalloc((argsize+1)*sizeof(wchar_t)); + if (argsize == PY_SSIZE_T_MAX) + goto oom; + argsize += 1; + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + goto oom; + res = (wchar_t *)PyMem_RawMalloc(argsize*sizeof(wchar_t)); if (!res) goto oom; - count = mbstowcs(res, arg, argsize+1); + count = mbstowcs(res, arg, argsize); if (count != (size_t)-1) { wchar_t *tmp; /* Only use the result if it contains no @@ -331,6 +339,8 @@ /* Overallocate; as multi-byte characters are in the argument, the actual output could use less memory. */ argsize = strlen(arg) + 1; + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + goto oom; res = (wchar_t*)PyMem_RawMalloc(argsize*sizeof(wchar_t)); if (!res) goto oom; -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 23:07:07 2015 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 04 Jan 2015 22:07:07 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4yIC0+IDMuMyk6?= =?utf-8?q?_merge_3=2E2_=28closes_=2323165=29?= Message-ID: <20150104220625.11565.88480@psf.io> https://hg.python.org/cpython/rev/d1af6f3a8ce3 changeset: 94019:d1af6f3a8ce3 branch: 3.3 parent: 93996:3b202cc79a38 parent: 94018:1ce98e85929d user: Benjamin Peterson date: Sun Jan 04 16:03:59 2015 -0600 summary: merge 3.2 (closes #23165) files: Misc/NEWS | 3 +++ Python/fileutils.c | 16 +++++++++++++--- 2 files changed, 16 insertions(+), 3 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -23,6 +23,9 @@ - Issue #22518: Fix integer overflow issues in latin-1 encoding. +- Issue #23165: Perform overflow checks before allocating memory in the + _Py_char2wchar function. 
+ Library ------- diff --git a/Python/fileutils.c b/Python/fileutils.c --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -201,8 +201,11 @@ wchar_t *res; unsigned char *in; wchar_t *out; + size_t argsize = strlen(arg) + 1; - res = PyMem_Malloc((strlen(arg)+1)*sizeof(wchar_t)); + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + return NULL; + res = PyMem_Malloc(argsize*sizeof(wchar_t)); if (!res) return NULL; @@ -284,10 +287,15 @@ argsize = mbstowcs(NULL, arg, 0); #endif if (argsize != (size_t)-1) { - res = (wchar_t *)PyMem_Malloc((argsize+1)*sizeof(wchar_t)); + if (argsize == PY_SSIZE_T_MAX) + goto oom; + argsize += 1; + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + goto oom; + res = (wchar_t *)PyMem_Malloc(argsize*sizeof(wchar_t)); if (!res) goto oom; - count = mbstowcs(res, arg, argsize+1); + count = mbstowcs(res, arg, argsize); if (count != (size_t)-1) { wchar_t *tmp; /* Only use the result if it contains no @@ -310,6 +318,8 @@ /* Overallocate; as multi-byte characters are in the argument, the actual output could use less memory. */ argsize = strlen(arg) + 1; + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + goto oom; res = (wchar_t*)PyMem_Malloc(argsize*sizeof(wchar_t)); if (!res) goto oom; -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 23:07:07 2015 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 04 Jan 2015 22:07:07 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4zIC0+IDMuNCk6?= =?utf-8?q?_merge_3=2E3_=28closes_=2323165=29?= Message-ID: <20150104220625.72551.16592@psf.io> https://hg.python.org/cpython/rev/d45e16b1ed86 changeset: 94020:d45e16b1ed86 branch: 3.4 parent: 94014:badb7e319ed0 parent: 94019:d1af6f3a8ce3 user: Benjamin Peterson date: Sun Jan 04 16:05:39 2015 -0600 summary: merge 3.3 (closes #23165) files: Misc/NEWS | 3 +++ Python/fileutils.c | 16 +++++++++++++--- 2 files changed, 16 insertions(+), 3 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -38,6 +38,9 @@ - Issue #22518: Fix integer overflow issues in latin-1 encoding. +- Issue #23165: Perform overflow checks before allocating memory in the + _Py_char2wchar function. + Library ------- diff --git a/Python/fileutils.c b/Python/fileutils.c --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -220,8 +220,11 @@ wchar_t *res; unsigned char *in; wchar_t *out; + size_t argsize = strlen(arg) + 1; - res = PyMem_RawMalloc((strlen(arg)+1)*sizeof(wchar_t)); + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + return NULL; + res = PyMem_RawMalloc(argsize*sizeof(wchar_t)); if (!res) return NULL; @@ -303,10 +306,15 @@ argsize = mbstowcs(NULL, arg, 0); #endif if (argsize != (size_t)-1) { - res = (wchar_t *)PyMem_RawMalloc((argsize+1)*sizeof(wchar_t)); + if (argsize == PY_SSIZE_T_MAX) + goto oom; + argsize += 1; + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + goto oom; + res = (wchar_t *)PyMem_RawMalloc(argsize*sizeof(wchar_t)); if (!res) goto oom; - count = mbstowcs(res, arg, argsize+1); + count = mbstowcs(res, arg, argsize); if (count != (size_t)-1) { wchar_t *tmp; /* Only use the result if it contains no @@ -329,6 +337,8 @@ /* Overallocate; as multi-byte characters are in the argument, the actual output could use less memory. 
*/ argsize = strlen(arg) + 1; + if (argsize > PY_SSIZE_T_MAX/sizeof(wchar_t)) + goto oom; res = (wchar_t*)PyMem_RawMalloc(argsize*sizeof(wchar_t)); if (!res) goto oom; -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 23:32:01 2015 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 04 Jan 2015 22:32:01 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?b?KTogbWVyZ2UgMy40ICgjMjMxNjcp?= Message-ID: <20150104223023.11591.14734@psf.io> https://hg.python.org/cpython/rev/826831a9a376 changeset: 94023:826831a9a376 parent: 94021:8c4fb312e15d parent: 94022:8ac23d3242b4 user: Benjamin Peterson date: Sun Jan 04 16:30:13 2015 -0600 summary: merge 3.4 (#23167) files: Doc/library/marshal.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/marshal.rst b/Doc/library/marshal.rst --- a/Doc/library/marshal.rst +++ b/Doc/library/marshal.rst @@ -106,7 +106,7 @@ format, version 1 shares interned strings and version 2 uses a binary format for floating point numbers. Version 3 adds support for object instancing and recursion. - The current version is 3. + The current version is 4. .. rubric:: Footnotes -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 4 23:32:01 2015 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 04 Jan 2015 22:32:01 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_the_current_ma?= =?utf-8?q?rshal_version_is_4_=28closes_=2323167=29?= Message-ID: <20150104223022.11575.21099@psf.io> https://hg.python.org/cpython/rev/8ac23d3242b4 changeset: 94022:8ac23d3242b4 branch: 3.4 parent: 94020:d45e16b1ed86 user: Benjamin Peterson date: Sun Jan 04 16:29:48 2015 -0600 summary: the current marshal version is 4 (closes #23167) Patch by Dmitry Kazakov. files: Doc/library/marshal.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/marshal.rst b/Doc/library/marshal.rst --- a/Doc/library/marshal.rst +++ b/Doc/library/marshal.rst @@ -106,7 +106,7 @@ format, version 1 shares interned strings and version 2 uses a binary format for floating point numbers. Version 3 adds support for object instancing and recursion. - The current version is 3. + The current version is 4. .. rubric:: Footnotes -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Mon Jan 5 08:19:21 2015 From: python-checkins at python.org (berker.peksag) Date: Mon, 05 Jan 2015 07:19:21 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2318644=3A_Fix_a_ResourceWarning_in_formatter=2Et?= =?utf-8?b?ZXN0KCku?= Message-ID: <20150105071805.22395.71925@psf.io> https://hg.python.org/cpython/rev/f374e4e6d04b changeset: 94025:f374e4e6d04b parent: 94023:826831a9a376 parent: 94024:f859a61f5853 user: Berker Peksag date: Mon Jan 05 09:20:07 2015 +0200 summary: Issue #18644: Fix a ResourceWarning in formatter.test(). Patch by Vajrasky Kok. 
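The shape of the #18644 fix is a general pattern: close a file object only if the function opened it itself, and leave sys.stdin alone. A minimal standalone sketch of that pattern follows; the function name is invented for illustration.

    import sys

    def count_lines(path=None):
        # Fall back to standard input when no path is given, as formatter.test() does.
        fp = open(path) if path else sys.stdin
        try:
            return sum(1 for _ in fp)
        finally:
            # Close only what we opened; closing sys.stdin would surprise the caller.
            if fp is not sys.stdin:
                fp.close()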
files: Lib/formatter.py | 14 +++++++++----- 1 files changed, 9 insertions(+), 5 deletions(-) diff --git a/Lib/formatter.py b/Lib/formatter.py --- a/Lib/formatter.py +++ b/Lib/formatter.py @@ -436,11 +436,15 @@ fp = open(sys.argv[1]) else: fp = sys.stdin - for line in fp: - if line == '\n': - f.end_paragraph(1) - else: - f.add_flowing_data(line) + try: + for line in fp: + if line == '\n': + f.end_paragraph(1) + else: + f.add_flowing_data(line) + finally: + if fp is not sys.stdin: + fp.close() f.end_paragraph(0) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Mon Jan 5 08:19:21 2015 From: python-checkins at python.org (berker.peksag) Date: Mon, 05 Jan 2015 07:19:21 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzE4NjQ0?= =?utf-8?q?=3A_Fix_a_ResourceWarning_in_formatter=2Etest=28=29=2E?= Message-ID: <20150105071804.8761.47447@psf.io> https://hg.python.org/cpython/rev/f859a61f5853 changeset: 94024:f859a61f5853 branch: 3.4 parent: 94022:8ac23d3242b4 user: Berker Peksag date: Mon Jan 05 09:19:40 2015 +0200 summary: Issue #18644: Fix a ResourceWarning in formatter.test(). Patch by Vajrasky Kok. files: Lib/formatter.py | 14 +++++++++----- 1 files changed, 9 insertions(+), 5 deletions(-) diff --git a/Lib/formatter.py b/Lib/formatter.py --- a/Lib/formatter.py +++ b/Lib/formatter.py @@ -436,11 +436,15 @@ fp = open(sys.argv[1]) else: fp = sys.stdin - for line in fp: - if line == '\n': - f.end_paragraph(1) - else: - f.add_flowing_data(line) + try: + for line in fp: + if line == '\n': + f.end_paragraph(1) + else: + f.add_flowing_data(line) + finally: + if fp is not sys.stdin: + fp.close() f.end_paragraph(0) -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Mon Jan 5 09:11:30 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 05 Jan 2015 09:11:30 +0100 Subject: [Python-checkins] Daily reference leaks (826831a9a376): sum=6 Message-ID: results for 826831a9a376 on branch "default" -------------------------------------------- test_asyncio leaked [0, 0, 3] memory blocks, sum=3 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogqFVfLC', '-x'] From python-checkins at python.org Mon Jan 5 10:06:45 2015 From: python-checkins at python.org (ned.deily) Date: Mon, 05 Jan 2015 09:06:45 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2322165=3A_merge_from_3=2E4?= Message-ID: <20150105090607.125884.47082@psf.io> https://hg.python.org/cpython/rev/85258e08b69b changeset: 94027:85258e08b69b parent: 94025:f374e4e6d04b parent: 94026:1bc41bbbe02d user: Ned Deily date: Mon Jan 05 01:05:36 2015 -0800 summary: Issue #22165: merge from 3.4 files: Lib/test/test_httpservers.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_httpservers.py b/Lib/test/test_httpservers.py --- a/Lib/test/test_httpservers.py +++ b/Lib/test/test_httpservers.py @@ -269,6 +269,7 @@ self.assertEqual(data, body) return body + @support.requires_mac_ver(10, 5) @unittest.skipUnless(support.TESTFN_UNDECODABLE, 'need support.TESTFN_UNDECODABLE') def test_undecodable_filename(self): -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Mon Jan 5 10:06:45 2015 From: python-checkins at python.org (ned.deily) Date: Mon, 05 Jan 2015 09:06:45 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIyMTY1?= 
=?utf-8?q?=3A_Skip_test=5Fundecodable=5Ffilename_on_OS_X_prior_to_10=2E5?= =?utf-8?q?=2E?= Message-ID: <20150105090606.22417.1634@psf.io> https://hg.python.org/cpython/rev/1bc41bbbe02d changeset: 94026:1bc41bbbe02d branch: 3.4 parent: 94024:f859a61f5853 user: Ned Deily date: Mon Jan 05 01:02:30 2015 -0800 summary: Issue #22165: Skip test_undecodable_filename on OS X prior to 10.5. 10.4 systems do not allow creation of files with such filenames. files: Lib/test/test_httpservers.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_httpservers.py b/Lib/test/test_httpservers.py --- a/Lib/test/test_httpservers.py +++ b/Lib/test/test_httpservers.py @@ -269,6 +269,7 @@ self.assertEqual(data, body) return body + @support.requires_mac_ver(10, 5) @unittest.skipUnless(support.TESTFN_UNDECODABLE, 'need support.TESTFN_UNDECODABLE') def test_undecodable_filename(self): -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Mon Jan 5 21:39:16 2015 From: python-checkins at python.org (benjamin.peterson) Date: Mon, 05 Jan 2015 20:39:16 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?b?KTogbWVyZ2UgMy40?= Message-ID: <20150105203912.8751.41009@psf.io> https://hg.python.org/cpython/rev/a12da014d394 changeset: 94030:a12da014d394 parent: 94027:85258e08b69b parent: 94029:57a8befe75d4 user: Benjamin Peterson date: Mon Jan 05 14:39:06 2015 -0600 summary: merge 3.4 files: Doc/extending/extending.rst | 15 ++++++++++----- 1 files changed, 10 insertions(+), 5 deletions(-) diff --git a/Doc/extending/extending.rst b/Doc/extending/extending.rst --- a/Doc/extending/extending.rst +++ b/Doc/extending/extending.rst @@ -20,12 +20,17 @@ The compilation of an extension module depends on its intended use as well as on your system setup; details are given in later chapters. -Do note that if your use case is calling C library functions or system calls, -you should consider using the :mod:`ctypes` module rather than writing custom -C code. Not only does :mod:`ctypes` let you write Python code to interface -with C code, but it is more portable between implementations of Python than -writing and compiling an extension module which typically ties you to CPython. +.. note:: + The C extension interface is specific to CPython, and extension modules do + not work on other Python implementations. In many cases, it is possible to + avoid writing C extensions and preserve portability to other implementations. + For example, if your use case is calling C library functions or system calls, + you should consider using the :mod:`ctypes` module or the `cffi + `_ library rather than writing custom C code. + These modules let you write Python code to interface with C code and are more + portable between implementations of Python than writing and compiling a C + extension module. .. 
_extending-simpleexample: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Mon Jan 5 21:39:16 2015 From: python-checkins at python.org (benjamin.peterson) Date: Mon, 05 Jan 2015 20:39:16 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_emphasize_that?= =?utf-8?q?_cffi_is_better_than_extension_modules_for_portability?= Message-ID: <20150105203912.72567.41320@psf.io> https://hg.python.org/cpython/rev/57a8befe75d4 changeset: 94029:57a8befe75d4 branch: 3.4 parent: 94026:1bc41bbbe02d user: Benjamin Peterson date: Mon Jan 05 14:38:46 2015 -0600 summary: emphasize that cffi is better than extension modules for portability files: Doc/extending/extending.rst | 15 ++++++++++----- 1 files changed, 10 insertions(+), 5 deletions(-) diff --git a/Doc/extending/extending.rst b/Doc/extending/extending.rst --- a/Doc/extending/extending.rst +++ b/Doc/extending/extending.rst @@ -20,12 +20,17 @@ The compilation of an extension module depends on its intended use as well as on your system setup; details are given in later chapters. -Do note that if your use case is calling C library functions or system calls, -you should consider using the :mod:`ctypes` module rather than writing custom -C code. Not only does :mod:`ctypes` let you write Python code to interface -with C code, but it is more portable between implementations of Python than -writing and compiling an extension module which typically ties you to CPython. +.. note:: + The C extension interface is specific to CPython, and extension modules do + not work on other Python implementations. In many cases, it is possible to + avoid writing C extensions and preserve portability to other implementations. + For example, if your use case is calling C library functions or system calls, + you should consider using the :mod:`ctypes` module or the `cffi + `_ library rather than writing custom C code. + These modules let you write Python code to interface with C code and are more + portable between implementations of Python than writing and compiling a C + extension module. .. _extending-simpleexample: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Mon Jan 5 21:39:16 2015 From: python-checkins at python.org (benjamin.peterson) Date: Mon, 05 Jan 2015 20:39:16 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_emphasize_that?= =?utf-8?q?_cffi_is_better_than_extension_modules_for_portability?= Message-ID: <20150105203912.125894.46618@psf.io> https://hg.python.org/cpython/rev/518f40815684 changeset: 94028:518f40815684 branch: 2.7 parent: 94017:e8342b3154d1 user: Benjamin Peterson date: Mon Jan 05 14:38:46 2015 -0600 summary: emphasize that cffi is better than extension modules for portability files: Doc/extending/extending.rst | 15 ++++++++++----- 1 files changed, 10 insertions(+), 5 deletions(-) diff --git a/Doc/extending/extending.rst b/Doc/extending/extending.rst --- a/Doc/extending/extending.rst +++ b/Doc/extending/extending.rst @@ -20,12 +20,17 @@ The compilation of an extension module depends on its intended use as well as on your system setup; details are given in later chapters. -Do note that if your use case is calling C library functions or system calls, -you should consider using the :mod:`ctypes` module rather than writing custom -C code. Not only does :mod:`ctypes` let you write Python code to interface -with C code, but it is more portable between implementations of Python than -writing and compiling an extension module which typically ties you to CPython. 
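For readers who have not tried the ctypes route recommended in that paragraph, a minimal sketch is shown below: it calls a C library routine without compiling any extension module. It assumes a Unix-like system where the C library can be located; on Windows the library name and lookup differ.

    import ctypes
    import ctypes.util

    # Locate and load the platform C library, then describe strlen()'s signature.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))
    libc.strlen.argtypes = [ctypes.c_char_p]
    libc.strlen.restype = ctypes.c_size_t

    print(libc.strlen(b"hello"))   # 5, computed by libc, with no C code compiled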
+.. note:: + The C extension interface is specific to CPython, and extension modules do + not work on other Python implementations. In many cases, it is possible to + avoid writing C extensions and preserve portability to other implementations. + For example, if your use case is calling C library functions or system calls, + you should consider using the :mod:`ctypes` module or the `cffi + `_ library rather than writing custom C code. + These modules let you write Python code to interface with C code and are more + portable between implementations of Python than writing and compiling a C + extension module. .. _extending-simpleexample: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 01:05:48 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 00:05:48 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMDQ2?= =?utf-8?q?=3A_Expose_the_BaseEventLoop_class_in_the_asyncio_namespace?= Message-ID: <20150106000541.22395.88746@psf.io> https://hg.python.org/cpython/rev/ddf6b78faed9 changeset: 94031:ddf6b78faed9 branch: 3.4 parent: 94029:57a8befe75d4 user: Victor Stinner date: Tue Jan 06 01:03:58 2015 +0100 summary: Issue #23046: Expose the BaseEventLoop class in the asyncio namespace files: Lib/asyncio/__init__.py | 4 +++- Lib/asyncio/base_events.py | 2 +- 2 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/__init__.py b/Lib/asyncio/__init__.py --- a/Lib/asyncio/__init__.py +++ b/Lib/asyncio/__init__.py @@ -18,6 +18,7 @@ import _overlapped # Will also be exported. # This relies on each of the submodules having an __all__ variable. +from .base_events import * from .coroutines import * from .events import * from .futures import * @@ -29,7 +30,8 @@ from .tasks import * from .transports import * -__all__ = (coroutines.__all__ + +__all__ = (base_events.__all__ + + coroutines.__all__ + events.__all__ + futures.__all__ + locks.__all__ + diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -35,7 +35,7 @@ from .log import logger -__all__ = ['BaseEventLoop', 'Server'] +__all__ = ['BaseEventLoop'] # Argument for default thread pool executor creation. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 01:05:48 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 00:05:48 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150106000541.72577.94329@psf.io> https://hg.python.org/cpython/rev/5c5f1fe559d2 changeset: 94032:5c5f1fe559d2 parent: 94030:a12da014d394 parent: 94031:ddf6b78faed9 user: Victor Stinner date: Tue Jan 06 01:04:38 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/__init__.py | 4 +++- Lib/asyncio/base_events.py | 2 +- 2 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/__init__.py b/Lib/asyncio/__init__.py --- a/Lib/asyncio/__init__.py +++ b/Lib/asyncio/__init__.py @@ -18,6 +18,7 @@ import _overlapped # Will also be exported. # This relies on each of the submodules having an __all__ variable. 
+from .base_events import * from .coroutines import * from .events import * from .futures import * @@ -29,7 +30,8 @@ from .tasks import * from .transports import * -__all__ = (coroutines.__all__ + +__all__ = (base_events.__all__ + + coroutines.__all__ + events.__all__ + futures.__all__ + locks.__all__ + diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -35,7 +35,7 @@ from .log import logger -__all__ = ['BaseEventLoop', 'Server'] +__all__ = ['BaseEventLoop'] # Argument for default thread pool executor creation. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 01:15:03 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 00:15:03 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMTQw?= =?utf-8?q?=2C_asyncio=3A_Fix_cancellation_of_Process=2Ewait=28=29=2E_Chec?= =?utf-8?q?k_the_state_of?= Message-ID: <20150106001452.11577.72439@psf.io> https://hg.python.org/cpython/rev/7c9b9d2514bb changeset: 94033:7c9b9d2514bb branch: 3.4 parent: 94031:ddf6b78faed9 user: Victor Stinner date: Tue Jan 06 01:13:49 2015 +0100 summary: Issue #23140, asyncio: Fix cancellation of Process.wait(). Check the state of the waiter future before setting its result. files: Lib/asyncio/subprocess.py | 3 +- Lib/test/test_asyncio/test_subprocess.py | 28 ++++++++++++ 2 files changed, 30 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/subprocess.py b/Lib/asyncio/subprocess.py --- a/Lib/asyncio/subprocess.py +++ b/Lib/asyncio/subprocess.py @@ -96,7 +96,8 @@ returncode = self._transport.get_returncode() while self._waiters: waiter = self._waiters.popleft() - waiter.set_result(returncode) + if not waiter.cancelled(): + waiter.set_result(returncode) class Process: diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -223,6 +223,34 @@ self.assertEqual(output.rstrip(), b'3') self.assertEqual(exitcode, 0) + def test_cancel_process_wait(self): + # Issue #23140: cancel Process.wait() + + @asyncio.coroutine + def wait_proc(proc, event): + event.set() + yield from proc.wait() + + @asyncio.coroutine + def cancel_wait(): + proc = yield from asyncio.create_subprocess_exec( + *PROGRAM_BLOCKED, + loop=self.loop) + + # Create an internal future waiting on the process exit + event = asyncio.Event(loop=self.loop) + task = self.loop.create_task(wait_proc(proc, event)) + yield from event.wait() + + # Cancel the future + task.cancel() + + # Kill the process and wait until it is done + proc.kill() + yield from proc.wait() + + self.loop.run_until_complete(cancel_wait()) + if sys.platform != 'win32': # Unix -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 01:15:03 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 00:15:03 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150106001452.11581.13067@psf.io> https://hg.python.org/cpython/rev/454e04786fcc changeset: 94034:454e04786fcc parent: 94032:5c5f1fe559d2 parent: 94033:7c9b9d2514bb user: Victor Stinner date: Tue Jan 06 01:14:09 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/subprocess.py | 3 +- Lib/test/test_asyncio/test_subprocess.py | 28 ++++++++++++ 2 files changed, 30 insertions(+), 1 
deletions(-) diff --git a/Lib/asyncio/subprocess.py b/Lib/asyncio/subprocess.py --- a/Lib/asyncio/subprocess.py +++ b/Lib/asyncio/subprocess.py @@ -96,7 +96,8 @@ returncode = self._transport.get_returncode() while self._waiters: waiter = self._waiters.popleft() - waiter.set_result(returncode) + if not waiter.cancelled(): + waiter.set_result(returncode) class Process: diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -223,6 +223,34 @@ self.assertEqual(output.rstrip(), b'3') self.assertEqual(exitcode, 0) + def test_cancel_process_wait(self): + # Issue #23140: cancel Process.wait() + + @asyncio.coroutine + def wait_proc(proc, event): + event.set() + yield from proc.wait() + + @asyncio.coroutine + def cancel_wait(): + proc = yield from asyncio.create_subprocess_exec( + *PROGRAM_BLOCKED, + loop=self.loop) + + # Create an internal future waiting on the process exit + event = asyncio.Event(loop=self.loop) + task = self.loop.create_task(wait_proc(proc, event)) + yield from event.wait() + + # Cancel the future + task.cancel() + + # Kill the process and wait until it is done + proc.kill() + yield from proc.wait() + + self.loop.run_until_complete(cancel_wait()) + if sys.platform != 'win32': # Unix -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 01:23:55 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 00:23:55 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150106002348.72555.80224@psf.io> https://hg.python.org/cpython/rev/bfd35f76fd0b changeset: 94036:bfd35f76fd0b parent: 94034:454e04786fcc parent: 94035:990ce80d8283 user: Victor Stinner date: Tue Jan 06 01:22:54 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/test/test_asyncio/test_subprocess.py | 14 +++++------ 1 files changed, 6 insertions(+), 8 deletions(-) diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -227,20 +227,18 @@ # Issue #23140: cancel Process.wait() @asyncio.coroutine - def wait_proc(proc, event): - event.set() - yield from proc.wait() - - @asyncio.coroutine def cancel_wait(): proc = yield from asyncio.create_subprocess_exec( *PROGRAM_BLOCKED, loop=self.loop) # Create an internal future waiting on the process exit - event = asyncio.Event(loop=self.loop) - task = self.loop.create_task(wait_proc(proc, event)) - yield from event.wait() + task = self.loop.create_task(proc.wait()) + self.loop.call_soon(task.cancel) + try: + yield from task + except asyncio.CancelledError: + pass # Cancel the future task.cancel() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 01:23:56 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 00:23:56 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMTQw?= =?utf-8?q?=2C_asyncio=3A_Simplify_the_unit_test?= Message-ID: <20150106002348.22417.89847@psf.io> https://hg.python.org/cpython/rev/990ce80d8283 changeset: 94035:990ce80d8283 branch: 3.4 parent: 94033:7c9b9d2514bb user: Victor Stinner date: Tue Jan 06 01:22:45 2015 +0100 summary: Issue #23140, asyncio: Simplify the unit test files: Lib/test/test_asyncio/test_subprocess.py | 14 +++++------ 1 files changed, 6 
insertions(+), 8 deletions(-) diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -227,20 +227,18 @@ # Issue #23140: cancel Process.wait() @asyncio.coroutine - def wait_proc(proc, event): - event.set() - yield from proc.wait() - - @asyncio.coroutine def cancel_wait(): proc = yield from asyncio.create_subprocess_exec( *PROGRAM_BLOCKED, loop=self.loop) # Create an internal future waiting on the process exit - event = asyncio.Event(loop=self.loop) - task = self.loop.create_task(wait_proc(proc, event)) - yield from event.wait() + task = self.loop.create_task(proc.wait()) + self.loop.call_soon(task.cancel) + try: + yield from task + except asyncio.CancelledError: + pass # Cancel the future task.cancel() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 07:00:17 2015 From: python-checkins at python.org (raymond.hettinger) Date: Tue, 06 Jan 2015 06:00:17 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <20150106060015.11581.96544@psf.io> https://hg.python.org/cpython/rev/141f6a3c4153 changeset: 94038:141f6a3c4153 parent: 94036:bfd35f76fd0b parent: 94037:09b0da38ce8d user: Raymond Hettinger date: Mon Jan 05 22:00:08 2015 -0800 summary: merge files: Lib/functools.py | 153 ++++++++++++++++++---------------- 1 files changed, 79 insertions(+), 74 deletions(-) diff --git a/Lib/functools.py b/Lib/functools.py --- a/Lib/functools.py +++ b/Lib/functools.py @@ -89,110 +89,115 @@ ### total_ordering class decorator ################################################################################ -# The correct way to indicate that a comparison operation doesn't -# recognise the other type is to return NotImplemented and let the -# interpreter handle raising TypeError if both operands return -# NotImplemented from their respective comparison methods -# -# This makes the implementation of total_ordering more complicated, since -# we need to be careful not to trigger infinite recursion when two -# different types that both use this decorator encounter each other. -# -# For example, if a type implements __lt__, it's natural to define -# __gt__ as something like: -# -# lambda self, other: not self < other and not self == other -# -# However, using the operator syntax like that ends up invoking the full -# type checking machinery again and means we can end up bouncing back and -# forth between the two operands until we run out of stack space. -# -# The solution is to define helper functions that invoke the appropriate -# magic methods directly, ensuring we only try each operand once, and -# return NotImplemented immediately if it is returned from the -# underlying user provided method. Using this scheme, the __gt__ derived -# from a user provided __lt__ becomes: -# -# 'def __gt__(self, other):' + _not_op_and_not_eq % '__lt__' +# The total ordering functions all invoke the root magic method directly +# rather than using the corresponding operator. This avoids possible +# infinite recursion that could occur when the operator dispatch logic +# detects a NotImplemented result and then calls a reflected method. -# "not a < b" handles "a >= b" -# "not a <= b" handles "a > b" -# "not a >= b" handles "a < b" -# "not a > b" handles "a <= b" -_not_op = ''' - op_result = self.%s(other) +def _gt_from_lt(self, other): + 'Return a > b. 
Computed by @total_ordering from (not a < b) and (a != b).' + op_result = self.__lt__(other) + if op_result is NotImplemented: + return NotImplemented + return not op_result and self != other + +def _le_from_lt(self, other): + 'Return a <= b. Computed by @total_ordering from (a < b) or (a == b).' + op_result = self.__lt__(other) + return op_result or self == other + +def _ge_from_lt(self, other): + 'Return a >= b. Computed by @total_ordering from (not a < b).' + op_result = self.__lt__(other) if op_result is NotImplemented: return NotImplemented return not op_result -''' -# "a > b or a == b" handles "a >= b" -# "a < b or a == b" handles "a <= b" -_op_or_eq = ''' - op_result = self.%s(other) +def _ge_from_le(self, other): + 'Return a >= b. Computed by @total_ordering from (not a <= b) or (a == b).' + op_result = self.__le__(other) if op_result is NotImplemented: return NotImplemented - return op_result or self == other -''' + return not op_result or self == other -# "not (a < b or a == b)" handles "a > b" -# "not a < b and a != b" is equivalent -# "not (a > b or a == b)" handles "a < b" -# "not a > b and a != b" is equivalent -_not_op_and_not_eq = ''' - op_result = self.%s(other) +def _lt_from_le(self, other): + 'Return a < b. Computed by @total_ordering from (a <= b) and (a != b).' + op_result = self.__le__(other) + if op_result is NotImplemented: + return NotImplemented + return op_result and self != other + +def _gt_from_le(self, other): + 'Return a > b. Computed by @total_ordering from (not a <= b).' + op_result = self.__le__(other) + if op_result is NotImplemented: + return NotImplemented + return not op_result + +def _lt_from_gt(self, other): + 'Return a < b. Computed by @total_ordering from (not a > b) and (a != b).' + op_result = self.__gt__(other) if op_result is NotImplemented: return NotImplemented return not op_result and self != other -''' -# "not a <= b or a == b" handles "a >= b" -# "not a >= b or a == b" handles "a <= b" -_not_op_or_eq = ''' - op_result = self.%s(other) +def _ge_from_gt(self, other): + 'Return a >= b. Computed by @total_ordering from (a > b) or (a == b).' + op_result = self.__gt__(other) + return op_result or self == other + +def _le_from_gt(self, other): + 'Return a <= b. Computed by @total_ordering from (not a > b).' + op_result = self.__gt__(other) + if op_result is NotImplemented: + return NotImplemented + return not op_result + +def _le_from_ge(self, other): + 'Return a <= b. Computed by @total_ordering from (not a >= b) or (a == b).' + op_result = self.__ge__(other) if op_result is NotImplemented: return NotImplemented return not op_result or self == other -''' -# "a <= b and not a == b" handles "a < b" -# "a >= b and not a == b" handles "a > b" -_op_and_not_eq = ''' - op_result = self.%s(other) +def _gt_from_ge(self, other): + 'Return a > b. Computed by @total_ordering from (a >= b) and (a != b).' + op_result = self.__ge__(other) if op_result is NotImplemented: return NotImplemented return op_result and self != other -''' + +def _lt_from_ge(self, other): + 'Return a < b. Computed by @total_ordering from (not a >= b).' 
+ op_result = self.__ge__(other) + if op_result is NotImplemented: + return NotImplemented + return not op_result def total_ordering(cls): """Class decorator that fills in missing ordering methods""" convert = { - '__lt__': {'__gt__': _not_op_and_not_eq, - '__le__': _op_or_eq, - '__ge__': _not_op}, - '__le__': {'__ge__': _not_op_or_eq, - '__lt__': _op_and_not_eq, - '__gt__': _not_op}, - '__gt__': {'__lt__': _not_op_and_not_eq, - '__ge__': _op_or_eq, - '__le__': _not_op}, - '__ge__': {'__le__': _not_op_or_eq, - '__gt__': _op_and_not_eq, - '__lt__': _not_op} + '__lt__': [('__gt__', _gt_from_lt), + ('__le__', _le_from_lt), + ('__ge__', _ge_from_lt)], + '__le__': [('__ge__', _ge_from_le), + ('__lt__', _lt_from_le), + ('__gt__', _gt_from_le)], + '__gt__': [('__lt__', _lt_from_gt), + ('__ge__', _ge_from_gt), + ('__le__', _le_from_gt)], + '__ge__': [('__le__', _le_from_ge), + ('__gt__', _gt_from_ge), + ('__lt__', _lt_from_ge)] } # Find user-defined comparisons (not those inherited from object). roots = [op for op in convert if getattr(cls, op, None) is not getattr(object, op, None)] if not roots: raise ValueError('must define at least one ordering operation: < > <= >=') root = max(roots) # prefer __lt__ to __le__ to __gt__ to __ge__ - for opname, opfunc in convert[root].items(): + for opname, opfunc in convert[root]: if opname not in roots: - namespace = {} - exec('def %s(self, other):%s' % (opname, opfunc % root), namespace) - opfunc = namespace[opname] - opfunc.__qualname__ = '%s.%s' % (cls.__qualname__, opname) - opfunc.__module__ = cls.__module__ - opfunc.__doc__ = getattr(int, opname).__doc__ + opfunc.__name__ = opname setattr(cls, opname, opfunc) return cls -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 07:00:17 2015 From: python-checkins at python.org (raymond.hettinger) Date: Tue, 06 Jan 2015 06:00:17 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMTMy?= =?utf-8?q?=3A_Mitigate_regression_in_speed_and_clarity_in?= Message-ID: <20150106060015.125896.92510@psf.io> https://hg.python.org/cpython/rev/09b0da38ce8d changeset: 94037:09b0da38ce8d branch: 3.4 parent: 94035:990ce80d8283 user: Raymond Hettinger date: Mon Jan 05 21:52:10 2015 -0800 summary: Issue #23132: Mitigate regression in speed and clarity in functools.total_ordering. files: Lib/functools.py | 140 +++++++++++++++++++--------------- Misc/NEWS | 2 + 2 files changed, 79 insertions(+), 63 deletions(-) diff --git a/Lib/functools.py b/Lib/functools.py --- a/Lib/functools.py +++ b/Lib/functools.py @@ -89,91 +89,106 @@ ### total_ordering class decorator ################################################################################ -# The correct way to indicate that a comparison operation doesn't -# recognise the other type is to return NotImplemented and let the -# interpreter handle raising TypeError if both operands return -# NotImplemented from their respective comparison methods -# -# This makes the implementation of total_ordering more complicated, since -# we need to be careful not to trigger infinite recursion when two -# different types that both use this decorator encounter each other. 
-# -# For example, if a type implements __lt__, it's natural to define -# __gt__ as something like: -# -# lambda self, other: not self < other and not self == other -# -# However, using the operator syntax like that ends up invoking the full -# type checking machinery again and means we can end up bouncing back and -# forth between the two operands until we run out of stack space. -# -# The solution is to define helper functions that invoke the appropriate -# magic methods directly, ensuring we only try each operand once, and -# return NotImplemented immediately if it is returned from the -# underlying user provided method. Using this scheme, the __gt__ derived -# from a user provided __lt__ becomes: -# -# lambda self, other: _not_op_and_not_eq(self.__lt__, self, other)) +# The total ordering functions all invoke the root magic method directly +# rather than using the corresponding operator. This avoids possible +# infinite recursion that could occur when the operator dispatch logic +# detects a NotImplemented result and then calls a reflected method. -def _not_op(op, other): - # "not a < b" handles "a >= b" - # "not a <= b" handles "a > b" - # "not a >= b" handles "a < b" - # "not a > b" handles "a <= b" - op_result = op(other) +def _gt_from_lt(self, other): + 'Return a > b. Computed by @total_ordering from (not a < b) and (a != b).' + op_result = self.__lt__(other) + if op_result is NotImplemented: + return NotImplemented + return not op_result and self != other + +def _le_from_lt(self, other): + 'Return a <= b. Computed by @total_ordering from (a < b) or (a == b).' + op_result = self.__lt__(other) + return op_result or self == other + +def _ge_from_lt(self, other): + 'Return a >= b. Computed by @total_ordering from (not a < b).' + op_result = self.__lt__(other) if op_result is NotImplemented: return NotImplemented return not op_result -def _op_or_eq(op, self, other): - # "a < b or a == b" handles "a <= b" - # "a > b or a == b" handles "a >= b" - op_result = op(other) +def _ge_from_le(self, other): + 'Return a >= b. Computed by @total_ordering from (not a <= b) or (a == b).' + op_result = self.__le__(other) if op_result is NotImplemented: return NotImplemented - return op_result or self == other + return not op_result or self == other -def _not_op_and_not_eq(op, self, other): - # "not (a < b or a == b)" handles "a > b" - # "not a < b and a != b" is equivalent - # "not (a > b or a == b)" handles "a < b" - # "not a > b and a != b" is equivalent - op_result = op(other) +def _lt_from_le(self, other): + 'Return a < b. Computed by @total_ordering from (a <= b) and (a != b).' + op_result = self.__le__(other) + if op_result is NotImplemented: + return NotImplemented + return op_result and self != other + +def _gt_from_le(self, other): + 'Return a > b. Computed by @total_ordering from (not a <= b).' + op_result = self.__le__(other) + if op_result is NotImplemented: + return NotImplemented + return not op_result + +def _lt_from_gt(self, other): + 'Return a < b. Computed by @total_ordering from (not a > b) and (a != b).' + op_result = self.__gt__(other) if op_result is NotImplemented: return NotImplemented return not op_result and self != other -def _not_op_or_eq(op, self, other): - # "not a <= b or a == b" handles "a >= b" - # "not a >= b or a == b" handles "a <= b" - op_result = op(other) +def _ge_from_gt(self, other): + 'Return a >= b. Computed by @total_ordering from (a > b) or (a == b).' 
+ op_result = self.__gt__(other) + return op_result or self == other + +def _le_from_gt(self, other): + 'Return a <= b. Computed by @total_ordering from (not a > b).' + op_result = self.__gt__(other) + if op_result is NotImplemented: + return NotImplemented + return not op_result + +def _le_from_ge(self, other): + 'Return a <= b. Computed by @total_ordering from (not a >= b) or (a == b).' + op_result = self.__ge__(other) if op_result is NotImplemented: return NotImplemented return not op_result or self == other -def _op_and_not_eq(op, self, other): - # "a <= b and not a == b" handles "a < b" - # "a >= b and not a == b" handles "a > b" - op_result = op(other) +def _gt_from_ge(self, other): + 'Return a > b. Computed by @total_ordering from (a >= b) and (a != b).' + op_result = self.__ge__(other) if op_result is NotImplemented: return NotImplemented return op_result and self != other +def _lt_from_ge(self, other): + 'Return a < b. Computed by @total_ordering from (not a >= b).' + op_result = self.__ge__(other) + if op_result is NotImplemented: + return NotImplemented + return not op_result + def total_ordering(cls): """Class decorator that fills in missing ordering methods""" convert = { - '__lt__': [('__gt__', lambda self, other: _not_op_and_not_eq(self.__lt__, self, other)), - ('__le__', lambda self, other: _op_or_eq(self.__lt__, self, other)), - ('__ge__', lambda self, other: _not_op(self.__lt__, other))], - '__le__': [('__ge__', lambda self, other: _not_op_or_eq(self.__le__, self, other)), - ('__lt__', lambda self, other: _op_and_not_eq(self.__le__, self, other)), - ('__gt__', lambda self, other: _not_op(self.__le__, other))], - '__gt__': [('__lt__', lambda self, other: _not_op_and_not_eq(self.__gt__, self, other)), - ('__ge__', lambda self, other: _op_or_eq(self.__gt__, self, other)), - ('__le__', lambda self, other: _not_op(self.__gt__, other))], - '__ge__': [('__le__', lambda self, other: _not_op_or_eq(self.__ge__, self, other)), - ('__gt__', lambda self, other: _op_and_not_eq(self.__ge__, self, other)), - ('__lt__', lambda self, other: _not_op(self.__ge__, other))] + '__lt__': [('__gt__', _gt_from_lt), + ('__le__', _le_from_lt), + ('__ge__', _ge_from_lt)], + '__le__': [('__ge__', _ge_from_le), + ('__lt__', _lt_from_le), + ('__gt__', _gt_from_le)], + '__gt__': [('__lt__', _lt_from_gt), + ('__ge__', _ge_from_gt), + ('__le__', _le_from_gt)], + '__ge__': [('__le__', _le_from_ge), + ('__gt__', _gt_from_ge), + ('__lt__', _lt_from_ge)] } # Find user-defined comparisons (not those inherited from object). roots = [op for op in convert if getattr(cls, op, None) is not getattr(object, op, None)] @@ -183,7 +198,6 @@ for opname, opfunc in convert[root]: if opname not in roots: opfunc.__name__ = opname - opfunc.__doc__ = getattr(int, opname).__doc__ setattr(cls, opname, opfunc) return cls diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -47,6 +47,8 @@ - Issue #23111: In the ftplib, make ssl.PROTOCOL_SSLv23 the default protocol version. +- Issue #23132: Mitigate regression in speed and clarity in functools.total_ordering. + - Issue #22585: On OpenBSD 5.6 and newer, os.urandom() now calls getentropy(), instead of reading /dev/urandom, to get pseudo-random bytes. 
-- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 07:47:36 2015 From: python-checkins at python.org (zach.ware) Date: Tue, 06 Jan 2015 06:47:36 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_Cosmetic_fixes?= =?utf-8?q?_to_the_=27Develop_with_asyncio=27_page?= Message-ID: <20150106064612.125900.81426@psf.io> https://hg.python.org/cpython/rev/37801e3b82e4 changeset: 94039:37801e3b82e4 branch: 3.4 parent: 94037:09b0da38ce8d user: Zachary Ware date: Tue Jan 06 00:40:43 2015 -0600 summary: Cosmetic fixes to the 'Develop with asyncio' page files: Doc/library/asyncio-dev.rst | 18 ++++++++++-------- 1 files changed, 10 insertions(+), 8 deletions(-) diff --git a/Doc/library/asyncio-dev.rst b/Doc/library/asyncio-dev.rst --- a/Doc/library/asyncio-dev.rst +++ b/Doc/library/asyncio-dev.rst @@ -71,8 +71,8 @@ .. seealso:: - See the :ref:`Synchronization primitives ` section to - synchronize tasks. + The :ref:`Synchronization primitives ` section describes ways + to synchronize tasks. .. _asyncio-handle-blocking: @@ -112,8 +112,8 @@ ---------------------------------------- When a coroutine function is called and its result is not passed to -:func:`async` or to the :meth:`BaseEventLoop.create_task` method: the execution -of the coroutine objet will never be scheduled and it is probably a bug. +:func:`async` or to the :meth:`BaseEventLoop.create_task` method, the execution +of the coroutine object will never be scheduled which is probably a bug. :ref:`Enable the debug mode of asyncio ` to :ref:`log a warning ` to detect it. @@ -147,7 +147,7 @@ Python usually calls :func:`sys.displayhook` on unhandled exceptions. If :meth:`Future.set_exception` is called, but the exception is never consumed, -:func:`sys.displayhook` is not called. Instead, a :ref:`a log is emitted +:func:`sys.displayhook` is not called. Instead, :ref:`a log is emitted ` when the future is deleted by the garbage collector, with the traceback where the exception was raised. @@ -195,7 +195,7 @@ raise Exception("not consumed") Exception: not consumed -There are different options to fix this issue. The first option is to chain to +There are different options to fix this issue. The first option is to chain the coroutine in another coroutine and use classic try/except:: @asyncio.coroutine @@ -218,10 +218,12 @@ except Exception: print("exception consumed") -See also the :meth:`Future.exception` method. +.. seealso:: + The :meth:`Future.exception` method. -Chain correctly coroutines + +Chain coroutines correctly -------------------------- When a coroutine function calls other coroutine functions and tasks, they -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 07:47:36 2015 From: python-checkins at python.org (zach.ware) Date: Tue, 06 Jan 2015 06:47:36 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_with_3=2E4?= Message-ID: <20150106064612.8761.1508@psf.io> https://hg.python.org/cpython/rev/8120043810af changeset: 94040:8120043810af parent: 94038:141f6a3c4153 parent: 94039:37801e3b82e4 user: Zachary Ware date: Tue Jan 06 00:45:52 2015 -0600 summary: Merge with 3.4 files: Doc/library/asyncio-dev.rst | 18 ++++++++++-------- 1 files changed, 10 insertions(+), 8 deletions(-) diff --git a/Doc/library/asyncio-dev.rst b/Doc/library/asyncio-dev.rst --- a/Doc/library/asyncio-dev.rst +++ b/Doc/library/asyncio-dev.rst @@ -71,8 +71,8 @@ .. 
seealso:: - See the :ref:`Synchronization primitives ` section to - synchronize tasks. + The :ref:`Synchronization primitives ` section describes ways + to synchronize tasks. .. _asyncio-handle-blocking: @@ -112,8 +112,8 @@ ---------------------------------------- When a coroutine function is called and its result is not passed to -:func:`async` or to the :meth:`BaseEventLoop.create_task` method: the execution -of the coroutine objet will never be scheduled and it is probably a bug. +:func:`async` or to the :meth:`BaseEventLoop.create_task` method, the execution +of the coroutine object will never be scheduled which is probably a bug. :ref:`Enable the debug mode of asyncio ` to :ref:`log a warning ` to detect it. @@ -147,7 +147,7 @@ Python usually calls :func:`sys.displayhook` on unhandled exceptions. If :meth:`Future.set_exception` is called, but the exception is never consumed, -:func:`sys.displayhook` is not called. Instead, a :ref:`a log is emitted +:func:`sys.displayhook` is not called. Instead, :ref:`a log is emitted ` when the future is deleted by the garbage collector, with the traceback where the exception was raised. @@ -195,7 +195,7 @@ raise Exception("not consumed") Exception: not consumed -There are different options to fix this issue. The first option is to chain to +There are different options to fix this issue. The first option is to chain the coroutine in another coroutine and use classic try/except:: @asyncio.coroutine @@ -218,10 +218,12 @@ except Exception: print("exception consumed") -See also the :meth:`Future.exception` method. +.. seealso:: + The :meth:`Future.exception` method. -Chain correctly coroutines + +Chain coroutines correctly -------------------------- When a coroutine function calls other coroutine functions and tasks, they -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Tue Jan 6 09:30:19 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 06 Jan 2015 09:30:19 +0100 Subject: [Python-checkins] Daily reference leaks (bfd35f76fd0b): sum=6 Message-ID: results for bfd35f76fd0b on branch "default" -------------------------------------------- test_asyncio leaked [0, 3, 0] memory blocks, sum=3 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogjtElM7', '-x'] From python-checkins at python.org Tue Jan 6 11:53:05 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 10:53:05 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_test=5Fssl=3A_add_more_deb?= =?utf-8?q?ug_to_investigate_test=5Fopenssl=5Fversion=28=29_failure_on?= Message-ID: <20150106105203.22395.51727@psf.io> https://hg.python.org/cpython/rev/87976d72fd5c changeset: 94041:87976d72fd5c user: Victor Stinner date: Tue Jan 06 11:51:06 2015 +0100 summary: test_ssl: add more debug to investigate test_openssl_version() failure on OpenBSD with LibreSSL. 
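The values compared in that test are all exposed by the ssl module, which makes it easy to check locally what a given build reports; the exact strings depend on whether the interpreter was linked against OpenSSL or LibreSSL.

    import ssl

    print(ssl.OPENSSL_VERSION)              # e.g. "OpenSSL 1.0.1k ..." or a "LibreSSL ..." string
    print(ssl.OPENSSL_VERSION_INFO)         # tuple of (major, minor, fix, patch, status)
    print(hex(ssl.OPENSSL_VERSION_NUMBER))  # the hex value added to the failure message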
files: Lib/test/test_ssl.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -312,10 +312,10 @@ # Version string as returned by {Open,Libre}SSL, the format might change if "LibreSSL" in s: self.assertTrue(s.startswith("LibreSSL {:d}.{:d}".format(major, minor)), - (s, t)) + (s, t, hex(n))) else: self.assertTrue(s.startswith("OpenSSL {:d}.{:d}.{:d}".format(major, minor, fix)), - (s, t)) + (s, t, hex(n))) @support.cpython_only def test_refcycle(self): -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 11:56:27 2015 From: python-checkins at python.org (vinay.sajip) Date: Tue, 06 Jan 2015 10:56:27 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Closes_=2323151=3A_Removed?= =?utf-8?q?_unnecessary_initialization=2E?= Message-ID: <20150106105620.11579.91285@psf.io> https://hg.python.org/cpython/rev/8bfe230db0bc changeset: 94042:8bfe230db0bc user: Vinay Sajip date: Tue Jan 06 10:56:09 2015 +0000 summary: Closes #23151: Removed unnecessary initialization. files: Lib/logging/__init__.py | 2 -- 1 files changed, 0 insertions(+), 2 deletions(-) diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py --- a/Lib/logging/__init__.py +++ b/Lib/logging/__init__.py @@ -1086,7 +1086,6 @@ # # Determine which class to use when instantiating loggers. # -_loggerClass = None def setLoggerClass(klass): """ @@ -1105,7 +1104,6 @@ """ Return the class to be used when instantiating a logger. """ - return _loggerClass class Manager(object): -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 12:11:03 2015 From: python-checkins at python.org (vinay.sajip) Date: Tue, 06 Jan 2015 11:11:03 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Clarified_docu?= =?utf-8?q?mentation_for_logging_exception_function/method=2E?= Message-ID: <20150106111100.22413.55383@psf.io> https://hg.python.org/cpython/rev/86374d71d4d2 changeset: 94043:86374d71d4d2 branch: 2.7 parent: 94028:518f40815684 user: Vinay Sajip date: Tue Jan 06 11:10:12 2015 +0000 summary: Clarified documentation for logging exception function/method. files: Doc/library/logging.rst | 10 ++++++---- 1 files changed, 6 insertions(+), 4 deletions(-) diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -227,8 +227,9 @@ .. method:: Logger.exception(msg, *args, **kwargs) Logs a message with level :const:`ERROR` on this logger. The arguments are - interpreted as for :meth:`debug`. Exception info is added to the logging - message. This method should only be called from an exception handler. + interpreted as for :meth:`debug`, except that any passed *exc_info* is not + inspected. Exception info is always added to the logging message. This method + should only be called from an exception handler. .. method:: Logger.addFilter(filt) @@ -845,8 +846,9 @@ .. function:: exception(msg[, *args[, **kwargs]]) Logs a message with level :const:`ERROR` on the root logger. The arguments are - interpreted as for :func:`debug`. Exception info is added to the logging - message. This function should only be called from an exception handler. + interpreted as for :func:`debug`, except that any passed *exc_info* is not + inspected. Exception info is always added to the logging message. This + function should only be called from an exception handler. .. 
function:: log(level, msg[, *args[, **kwargs]]) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 12:19:57 2015 From: python-checkins at python.org (vinay.sajip) Date: Tue, 06 Jan 2015 11:19:57 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Closes_=2321980=3A_Added_a?= =?utf-8?b?IF9fcmVwcl9fIGZvciBMb2dSZWNvcmQu?= Message-ID: <20150106111953.8739.47223@psf.io> https://hg.python.org/cpython/rev/390ffd39631b changeset: 94044:390ffd39631b parent: 94042:8bfe230db0bc user: Vinay Sajip date: Tue Jan 06 11:19:42 2015 +0000 summary: Closes #21980: Added a __repr__ for LogRecord. files: Lib/logging/__init__.py | 6 ++++-- 1 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py --- a/Lib/logging/__init__.py +++ b/Lib/logging/__init__.py @@ -1,4 +1,4 @@ -# Copyright 2001-2014 by Vinay Sajip. All Rights Reserved. +# Copyright 2001-2015 by Vinay Sajip. All Rights Reserved. # # Permission to use, copy, modify, and distribute this software and its # documentation for any purpose and without fee is hereby granted, @@ -18,7 +18,7 @@ Logging package for Python. Based on PEP 282 and comments thereto in comp.lang.python. -Copyright (C) 2001-2014 Vinay Sajip. All Rights Reserved. +Copyright (C) 2001-2015 Vinay Sajip. All Rights Reserved. To use, simply 'import logging' and log away! """ @@ -316,6 +316,8 @@ return ''%(self.name, self.levelno, self.pathname, self.lineno, self.msg) + __repr__ = __str__ + def getMessage(self): """ Return the message for this LogRecord. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 12:23:51 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 11:23:51 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Null_merge_python_3=2E4_=28change_already_applied_to_Pyt?= =?utf-8?q?hon_3=2E5=29?= Message-ID: <20150106112350.11577.49205@psf.io> https://hg.python.org/cpython/rev/9654ef862c2b changeset: 94046:9654ef862c2b parent: 94044:390ffd39631b parent: 94045:a8c4925e2359 user: Victor Stinner date: Tue Jan 06 12:23:15 2015 +0100 summary: Null merge python 3.4 (change already applied to Python 3.5) files: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 12:23:51 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 11:23:51 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIwODk2?= =?utf-8?q?=2C_=2322935=3A_The_ssl=2Eget=5Fserver=5Fcertificate=28=29_func?= =?utf-8?q?tion_now_uses_the?= Message-ID: <20150106112349.72561.51614@psf.io> https://hg.python.org/cpython/rev/a8c4925e2359 changeset: 94045:a8c4925e2359 branch: 3.4 parent: 94039:37801e3b82e4 user: Victor Stinner date: Tue Jan 06 12:21:26 2015 +0100 summary: Issue #20896, #22935: The ssl.get_server_certificate() function now uses the ssl.PROTOCOL_SSLv23 protocol by default, not ssl.PROTOCOL_SSLv3, for maximum compatibility and support platforms where ssl.PROTOCOL_SSLv3 support is disabled. 
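A quick way to exercise the new default described above is the short snippet below; it needs network access, and the host and port are only an example.

    import ssl

    # With PROTOCOL_SSLv23 as the default, the helper negotiates whatever
    # protocol the peer supports instead of insisting on SSLv3.
    pem = ssl.get_server_certificate(('www.python.org', 443))
    print(pem.splitlines()[0])    # -----BEGIN CERTIFICATE-----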
files: Lib/ssl.py | 2 +- Misc/NEWS | 5 +++++ 2 files changed, 6 insertions(+), 1 deletions(-) diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -922,7 +922,7 @@ d = pem_cert_string.strip()[len(PEM_HEADER):-len(PEM_FOOTER)] return base64.decodebytes(d.encode('ASCII', 'strict')) -def get_server_certificate(addr, ssl_version=PROTOCOL_SSLv3, ca_certs=None): +def get_server_certificate(addr, ssl_version=PROTOCOL_SSLv23, ca_certs=None): """Retrieve the certificate from the server at the specified address, and return it as a PEM-encoded string. If 'ca_certs' is specified, validate the server cert against it. diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -44,6 +44,11 @@ Library ------- +- Issue #20896, #22935: The :func:`ssl.get_server_certificate` function now + uses the :data:`~ssl.PROTOCOL_SSLv23` protocol by default, not + :data:`~ssl.PROTOCOL_SSLv3`, for maximum compatibility and support platforms + where :data:`~ssl.PROTOCOL_SSLv3` support is disabled. + - Issue #23111: In the ftplib, make ssl.PROTOCOL_SSLv23 the default protocol version. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 12:40:17 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 11:40:17 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzIzMTY4?= =?utf-8?q?=3A_skip_sys=2Estdin=2Eseek=28=29_test_if_stdin_is_not_a_TTY?= Message-ID: <20150106113957.11575.85700@psf.io> https://hg.python.org/cpython/rev/7f30206d402f changeset: 94047:7f30206d402f branch: 2.7 parent: 94043:86374d71d4d2 user: Victor Stinner date: Tue Jan 06 12:39:45 2015 +0100 summary: Issue #23168: skip sys.stdin.seek() test if stdin is not a TTY files: Lib/test/test_file2k.py | 22 ++++++++++++++-------- 1 files changed, 14 insertions(+), 8 deletions(-) diff --git a/Lib/test/test_file2k.py b/Lib/test/test_file2k.py --- a/Lib/test/test_file2k.py +++ b/Lib/test/test_file2k.py @@ -230,14 +230,20 @@ else: f.close() - def testStdin(self): - # This causes the interpreter to exit on OSF1 v5.1. - if sys.platform != 'osf1V5': - self.assertRaises(IOError, sys.stdin.seek, -1) - else: - print >>sys.__stdout__, ( - ' Skipping sys.stdin.seek(-1), it may crash the interpreter.' - ' Test manually.') + def testStdinSeek(self): + if sys.platform == 'osf1V5': + # This causes the interpreter to exit on OSF1 v5.1. + self.skipTest('Skipping sys.stdin.seek(-1), it may crash ' + 'the interpreter. 
Test manually.') + + if not sys.stdin.isatty(): + # Issue #23168: if stdin is redirected to a file, stdin becomes + # seekable + self.skipTest('stdin must be a TTY in this test') + + self.assertRaises(IOError, sys.stdin.seek, -1) + + def testStdinTruncate(self): self.assertRaises(IOError, sys.stdin.truncate) def testUnicodeOpen(self): -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 14:00:50 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 13:00:50 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Null_merge_3=2E4_=28changes_already_applied_to_Python_3?= =?utf-8?b?LjUp?= Message-ID: <20150106130030.22423.41176@psf.io> https://hg.python.org/cpython/rev/535232275fa2 changeset: 94051:535232275fa2 parent: 94048:05d7550bd2d9 parent: 94050:7f82f50fdad0 user: Victor Stinner date: Tue Jan 06 13:58:57 2015 +0100 summary: Null merge 3.4 (changes already applied to Python 3.5) files: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 14:00:50 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 13:00:50 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2323177=3A_Document?= =?utf-8?q?_that_ssl=2ERAND=5Fegd=28=29_is_not_available_with_LibreSSL?= Message-ID: <20150106130029.72565.36209@psf.io> https://hg.python.org/cpython/rev/05d7550bd2d9 changeset: 94048:05d7550bd2d9 parent: 94046:9654ef862c2b user: Victor Stinner date: Tue Jan 06 13:53:09 2015 +0100 summary: Issue #23177: Document that ssl.RAND_egd() is not available with LibreSSL files: Doc/library/ssl.rst | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -327,6 +327,8 @@ See http://egd.sourceforge.net/ or http://prngd.sourceforge.net/ for sources of entropy-gathering daemons. + Availability: not available with LibreSSL. + .. function:: RAND_add(bytes, entropy) Mixes the given *bytes* into the SSL pseudo-random number generator. The -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 14:00:50 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 13:00:50 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzIxMzU2?= =?utf-8?q?=3A_Make_ssl=2ERAND=5Fegd=28=29_optional_to_support_LibreSSL=2E?= =?utf-8?q?_The?= Message-ID: <20150106130030.8739.13517@psf.io> https://hg.python.org/cpython/rev/eddcb6671a48 changeset: 94049:eddcb6671a48 branch: 2.7 parent: 94047:7f30206d402f user: Victor Stinner date: Tue Jan 06 13:53:37 2015 +0100 summary: Issue #21356: Make ssl.RAND_egd() optional to support LibreSSL. The availability of the function is checked during the compilation. Patch written by Bernard Spil. files: Doc/library/ssl.rst | 2 + Lib/socket.py | 6 +++- Lib/ssl.py | 7 ++++- Lib/test/test_ssl.py | 5 ++- Misc/NEWS | 4 +++ Modules/_ssl.c | 13 +++++++-- configure | 42 ++++++++++++++++++++++++++++++++ configure.ac | 3 ++ pyconfig.h.in | 3 ++ 9 files changed, 78 insertions(+), 7 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -299,6 +299,8 @@ See http://egd.sourceforge.net/ or http://prngd.sourceforge.net/ for sources of entropy-gathering daemons. + Availability: not available with LibreSSL. + .. 
function:: RAND_add(bytes, entropy) Mixes the given *bytes* into the SSL pseudo-random number generator. The diff --git a/Lib/socket.py b/Lib/socket.py --- a/Lib/socket.py +++ b/Lib/socket.py @@ -67,7 +67,6 @@ from _ssl import SSLError as sslerror from _ssl import \ RAND_add, \ - RAND_egd, \ RAND_status, \ SSL_ERROR_ZERO_RETURN, \ SSL_ERROR_WANT_READ, \ @@ -78,6 +77,11 @@ SSL_ERROR_WANT_CONNECT, \ SSL_ERROR_EOF, \ SSL_ERROR_INVALID_ERROR_CODE + try: + from _ssl import RAND_egd + except ImportError: + # LibreSSL does not provide RAND_egd + pass import os, sys, warnings diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -106,7 +106,12 @@ from _ssl import (VERIFY_DEFAULT, VERIFY_CRL_CHECK_LEAF, VERIFY_CRL_CHECK_CHAIN, VERIFY_X509_STRICT) from _ssl import txt2obj as _txt2obj, nid2obj as _nid2obj -from _ssl import RAND_status, RAND_egd, RAND_add +from _ssl import RAND_status, RAND_add +try: + from _ssl import RAND_egd +except ImportError: + # LibreSSL does not provide RAND_egd + pass def _import_symbols(prefix): for n in dir(_ssl): diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -169,8 +169,9 @@ sys.stdout.write("\n RAND_status is %d (%s)\n" % (v, (v and "sufficient randomness") or "insufficient randomness")) - self.assertRaises(TypeError, ssl.RAND_egd, 1) - self.assertRaises(TypeError, ssl.RAND_egd, 'foo', 1) + if hasattr(ssl, 'RAND_egd'): + self.assertRaises(TypeError, ssl.RAND_egd, 1) + self.assertRaises(TypeError, ssl.RAND_egd, 'foo', 1) ssl.RAND_add("this is a random string", 75.0) def test_parse_cert(self): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,6 +15,10 @@ Library ------- +- Issue #21356: Make ssl.RAND_egd() optional to support LibreSSL. The + availability of the function is checked during the compilation. Patch written + by Bernard Spil. + - Backport the context argument to ftplib.FTP_TLS. - Issue #23111: Maximize compatibility in protocol versions of ftplib.FTP_TLS. diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -3301,6 +3301,11 @@ It is necessary to seed the PRNG with RAND_add() on some platforms before\n\ using the ssl() function."); +#endif /* HAVE_OPENSSL_RAND */ + + +#ifdef HAVE_RAND_EGD + static PyObject * PySSL_RAND_egd(PyObject *self, PyObject *arg) { @@ -3327,7 +3332,7 @@ Returns number of bytes read. Raises SSLError if connection to EGD\n\ fails or if it does not provide enough data to seed PRNG."); -#endif /* HAVE_OPENSSL_RAND */ +#endif /* HAVE_RAND_EGD */ PyDoc_STRVAR(PySSL_get_default_verify_paths_doc, @@ -3720,10 +3725,12 @@ #ifdef HAVE_OPENSSL_RAND {"RAND_add", PySSL_RAND_add, METH_VARARGS, PySSL_RAND_add_doc}, + {"RAND_status", (PyCFunction)PySSL_RAND_status, METH_NOARGS, + PySSL_RAND_status_doc}, +#endif +#ifdef HAVE_RAND_EGD {"RAND_egd", PySSL_RAND_egd, METH_VARARGS, PySSL_RAND_egd_doc}, - {"RAND_status", (PyCFunction)PySSL_RAND_status, METH_NOARGS, - PySSL_RAND_status_doc}, #endif {"get_default_verify_paths", (PyCFunction)PySSL_get_default_verify_paths, METH_NOARGS, PySSL_get_default_verify_paths_doc}, diff --git a/configure b/configure --- a/configure +++ b/configure @@ -8551,6 +8551,48 @@ fi # Dynamic linking for HP-UX +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for RAND_egd in -lcrypto" >&5 +$as_echo_n "checking for RAND_egd in -lcrypto... 
" >&6; } +if ${ac_cv_lib_crypto_RAND_egd+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_check_lib_save_LIBS=$LIBS +LIBS="-lcrypto $LIBS" +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. */ +#ifdef __cplusplus +extern "C" +#endif +char RAND_egd (); +int +main () +{ +return RAND_egd (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + ac_cv_lib_crypto_RAND_egd=yes +else + ac_cv_lib_crypto_RAND_egd=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +LIBS=$ac_check_lib_save_LIBS +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_crypto_RAND_egd" >&5 +$as_echo "$ac_cv_lib_crypto_RAND_egd" >&6; } +if test "x$ac_cv_lib_crypto_RAND_egd" = xyes; then : + +$as_echo "#define HAVE_RAND_EGD 1" >>confdefs.h + +fi + # only check for sem_init if thread support is requested if test "$with_threads" = "yes" -o -z "$with_threads"; then diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -2221,6 +2221,9 @@ # checks for libraries AC_CHECK_LIB(dl, dlopen) # Dynamic linking for SunOS/Solaris and SYSV AC_CHECK_LIB(dld, shl_load) # Dynamic linking for HP-UX +AC_CHECK_LIB(crypto, RAND_egd, + AC_DEFINE(HAVE_RAND_EGD, 1, + [Define if the libcrypto has RAND_egd])) # only check for sem_init if thread support is requested if test "$with_threads" = "yes" -o -z "$with_threads"; then diff --git a/pyconfig.h.in b/pyconfig.h.in --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -547,6 +547,9 @@ /* Define to 1 if you have the `putenv' function. */ #undef HAVE_PUTENV +/* Define if the libcrypto has RAND_egd */ +#undef HAVE_RAND_EGD + /* Define to 1 if you have the `readlink' function. */ #undef HAVE_READLINK -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 14:00:50 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 13:00:50 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIxMzU2?= =?utf-8?q?=3A_Make_ssl=2ERAND=5Fegd=28=29_optional_to_support_LibreSSL=2E?= =?utf-8?q?_The?= Message-ID: <20150106130030.72551.63511@psf.io> https://hg.python.org/cpython/rev/7f82f50fdad0 changeset: 94050:7f82f50fdad0 branch: 3.4 parent: 94045:a8c4925e2359 user: Victor Stinner date: Tue Jan 06 13:54:58 2015 +0100 summary: Issue #21356: Make ssl.RAND_egd() optional to support LibreSSL. The availability of the function is checked during the compilation. Patch written by Bernard Spil. 
files: Lib/ssl.py | 7 ++++- Lib/test/test_ssl.py | 5 ++- Misc/NEWS | 4 +++ Modules/_ssl.c | 4 +++ configure | 42 ++++++++++++++++++++++++++++++++ configure.ac | 3 ++ pyconfig.h.in | 3 ++ 7 files changed, 65 insertions(+), 3 deletions(-) diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -106,7 +106,12 @@ from _ssl import (VERIFY_DEFAULT, VERIFY_CRL_CHECK_LEAF, VERIFY_CRL_CHECK_CHAIN, VERIFY_X509_STRICT) from _ssl import txt2obj as _txt2obj, nid2obj as _nid2obj -from _ssl import RAND_status, RAND_egd, RAND_add, RAND_bytes, RAND_pseudo_bytes +from _ssl import RAND_status, RAND_add, RAND_bytes, RAND_pseudo_bytes +try: + from _ssl import RAND_egd +except ImportError: + # LibreSSL does not provide RAND_egd + pass def _import_symbols(prefix): for n in dir(_ssl): diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -154,8 +154,9 @@ self.assertRaises(ValueError, ssl.RAND_bytes, -5) self.assertRaises(ValueError, ssl.RAND_pseudo_bytes, -5) - self.assertRaises(TypeError, ssl.RAND_egd, 1) - self.assertRaises(TypeError, ssl.RAND_egd, 'foo', 1) + if hasattr(ssl, 'RAND_egd'): + self.assertRaises(TypeError, ssl.RAND_egd, 1) + self.assertRaises(TypeError, ssl.RAND_egd, 'foo', 1) ssl.RAND_add("this is a random string", 75.0) @unittest.skipUnless(os.name == 'posix', 'requires posix') diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -44,6 +44,10 @@ Library ------- +- Issue #21356: Make ssl.RAND_egd() optional to support LibreSSL. The + availability of the function is checked during the compilation. Patch written + by Bernard Spil. + - Issue #20896, #22935: The :func:`ssl.get_server_certificate` function now uses the :data:`~ssl.PROTOCOL_SSLv23` protocol by default, not :data:`~ssl.PROTOCOL_SSLv3`, for maximum compatibility and support platforms diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -3335,6 +3335,7 @@ It is necessary to seed the PRNG with RAND_add() on some platforms before\n\ using the ssl() function."); +#ifdef HAVE_RAND_EGD static PyObject * PySSL_RAND_egd(PyObject *self, PyObject *args) { @@ -3362,6 +3363,7 @@ Queries the entropy gather daemon (EGD) on the socket named by 'path'.\n\ Returns number of bytes read. Raises SSLError if connection to EGD\n\ fails or if it does not provide enough data to seed PRNG."); +#endif /* HAVE_RAND_EGD */ #endif /* HAVE_OPENSSL_RAND */ @@ -3757,8 +3759,10 @@ PySSL_RAND_bytes_doc}, {"RAND_pseudo_bytes", PySSL_RAND_pseudo_bytes, METH_VARARGS, PySSL_RAND_pseudo_bytes_doc}, +#ifdef HAVE_RAND_EGD {"RAND_egd", PySSL_RAND_egd, METH_VARARGS, PySSL_RAND_egd_doc}, +#endif {"RAND_status", (PyCFunction)PySSL_RAND_status, METH_NOARGS, PySSL_RAND_status_doc}, #endif diff --git a/configure b/configure --- a/configure +++ b/configure @@ -8913,6 +8913,48 @@ fi # Dynamic linking for HP-UX +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for RAND_egd in -lcrypto" >&5 +$as_echo_n "checking for RAND_egd in -lcrypto... " >&6; } +if ${ac_cv_lib_crypto_RAND_egd+:} false; then : + $as_echo_n "(cached) " >&6 +else + ac_check_lib_save_LIBS=$LIBS +LIBS="-lcrypto $LIBS" +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. 
*/ +#ifdef __cplusplus +extern "C" +#endif +char RAND_egd (); +int +main () +{ +return RAND_egd (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + ac_cv_lib_crypto_RAND_egd=yes +else + ac_cv_lib_crypto_RAND_egd=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +LIBS=$ac_check_lib_save_LIBS +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_crypto_RAND_egd" >&5 +$as_echo "$ac_cv_lib_crypto_RAND_egd" >&6; } +if test "x$ac_cv_lib_crypto_RAND_egd" = xyes; then : + +$as_echo "#define HAVE_RAND_EGD 1" >>confdefs.h + +fi + # only check for sem_init if thread support is requested if test "$with_threads" = "yes" -o -z "$with_threads"; then diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -2238,6 +2238,9 @@ AC_CHECK_LIB(sendfile, sendfile) AC_CHECK_LIB(dl, dlopen) # Dynamic linking for SunOS/Solaris and SYSV AC_CHECK_LIB(dld, shl_load) # Dynamic linking for HP-UX +AC_CHECK_LIB(crypto, RAND_egd, + AC_DEFINE(HAVE_RAND_EGD, 1, + [Define if the libcrypto has RAND_egd])) # only check for sem_init if thread support is requested if test "$with_threads" = "yes" -o -z "$with_threads"; then diff --git a/pyconfig.h.in b/pyconfig.h.in --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -675,6 +675,9 @@ /* Define to 1 if you have the `pwrite' function. */ #undef HAVE_PWRITE +/* Define if the libcrypto has RAND_egd */ +#undef HAVE_RAND_EGD + /* Define to 1 if you have the `readlink' function. */ #undef HAVE_READLINK -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 14:10:24 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 06 Jan 2015 13:10:24 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2323145=3A_regrtest?= =?utf-8?q?_now_shows_errors_and_raises_an_exception_if?= Message-ID: <20150106130922.125906.46392@psf.io> https://hg.python.org/cpython/rev/1bb3df5bb83f changeset: 94052:1bb3df5bb83f user: Victor Stinner date: Tue Jan 06 14:05:03 2015 +0100 summary: Issue #23145: regrtest now shows errors and raises an exception if loader.loadTestsFromModule() logged errors. 
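The same check works in any unittest-based driver, not just regrtest; a rough sketch follows (the imported module name is a stand-in for whatever module's tests are being loaded), with the actual diff right after it.

import sys
import unittest

import my_tests   # stand-in for the module whose tests are loaded

loader = unittest.TestLoader()
suite = loader.loadTestsFromModule(my_tests)

# Since Python 3.5 the loader records loading failures (failed imports,
# broken load_tests hooks, ...) on loader.errors, in addition to the
# synthetic failing tests it injects, so a driver can surface them early.
for error in loader.errors:
    print(error, file=sys.stderr)
if loader.errors:
    raise Exception("errors while loading tests")

unittest.TextTestRunner().run(suite)
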
files: Lib/test/regrtest.py | 4 ++++ 1 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Lib/test/regrtest.py b/Lib/test/regrtest.py --- a/Lib/test/regrtest.py +++ b/Lib/test/regrtest.py @@ -1276,6 +1276,10 @@ def test_runner(): loader = unittest.TestLoader() tests = loader.loadTestsFromModule(the_module) + for error in loader.errors: + print(error, file=sys.stderr) + if loader.errors: + raise Exception("errors while loading tests") support.run_unittest(tests) test_runner() if huntrleaks: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 15:38:52 2015 From: python-checkins at python.org (nick.coghlan) Date: Tue, 06 Jan 2015 14:38:52 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_issue_19548_changes_from_3=2E4?= Message-ID: <20150106143717.125886.66642@psf.io> https://hg.python.org/cpython/rev/4d00d0109147 changeset: 94054:4d00d0109147 parent: 94052:1bb3df5bb83f parent: 94053:0646eee8296a user: Nick Coghlan date: Wed Jan 07 00:37:01 2015 +1000 summary: Merge issue 19548 changes from 3.4 files: Doc/glossary.rst | 5 +- Doc/library/codecs.rst | 667 +++++++++++++------------ Doc/library/functions.rst | 8 +- Doc/library/stdtypes.rst | 4 +- Doc/library/tarfile.rst | 2 +- Lib/codecs.py | 71 +- Lib/test/test_codecs.py | 46 +- Misc/NEWS | 8 + Modules/_codecsmodule.c | 6 +- 9 files changed, 432 insertions(+), 385 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -834,10 +834,13 @@ :meth:`~collections.somenamedtuple._asdict`. Examples of struct sequences include :data:`sys.float_info` and the return value of :func:`os.stat`. + text encoding + A codec which encodes Unicode strings to bytes. + text file A :term:`file object` able to read and write :class:`str` objects. Often, a text file actually accesses a byte-oriented datastream - and handles the text encoding automatically. + and handles the :term:`text encoding` automatically. .. seealso:: A :term:`binary file` reads and write :class:`bytes` objects. diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -18,17 +18,24 @@ pair: stackable; streams This module defines base classes for standard Python codecs (encoders and -decoders) and provides access to the internal Python codec registry which -manages the codec and error handling lookup process. +decoders) and provides access to the internal Python codec registry, which +manages the codec and error handling lookup process. Most standard codecs +are :term:`text encodings `, which encode text to bytes, +but there are also codecs provided that encode text to text, and bytes to +bytes. Custom codecs may encode and decode between arbitrary types, but some +module features are restricted to use specifically with +:term:`text encodings `, or with codecs that encode to +:class:`bytes`. -It defines the following functions: +The module defines the following functions for encoding and decoding with +any codec: .. function:: encode(obj, encoding='utf-8', errors='strict') Encodes *obj* using the codec registered for *encoding*. *Errors* may be given to set the desired error handling scheme. The - default error handler is ``strict`` meaning that encoding errors raise + default error handler is ``'strict'`` meaning that encoding errors raise :exc:`ValueError` (or a more codec specific subclass, such as :exc:`UnicodeEncodeError`). 
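As a concrete illustration of the entry being rewritten here (a sketch, not part of the patch): codecs.encode() and codecs.decode() behave like the corresponding string methods, but accept any registered codec and the same errors values. The documentation diff resumes right below.

import codecs

# Equivalent to 'text'.encode('utf-8') for an ordinary text encoding.
data = codecs.encode('text', 'utf-8')
print(codecs.decode(data, 'utf-8'))

# The errors argument picks a handler: 'strict' (the default) raises,
# while e.g. 'replace' substitutes U+FFFD on decoding.
try:
    codecs.decode(b'\xff', 'utf-8')
except UnicodeDecodeError as exc:
    print('strict handler raised:', exc)
print(repr(codecs.decode(b'\xff', 'utf-8', errors='replace')))
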
Refer to :ref:`codec-base-classes` for more information on codec error handling. @@ -38,93 +45,63 @@ Decodes *obj* using the codec registered for *encoding*. *Errors* may be given to set the desired error handling scheme. The - default error handler is ``strict`` meaning that decoding errors raise + default error handler is ``'strict'`` meaning that decoding errors raise :exc:`ValueError` (or a more codec specific subclass, such as :exc:`UnicodeDecodeError`). Refer to :ref:`codec-base-classes` for more information on codec error handling. -.. function:: register(search_function) - - Register a codec search function. Search functions are expected to take one - argument, the encoding name in all lower case letters, and return a - :class:`CodecInfo` object having the following attributes: - - * ``name`` The name of the encoding; - - * ``encode`` The stateless encoding function; - - * ``decode`` The stateless decoding function; - - * ``incrementalencoder`` An incremental encoder class or factory function; - - * ``incrementaldecoder`` An incremental decoder class or factory function; - - * ``streamwriter`` A stream writer class or factory function; - - * ``streamreader`` A stream reader class or factory function. - - The various functions or classes take the following arguments: - - *encode* and *decode*: These must be functions or methods which have the same - interface as the :meth:`~Codec.encode`/:meth:`~Codec.decode` methods of Codec - instances (see :ref:`Codec Interface `). The functions/methods - are expected to work in a stateless mode. - - *incrementalencoder* and *incrementaldecoder*: These have to be factory - functions providing the following interface: - - ``factory(errors='strict')`` - - The factory functions must return objects providing the interfaces defined by - the base classes :class:`IncrementalEncoder` and :class:`IncrementalDecoder`, - respectively. Incremental codecs can maintain state. - - *streamreader* and *streamwriter*: These have to be factory functions providing - the following interface: - - ``factory(stream, errors='strict')`` - - The factory functions must return objects providing the interfaces defined by - the base classes :class:`StreamReader` and :class:`StreamWriter`, respectively. - Stream codecs can maintain state. - - Possible values for errors are - - * ``'strict'``: raise an exception in case of an encoding error - * ``'replace'``: replace malformed data with a suitable replacement marker, - such as ``'?'`` or ``'\ufffd'`` - * ``'ignore'``: ignore malformed data and continue without further notice - * ``'xmlcharrefreplace'``: replace with the appropriate XML character - reference (for encoding only) - * ``'backslashreplace'``: replace with backslashed escape sequences (for - encoding only) - * ``'namereplace'``: replace with ``\N{...}`` escape sequences (for - encoding only) - * ``'surrogateescape'``: on decoding, replace with code points in the Unicode - Private Use Area ranging from U+DC80 to U+DCFF. These private code - points will then be turned back into the same bytes when the - ``surrogateescape`` error handler is used when encoding the data. - (See :pep:`383` for more.) - - as well as any other error handling name defined via :func:`register_error`. - - In case a search function cannot find a given encoding, it should return - ``None``. - +The full details for each codec can also be looked up directly: .. function:: lookup(encoding) Looks up the codec info in the Python codec registry and returns a - :class:`CodecInfo` object as defined above. 
+ :class:`CodecInfo` object as defined below. Encodings are first looked up in the registry's cache. If not found, the list of registered search functions is scanned. If no :class:`CodecInfo` object is found, a :exc:`LookupError` is raised. Otherwise, the :class:`CodecInfo` object is stored in the cache and returned to the caller. -To simplify access to the various codecs, the module provides these additional -functions which use :func:`lookup` for the codec lookup: +.. class:: CodecInfo(encode, decode, streamreader=None, streamwriter=None, incrementalencoder=None, incrementaldecoder=None, name=None) + Codec details when looking up the codec registry. The constructor + arguments are stored in attributes of the same name: + + + .. attribute:: name + + The name of the encoding. + + + .. attribute:: encode + decode + + The stateless encoding and decoding functions. These must be + functions or methods which have the same interface as + the :meth:`~Codec.encode` and :meth:`~Codec.decode` methods of Codec + instances (see :ref:`Codec Interface `). + The functions or methods are expected to work in a stateless mode. + + + .. attribute:: incrementalencoder + incrementaldecoder + + Incremental encoder and decoder classes or factory functions. + These have to provide the interface defined by the base classes + :class:`IncrementalEncoder` and :class:`IncrementalDecoder`, + respectively. Incremental codecs can maintain state. + + + .. attribute:: streamwriter + streamreader + + Stream writer and reader classes or factory functions. These have to + provide the interface defined by the base classes + :class:`StreamWriter` and :class:`StreamReader`, respectively. + Stream codecs can maintain state. + +To simplify access to the various codec components, the module provides +these additional functions which use :func:`lookup` for the codec lookup: .. function:: getencoder(encoding) @@ -173,97 +150,43 @@ Raises a :exc:`LookupError` in case the encoding cannot be found. +Custom codecs are made available by registering a suitable codec search +function: -.. function:: register_error(name, error_handler) +.. function:: register(search_function) - Register the error handling function *error_handler* under the name *name*. - *error_handler* will be called during encoding and decoding in case of an error, - when *name* is specified as the errors parameter. - - For encoding *error_handler* will be called with a :exc:`UnicodeEncodeError` - instance, which contains information about the location of the error. The - error handler must either raise this or a different exception or return a - tuple with a replacement for the unencodable part of the input and a position - where encoding should continue. The replacement may be either :class:`str` or - :class:`bytes`. If the replacement is bytes, the encoder will simply copy - them into the output buffer. If the replacement is a string, the encoder will - encode the replacement. Encoding continues on original input at the - specified position. Negative position values will be treated as being - relative to the end of the input string. If the resulting position is out of - bound an :exc:`IndexError` will be raised. - - Decoding and translating works similar, except :exc:`UnicodeDecodeError` or - :exc:`UnicodeTranslateError` will be passed to the handler and that the - replacement from the error handler will be put into the output directly. - - -.. function:: lookup_error(name) - - Return the error handler previously registered under the name *name*. 
- - Raises a :exc:`LookupError` in case the handler cannot be found. - - -.. function:: strict_errors(exception) - - Implements the ``strict`` error handling: each encoding or decoding error - raises a :exc:`UnicodeError`. - - -.. function:: replace_errors(exception) - - Implements the ``replace`` error handling: malformed data is replaced with a - suitable replacement character such as ``'?'`` in bytestrings and - ``'\ufffd'`` in Unicode strings. - - -.. function:: ignore_errors(exception) - - Implements the ``ignore`` error handling: malformed data is ignored and - encoding or decoding is continued without further notice. - - -.. function:: xmlcharrefreplace_errors(exception) - - Implements the ``xmlcharrefreplace`` error handling (for encoding only): the - unencodable character is replaced by an appropriate XML character reference. - - -.. function:: backslashreplace_errors(exception) - - Implements the ``backslashreplace`` error handling (for encoding only): the - unencodable character is replaced by a backslashed escape sequence. - -.. function:: namereplace_errors(exception) - - Implements the ``namereplace`` error handling (for encoding only): the - unencodable character is replaced by a ``\N{...}`` escape sequence. - - .. versionadded:: 3.5 - -To simplify working with encoded files or stream, the module also defines these -utility functions: - - -.. function:: open(filename, mode[, encoding[, errors[, buffering]]]) - - Open an encoded file using the given *mode* and return a wrapped version - providing transparent encoding/decoding. The default file mode is ``'r'`` - meaning to open the file in read mode. + Register a codec search function. Search functions are expected to take one + argument, being the encoding name in all lower case letters, and return a + :class:`CodecInfo` object. In case a search function cannot find + a given encoding, it should return ``None``. .. note:: - The wrapped version's methods will accept and return strings only. Bytes - arguments will be rejected. + Search function registration is not currently reversible, + which may cause problems in some cases, such as unit testing or + module reloading. + +While the builtin :func:`open` and the associated :mod:`io` module are the +recommended approach for working with encoded text files, this module +provides additional utility functions and classes that allow the use of a +wider range of codecs when working with binary files: + +.. function:: open(filename, mode='r', encoding=None, errors='strict', buffering=1) + + Open an encoded file using the given *mode* and return an instance of + :class:`StreamReaderWriter`, providing transparent encoding/decoding. + The default file mode is ``'r'``, meaning to open the file in read mode. .. note:: - Files are always opened in binary mode, even if no binary mode was - specified. This is done to avoid data loss due to encodings using 8-bit - values. This means that no automatic conversion of ``b'\n'`` is done - on reading and writing. + Underlying encoded files are always opened in binary mode. + No automatic conversion of ``'\n'`` is done on reading and writing. + The *mode* argument may be any binary mode acceptable to the built-in + :func:`open` function; the ``'b'`` is automatically added. *encoding* specifies the encoding which is to be used for the file. + Any encoding that encodes to and decodes from bytes is allowed, and + the data types supported by the file methods depend on the codec used. *errors* may be given to define the error handling. 
It defaults to ``'strict'`` which causes a :exc:`ValueError` to be raised in case an encoding error occurs. @@ -274,12 +197,15 @@ .. function:: EncodedFile(file, data_encoding, file_encoding=None, errors='strict') - Return a wrapped version of file which provides transparent encoding - translation. + Return a :class:`StreamRecoder` instance, a wrapped version of *file* + which provides transparent transcoding. The original file is closed + when the wrapped version is closed. - Bytes written to the wrapped file are interpreted according to the given - *data_encoding* and then written to the original file as bytes using the - *file_encoding*. + Data written to the wrapped file is decoded according to the given + *data_encoding* and then written to the original file as bytes using + *file_encoding*. Bytes read from the original file are decoded + according to *file_encoding*, and the result is encoded + using *data_encoding*. If *file_encoding* is not given, it defaults to *data_encoding*. @@ -291,14 +217,16 @@ .. function:: iterencode(iterator, encoding, errors='strict', **kwargs) Uses an incremental encoder to iteratively encode the input provided by - *iterator*. This function is a :term:`generator`. *errors* (as well as any + *iterator*. This function is a :term:`generator`. + The *errors* argument (as well as any other keyword argument) is passed through to the incremental encoder. .. function:: iterdecode(iterator, encoding, errors='strict', **kwargs) Uses an incremental decoder to iteratively decode the input provided by - *iterator*. This function is a :term:`generator`. *errors* (as well as any + *iterator*. This function is a :term:`generator`. + The *errors* argument (as well as any other keyword argument) is passed through to the incremental decoder. @@ -317,9 +245,10 @@ BOM_UTF32_BE BOM_UTF32_LE - These constants define various encodings of the Unicode byte order mark (BOM) - used in UTF-16 and UTF-32 data streams to indicate the byte order used in the - stream or file and in UTF-8 as a Unicode signature. :const:`BOM_UTF16` is either + These constants define various byte sequences, + being Unicode byte order marks (BOMs) for several encodings. They are + used in UTF-16 and UTF-32 data streams to indicate the byte order used, + and in UTF-8 as a Unicode signature. :const:`BOM_UTF16` is either :const:`BOM_UTF16_BE` or :const:`BOM_UTF16_LE` depending on the platform's native byte order, :const:`BOM` is an alias for :const:`BOM_UTF16`, :const:`BOM_LE` for :const:`BOM_UTF16_LE` and :const:`BOM_BE` for @@ -334,20 +263,25 @@ ------------------ The :mod:`codecs` module defines a set of base classes which define the -interface and can also be used to easily write your own codecs for use in -Python. +interfaces for working with codec objects, and can also be used as the basis +for custom codec implementations. Each codec has to define four interfaces to make it usable as codec in Python: stateless encoder, stateless decoder, stream reader and stream writer. The stream reader and writers typically reuse the stateless encoder/decoder to -implement the file protocols. +implement the file protocols. Codec authors also need to define how the +codec will handle encoding and decoding errors. -The :class:`Codec` class defines the interface for stateless encoders/decoders. -To simplify and standardize error handling, the :meth:`~Codec.encode` and -:meth:`~Codec.decode` methods may implement different error handling schemes by -providing the *errors* string argument. 
The following string values are defined -and implemented by all standard Python codecs: +.. _error-handlers: + +Error Handlers +^^^^^^^^^^^^^^ + +To simplify and standardize error handling, +codecs may implement different error handling schemes by +accepting the *errors* string argument. The following string values are +defined and implemented by all standard Python codecs: .. tabularcolumns:: |l|L| @@ -355,39 +289,55 @@ | Value | Meaning | +=========================+===============================================+ | ``'strict'`` | Raise :exc:`UnicodeError` (or a subclass); | -| | this is the default. | +| | this is the default. Implemented in | +| | :func:`strict_errors`. | +-------------------------+-----------------------------------------------+ -| ``'ignore'`` | Ignore the character and continue with the | -| | next. | +| ``'ignore'`` | Ignore the malformed data and continue | +| | without further notice. Implemented in | +| | :func:`ignore_errors`. | +-------------------------+-----------------------------------------------+ + +The following error handlers are only applicable to +:term:`text encodings `: + ++-------------------------+-----------------------------------------------+ +| Value | Meaning | ++=========================+===============================================+ | ``'replace'`` | Replace with a suitable replacement | -| | character; Python will use the official | -| | U+FFFD REPLACEMENT CHARACTER for the built-in | -| | Unicode codecs on decoding and '?' on | -| | encoding. | +| | marker; Python will use the official | +| | ``U+FFFD`` REPLACEMENT CHARACTER for the | +| | built-in codecs on decoding, and '?' on | +| | encoding. Implemented in | +| | :func:`replace_errors`. | +-------------------------+-----------------------------------------------+ | ``'xmlcharrefreplace'`` | Replace with the appropriate XML character | -| | reference (only for encoding). | +| | reference (only for encoding). Implemented | +| | in :func:`xmlcharrefreplace_errors`. | +-------------------------+-----------------------------------------------+ | ``'backslashreplace'`` | Replace with backslashed escape sequences | -| | (only for encoding). | +| | (only for encoding). Implemented in | +| | :func:`backslashreplace_errors`. | +-------------------------+-----------------------------------------------+ | ``'namereplace'`` | Replace with ``\N{...}`` escape sequences | | | (only for encoding). | +-------------------------+-----------------------------------------------+ -| ``'surrogateescape'`` | Replace byte with surrogate U+DCxx, as defined| -| | in :pep:`383`. | +| ``'surrogateescape'`` | On decoding, replace byte with individual | +| | surrogate code ranging from ``U+DC80`` to | +| | ``U+DCFF``. This code will then be turned | +| | back into the same byte when the | +| | ``'surrogateescape'`` error handler is used | +| | when encoding the data. (See :pep:`383` for | +| | more.) 
| +-------------------------+-----------------------------------------------+ -In addition, the following error handlers are specific to Unicode encoding -schemes: +In addition, the following error handler is specific to the given codecs: +-------------------+------------------------+-------------------------------------------+ -| Value | Codec | Meaning | +| Value | Codecs | Meaning | +===================+========================+===========================================+ |``'surrogatepass'``| utf-8, utf-16, utf-32, | Allow encoding and decoding of surrogate | -| | utf-16-be, utf-16-le, | codes in all the Unicode encoding schemes.| -| | utf-32-be, utf-32-le | | +| | utf-16-be, utf-16-le, | codes. These codecs normally treat the | +| | utf-32-be, utf-32-le | presence of surrogates as an error. | +-------------------+------------------------+-------------------------------------------+ .. versionadded:: 3.1 @@ -399,26 +349,103 @@ .. versionadded:: 3.5 The ``'namereplace'`` error handler. -The set of allowed values can be extended via :meth:`register_error`. +The set of allowed values can be extended by registering a new named error +handler: + +.. function:: register_error(name, error_handler) + + Register the error handling function *error_handler* under the name *name*. + The *error_handler* argument will be called during encoding and decoding + in case of an error, when *name* is specified as the errors parameter. + + For encoding, *error_handler* will be called with a :exc:`UnicodeEncodeError` + instance, which contains information about the location of the error. The + error handler must either raise this or a different exception, or return a + tuple with a replacement for the unencodable part of the input and a position + where encoding should continue. The replacement may be either :class:`str` or + :class:`bytes`. If the replacement is bytes, the encoder will simply copy + them into the output buffer. If the replacement is a string, the encoder will + encode the replacement. Encoding continues on original input at the + specified position. Negative position values will be treated as being + relative to the end of the input string. If the resulting position is out of + bound an :exc:`IndexError` will be raised. + + Decoding and translating works similarly, except :exc:`UnicodeDecodeError` or + :exc:`UnicodeTranslateError` will be passed to the handler and that the + replacement from the error handler will be put into the output directly. + + +Previously registered error handlers (including the standard error handlers) +can be looked up by name: + +.. function:: lookup_error(name) + + Return the error handler previously registered under the name *name*. + + Raises a :exc:`LookupError` in case the handler cannot be found. + +The following standard error handlers are also made available as module level +functions: + +.. function:: strict_errors(exception) + + Implements the ``'strict'`` error handling: each encoding or + decoding error raises a :exc:`UnicodeError`. + + +.. function:: replace_errors(exception) + + Implements the ``'replace'`` error handling (for :term:`text encodings + ` only): substitutes ``'?'`` for encoding errors + (to be encoded by the codec), and ``'\ufffd'`` (the Unicode replacement + character, ``'?'``) for decoding errors. + + +.. function:: ignore_errors(exception) + + Implements the ``'ignore'`` error handling: malformed data is ignored and + encoding or decoding is continued without further notice. + + +.. 
function:: xmlcharrefreplace_errors(exception) + + Implements the ``'xmlcharrefreplace'`` error handling (for encoding with + :term:`text encodings ` only): the + unencodable character is replaced by an appropriate XML character reference. + + +.. function:: backslashreplace_errors(exception) + + Implements the ``'backslashreplace'`` error handling (for encoding with + :term:`text encodings ` only): the + unencodable character is replaced by a backslashed escape sequence. + +.. function:: namereplace_errors(exception) + + Implements the ``namereplace`` error handling (for encoding only): the + unencodable character is replaced by a ``\N{...}`` escape sequence. + + .. versionadded:: 3.5 .. _codec-objects: -Codec Objects -^^^^^^^^^^^^^ +Stateless Encoding and Decoding +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The :class:`Codec` class defines these methods which also define the function -interfaces of the stateless encoder and decoder: +The base :class:`Codec` class defines these methods which also define the +function interfaces of the stateless encoder and decoder: .. method:: Codec.encode(input[, errors]) Encodes the object *input* and returns a tuple (output object, length consumed). - Encoding converts a string object to a bytes object using a particular + For instance, :term:`text encoding` converts + a string object to a bytes object using a particular character set encoding (e.g., ``cp1252`` or ``iso-8859-1``). - *errors* defines the error handling to apply. It defaults to ``'strict'`` - handling. + The *errors* argument defines the error handling to apply. + It defaults to ``'strict'`` handling. The method may not store state in the :class:`Codec` instance. Use :class:`StreamCodec` for codecs which have to keep state in order to make @@ -431,14 +458,16 @@ .. method:: Codec.decode(input[, errors]) Decodes the object *input* and returns a tuple (output object, length - consumed). Decoding converts a bytes object encoded using a particular + consumed). For instance, for a :term:`text encoding`, decoding converts + a bytes object encoded using a particular character set encoding to a string object. - *input* must be a bytes object or one which provides the read-only character + For text encodings and bytes-to-bytes codecs, + *input* must be a bytes object or one which provides the read-only buffer interface -- for example, buffer objects and memory mapped files. - *errors* defines the error handling to apply. It defaults to ``'strict'`` - handling. + The *errors* argument defines the error handling to apply. + It defaults to ``'strict'`` handling. The method may not store state in the :class:`Codec` instance. Use :class:`StreamCodec` for codecs which have to keep state in order to make @@ -447,6 +476,10 @@ The decoder must be able to handle zero length input and return an empty object of the output object type in this situation. + +Incremental Encoding and Decoding +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + The :class:`IncrementalEncoder` and :class:`IncrementalDecoder` classes provide the basic interface for incremental encoding and decoding. Encoding/decoding the input isn't done with one call to the stateless encoder/decoder function, but @@ -464,14 +497,14 @@ .. _incremental-encoder-objects: IncrementalEncoder Objects -^^^^^^^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~~~~~~~ The :class:`IncrementalEncoder` class is used for encoding an input in multiple steps. It defines the following methods which every incremental encoder must define in order to be compatible with the Python codec registry. -.. 
class:: IncrementalEncoder([errors]) +.. class:: IncrementalEncoder(errors='strict') Constructor for an :class:`IncrementalEncoder` instance. @@ -480,28 +513,14 @@ the Python codec registry. The :class:`IncrementalEncoder` may implement different error handling schemes - by providing the *errors* keyword argument. These parameters are predefined: - - * ``'strict'`` Raise :exc:`ValueError` (or a subclass); this is the default. - - * ``'ignore'`` Ignore the character and continue with the next. - - * ``'replace'`` Replace with a suitable replacement character - - * ``'xmlcharrefreplace'`` Replace with the appropriate XML character reference - - * ``'backslashreplace'`` Replace with backslashed escape sequences. - - * ``'namereplace'`` Replace with ``\N{...}`` escape sequences. + by providing the *errors* keyword argument. See :ref:`error-handlers` for + possible values. The *errors* argument will be assigned to an attribute of the same name. Assigning to this attribute makes it possible to switch between different error handling strategies during the lifetime of the :class:`IncrementalEncoder` object. - The set of allowed values for the *errors* argument can be extended with - :func:`register_error`. - .. method:: encode(object[, final]) @@ -513,7 +532,8 @@ .. method:: reset() Reset the encoder to the initial state. The output is discarded: call - ``.encode('', final=True)`` to reset the encoder and to get the output. + ``.encode(object, final=True)``, passing an empty byte or text string + if necessary, to reset the encoder and to get the output. .. method:: IncrementalEncoder.getstate() @@ -534,14 +554,14 @@ .. _incremental-decoder-objects: IncrementalDecoder Objects -^^^^^^^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~~~~~~~ The :class:`IncrementalDecoder` class is used for decoding an input in multiple steps. It defines the following methods which every incremental decoder must define in order to be compatible with the Python codec registry. -.. class:: IncrementalDecoder([errors]) +.. class:: IncrementalDecoder(errors='strict') Constructor for an :class:`IncrementalDecoder` instance. @@ -550,22 +570,14 @@ the Python codec registry. The :class:`IncrementalDecoder` may implement different error handling schemes - by providing the *errors* keyword argument. These parameters are predefined: - - * ``'strict'`` Raise :exc:`ValueError` (or a subclass); this is the default. - - * ``'ignore'`` Ignore the character and continue with the next. - - * ``'replace'`` Replace with a suitable replacement character. + by providing the *errors* keyword argument. See :ref:`error-handlers` for + possible values. The *errors* argument will be assigned to an attribute of the same name. Assigning to this attribute makes it possible to switch between different error handling strategies during the lifetime of the :class:`IncrementalDecoder` object. - The set of allowed values for the *errors* argument can be extended with - :func:`register_error`. - .. method:: decode(object[, final]) @@ -604,6 +616,10 @@ returned by :meth:`getstate`. +Stream Encoding and Decoding +^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + + The :class:`StreamWriter` and :class:`StreamReader` classes provide generic working interfaces which can be used to implement new encoding submodules very easily. See :mod:`encodings.utf_8` for an example of how this is done. @@ -612,14 +628,14 @@ .. 
_stream-writer-objects: StreamWriter Objects -^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~ The :class:`StreamWriter` class is a subclass of :class:`Codec` and defines the following methods which every stream writer must define in order to be compatible with the Python codec registry. -.. class:: StreamWriter(stream[, errors]) +.. class:: StreamWriter(stream, errors='strict') Constructor for a :class:`StreamWriter` instance. @@ -627,31 +643,17 @@ additional keyword arguments, but only the ones defined here are used by the Python codec registry. - *stream* must be a file-like object open for writing binary data. + The *stream* argument must be a file-like object open for writing + text or binary data, as appropriate for the specific codec. The :class:`StreamWriter` may implement different error handling schemes by - providing the *errors* keyword argument. These parameters are predefined: - - * ``'strict'`` Raise :exc:`ValueError` (or a subclass); this is the default. - - * ``'ignore'`` Ignore the character and continue with the next. - - * ``'replace'`` Replace with a suitable replacement character - - * ``'xmlcharrefreplace'`` Replace with the appropriate XML character reference - - * ``'backslashreplace'`` Replace with backslashed escape sequences. - - * ``'namereplace'`` Replace with ``\N{...}`` escape sequences. + providing the *errors* keyword argument. See :ref:`error-handlers` for + the standard error handlers the underlying stream codec may support. The *errors* argument will be assigned to an attribute of the same name. Assigning to this attribute makes it possible to switch between different error handling strategies during the lifetime of the :class:`StreamWriter` object. - The set of allowed values for the *errors* argument can be extended with - :func:`register_error`. - - .. method:: write(object) Writes the object's contents encoded to the stream. @@ -660,7 +662,8 @@ .. method:: writelines(list) Writes the concatenated list of strings to the stream (possibly by reusing - the :meth:`write` method). + the :meth:`write` method). The standard bytes-to-bytes codecs + do not support this method. .. method:: reset() @@ -679,14 +682,14 @@ .. _stream-reader-objects: StreamReader Objects -^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~ The :class:`StreamReader` class is a subclass of :class:`Codec` and defines the following methods which every stream reader must define in order to be compatible with the Python codec registry. -.. class:: StreamReader(stream[, errors]) +.. class:: StreamReader(stream, errors='strict') Constructor for a :class:`StreamReader` instance. @@ -694,16 +697,12 @@ additional keyword arguments, but only the ones defined here are used by the Python codec registry. - *stream* must be a file-like object open for reading (binary) data. + The *stream* argument must be a file-like object open for reading + text or binary data, as appropriate for the specific codec. The :class:`StreamReader` may implement different error handling schemes by - providing the *errors* keyword argument. These parameters are defined: - - * ``'strict'`` Raise :exc:`ValueError` (or a subclass); this is the default. - - * ``'ignore'`` Ignore the character and continue with the next. - - * ``'replace'`` Replace with a suitable replacement character. + providing the *errors* keyword argument. See :ref:`error-handlers` for + the standard error handlers the underlying stream codec may support. The *errors* argument will be assigned to an attribute of the same name. 
Assigning to this attribute makes it possible to switch between different error @@ -717,17 +716,20 @@ Decodes data from the stream and returns the resulting object. - *chars* indicates the number of characters to read from the - stream. :func:`read` will never return more than *chars* characters, but - it might return less, if there are not enough characters available. + The *chars* argument indicates the number of decoded + code points or bytes to return. The :func:`read` method will + never return more data than requested, but it might return less, + if there is not enough available. - *size* indicates the approximate maximum number of bytes to read from the - stream for decoding purposes. The decoder can modify this setting as + The *size* argument indicates the approximate maximum + number of encoded bytes or code points to read + for decoding. The decoder can modify this setting as appropriate. The default value -1 indicates to read and decode as much as - possible. *size* is intended to prevent having to decode huge files in - one step. + possible. This parameter is intended to + prevent having to decode huge files in one step. - *firstline* indicates that it would be sufficient to only return the first + The *firstline* flag indicates that + it would be sufficient to only return the first line, if there are decoding errors on later lines. The method should use a greedy read strategy meaning that it should read @@ -770,17 +772,13 @@ In addition to the above methods, the :class:`StreamReader` must also inherit all other methods and attributes from the underlying stream. -The next two base classes are included for convenience. They are not needed by -the codec registry, but may provide useful in practice. - - .. _stream-reader-writer: StreamReaderWriter Objects -^^^^^^^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~~~~~~~ -The :class:`StreamReaderWriter` allows wrapping streams which work in both read -and write modes. +The :class:`StreamReaderWriter` is a convenience class that allows wrapping +streams which work in both read and write modes. The design is such that one can use the factory functions returned by the :func:`lookup` function to construct the instance. @@ -801,9 +799,9 @@ .. _stream-recoder-objects: StreamRecoder Objects -^^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~~ -The :class:`StreamRecoder` provide a frontend - backend view of encoding data +The :class:`StreamRecoder` translates data from one encoding to another, which is sometimes useful when dealing with different encoding environments. The design is such that one can use the factory functions returned by the @@ -813,22 +811,20 @@ .. class:: StreamRecoder(stream, encode, decode, Reader, Writer, errors) Creates a :class:`StreamRecoder` instance which implements a two-way conversion: - *encode* and *decode* work on the frontend (the input to :meth:`read` and output - of :meth:`write`) while *Reader* and *Writer* work on the backend (reading and - writing to the stream). + *encode* and *decode* work on the frontend?? the data visible to + code calling :meth:`read` and :meth:`write`, while *Reader* and *Writer* + work on the backend?? the data in *stream*. - You can use these objects to do transparent direct recodings from e.g. Latin-1 + You can use these objects to do transparent transcodings from e.g. Latin-1 to UTF-8 and back. - *stream* must be a file-like object. + The *stream* argument must be a file-like object. - *encode*, *decode* must adhere to the :class:`Codec` interface. 
*Reader*, + The *encode* and *decode* arguments must + adhere to the :class:`Codec` interface. *Reader* and *Writer* must be factory functions or classes providing objects of the :class:`StreamReader` and :class:`StreamWriter` interface respectively. - *encode* and *decode* are needed for the frontend translation, *Reader* and - *Writer* for the backend translation. - Error handling is done in the same way as defined for the stream readers and writers. @@ -843,20 +839,23 @@ Encodings and Unicode --------------------- -Strings are stored internally as sequences of codepoints in range ``0 - 10FFFF`` -(see :pep:`393` for more details about the implementation). -Once a string object is used outside of CPU and memory, CPU endianness -and how these arrays are stored as bytes become an issue. Transforming a -string object into a sequence of bytes is called encoding and recreating the -string object from the sequence of bytes is known as decoding. There are many -different methods for how this transformation can be done (these methods are -also called encodings). The simplest method is to map the codepoints 0-255 to -the bytes ``0x0``-``0xff``. This means that a string object that contains -codepoints above ``U+00FF`` can't be encoded with this method (which is called -``'latin-1'`` or ``'iso-8859-1'``). :func:`str.encode` will raise a -:exc:`UnicodeEncodeError` that looks like this: ``UnicodeEncodeError: 'latin-1' -codec can't encode character '\u1234' in position 3: ordinal not in -range(256)``. +Strings are stored internally as sequences of codepoints in +range ``0x0``-``0x10FFFF``. (See :pep:`393` for +more details about the implementation.) +Once a string object is used outside of CPU and memory, endianness +and how these arrays are stored as bytes become an issue. As with other +codecs, serialising a string into a sequence of bytes is known as *encoding*, +and recreating the string from the sequence of bytes is known as *decoding*. +There are a variety of different text serialisation codecs, which are +collectivity referred to as :term:`text encodings `. + +The simplest text encoding (called ``'latin-1'`` or ``'iso-8859-1'``) maps +the codepoints 0-255 to the bytes ``0x0``-``0xff``, which means that a string +object that contains codepoints above ``U+00FF`` can't be encoded with this +codec. Doing so will raise a :exc:`UnicodeEncodeError` that looks +like the following (although the details of the error message may differ): +``UnicodeEncodeError: 'latin-1' codec can't encode character '\u1234' in +position 3: ordinal not in range(256)``. There's another group of encodings (the so called charmap encodings) that choose a different subset of all Unicode code points and how these codepoints are @@ -1203,7 +1202,8 @@ .. versionchanged:: 3.4 The utf-16\* and utf-32\* encoders no longer allow surrogate code points - (U+D800--U+DFFF) to be encoded. The utf-32\* decoders no longer decode + (``U+D800``--``U+DFFF``) to be encoded. + The utf-32\* decoders no longer decode byte sequences that correspond to surrogate code points. @@ -1231,7 +1231,9 @@ +====================+=========+===========================+ | idna | | Implements :rfc:`3490`, | | | | see also | -| | | :mod:`encodings.idna` | +| | | :mod:`encodings.idna`. | +| | | Only ``errors='strict'`` | +| | | is supported. 
| +--------------------+---------+---------------------------+ | mbcs | dbcs | Windows only: Encode | | | | operand according to the | @@ -1239,31 +1241,44 @@ +--------------------+---------+---------------------------+ | palmos | | Encoding of PalmOS 3.5 | +--------------------+---------+---------------------------+ -| punycode | | Implements :rfc:`3492` | +| punycode | | Implements :rfc:`3492`. | +| | | Stateful codecs are not | +| | | supported. | +--------------------+---------+---------------------------+ -| raw_unicode_escape | | Produce a string that is | -| | | suitable as raw Unicode | -| | | literal in Python source | -| | | code | +| raw_unicode_escape | | Latin-1 encoding with | +| | | ``\uXXXX`` and | +| | | ``\UXXXXXXXX`` for other | +| | | code points. Existing | +| | | backslashes are not | +| | | escaped in any way. | +| | | It is used in the Python | +| | | pickle protocol. | +--------------------+---------+---------------------------+ | undefined | | Raise an exception for | -| | | all conversions. Can be | -| | | used as the system | -| | | encoding if no automatic | -| | | coercion between byte and | -| | | Unicode strings is | -| | | desired. | +| | | all conversions, even | +| | | empty strings. The error | +| | | handler is ignored. | +--------------------+---------+---------------------------+ -| unicode_escape | | Produce a string that is | -| | | suitable as Unicode | -| | | literal in Python source | -| | | code | +| unicode_escape | | Encoding suitable as the | +| | | contents of a Unicode | +| | | literal in ASCII-encoded | +| | | Python source code, | +| | | except that quotes are | +| | | not escaped. Decodes from | +| | | Latin-1 source code. | +| | | Beware that Python source | +| | | code actually uses UTF-8 | +| | | by default. | +--------------------+---------+---------------------------+ | unicode_internal | | Return the internal | | | | representation of the | -| | | operand | +| | | operand. Stateful codecs | +| | | are not supported. | | | | | | | | .. deprecated:: 3.3 | +| | | This representation is | +| | | obsoleted by | +| | | :pep:`393`. | +--------------------+---------+---------------------------+ .. _binary-transforms: @@ -1272,7 +1287,8 @@ ^^^^^^^^^^^^^^^^^ The following codecs provide binary transforms: :term:`bytes-like object` -to :class:`bytes` mappings. +to :class:`bytes` mappings. They are not supported by :meth:`bytes.decode` +(which only produces :class:`str` output). .. tabularcolumns:: |l|L|L|L| @@ -1327,7 +1343,8 @@ ^^^^^^^^^^^^^^^ The following codec provides a text transform: a :class:`str` to :class:`str` -mapping. +mapping. It is not supported by :meth:`str.encode` (which only produces +:class:`bytes` output). .. tabularcolumns:: |l|l|L| diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -940,15 +940,17 @@ *encoding* is the name of the encoding used to decode or encode the file. This should only be used in text mode. The default encoding is platform dependent (whatever :func:`locale.getpreferredencoding` returns), but any - encoding supported by Python can be used. See the :mod:`codecs` module for + :term:`text encoding` supported by Python + can be used. See the :mod:`codecs` module for the list of supported encodings. *errors* is an optional string that specifies how encoding and decoding errors are to be handled--this cannot be used in binary mode. 
- A variety of standard error handlers are available, though any + A variety of standard error handlers are available + (listed under :ref:`error-handlers`), though any error handling name that has been registered with :func:`codecs.register_error` is also valid. The standard names - are: + include: * ``'strict'`` to raise a :exc:`ValueError` exception if there is an encoding error. The default value of ``None`` has the same diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -1512,7 +1512,7 @@ a :exc:`UnicodeError`. Other possible values are ``'ignore'``, ``'replace'``, ``'xmlcharrefreplace'``, ``'backslashreplace'`` and any other name registered via - :func:`codecs.register_error`, see section :ref:`codec-base-classes`. For a + :func:`codecs.register_error`, see section :ref:`error-handlers`. For a list of possible encodings, see section :ref:`standard-encodings`. .. versionchanged:: 3.1 @@ -2384,7 +2384,7 @@ error handling scheme. The default for *errors* is ``'strict'``, meaning that encoding errors raise a :exc:`UnicodeError`. Other possible values are ``'ignore'``, ``'replace'`` and any other name registered via - :func:`codecs.register_error`, see section :ref:`codec-base-classes`. For a + :func:`codecs.register_error`, see section :ref:`error-handlers`. For a list of possible encodings, see section :ref:`standard-encodings`. .. note:: diff --git a/Doc/library/tarfile.rst b/Doc/library/tarfile.rst --- a/Doc/library/tarfile.rst +++ b/Doc/library/tarfile.rst @@ -798,7 +798,7 @@ appropriately, this conversion may fail. The *errors* argument defines how characters are treated that cannot be -converted. Possible values are listed in section :ref:`codec-base-classes`. +converted. Possible values are listed in section :ref:`error-handlers`. The default scheme is ``'surrogateescape'`` which Python also uses for its file system calls, see :ref:`os-filenames`. diff --git a/Lib/codecs.py b/Lib/codecs.py --- a/Lib/codecs.py +++ b/Lib/codecs.py @@ -347,8 +347,7 @@ """ Creates a StreamWriter instance. - stream must be a file-like object open for writing - (binary) data. + stream must be a file-like object open for writing. The StreamWriter may use different error handling schemes by providing the errors keyword argument. These @@ -422,8 +421,7 @@ """ Creates a StreamReader instance. - stream must be a file-like object open for reading - (binary) data. + stream must be a file-like object open for reading. The StreamReader may use different error handling schemes by providing the errors keyword argument. These @@ -451,13 +449,12 @@ """ Decodes data from the stream self.stream and returns the resulting object. - chars indicates the number of characters to read from the - stream. read() will never return more than chars - characters, but it might return less, if there are not enough - characters available. + chars indicates the number of decoded code points or bytes to + return. read() will never return more data than requested, + but it might return less, if there is not enough available. - size indicates the approximate maximum number of bytes to - read from the stream for decoding purposes. The decoder + size indicates the approximate maximum number of decoded + bytes or code points to read for decoding. The decoder can modify this setting as appropriate. The default value -1 indicates to read and decode as much as possible. 
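A rough sketch of the *chars* behaviour described in this docstring, using a UTF-8 stream reader over an in-memory buffer (the sample text is arbitrary)::

    import codecs, io

    raw = io.BytesIO("gr\xfc\xdf".encode("utf-8"))   # 6 encoded bytes, 4 code points
    reader = codecs.getreader("utf-8")(raw)

    print(reader.read(chars=2))   # 'gr' -- never more than requested
    print(reader.read())          # the remaining two code points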
size is intended to prevent having to decode huge files in one @@ -468,7 +465,7 @@ will be returned, the rest of the input will be kept until the next call to read(). - The method should use a greedy read strategy meaning that + The method should use a greedy read strategy, meaning that it should read as much data as is allowed within the definition of the encoding and the given size, e.g. if optional encoding endings or state markers are available @@ -603,7 +600,7 @@ def readlines(self, sizehint=None, keepends=True): """ Read all lines available on the input stream - and return them as list of lines. + and return them as a list. Line breaks are implemented using the codec's decoder method and are included in the list entries. @@ -751,19 +748,18 @@ class StreamRecoder: - """ StreamRecoder instances provide a frontend - backend - view of encoding data. + """ StreamRecoder instances translate data from one encoding to another. They use the complete set of APIs returned by the codecs.lookup() function to implement their task. - Data written to the stream is first decoded into an - intermediate format (which is dependent on the given codec - combination) and then written to the stream using an instance - of the provided Writer class. + Data written to the StreamRecoder is first decoded into an + intermediate format (depending on the "decode" codec) and then + written to the underlying stream using an instance of the provided + Writer class. - In the other direction, data is read from the stream using a - Reader instance and then return encoded data to the caller. + In the other direction, data is read from the underlying stream using + a Reader instance and then encoded and returned to the caller. """ # Optional attributes set by the file wrappers below @@ -775,22 +771,17 @@ """ Creates a StreamRecoder instance which implements a two-way conversion: encode and decode work on the frontend (the - input to .read() and output of .write()) while - Reader and Writer work on the backend (reading and - writing to the stream). + data visible to .read() and .write()) while Reader and Writer + work on the backend (the data in stream). - You can use these objects to do transparent direct - recodings from e.g. latin-1 to utf-8 and back. + You can use these objects to do transparent + transcodings from e.g. latin-1 to utf-8 and back. stream must be a file-like object. - encode, decode must adhere to the Codec interface, Reader, + encode and decode must adhere to the Codec interface; Reader and Writer must be factory functions or classes providing the - StreamReader, StreamWriter interface resp. - - encode and decode are needed for the frontend translation, - Reader and Writer for the backend translation. Unicode is - used as intermediate encoding. + StreamReader and StreamWriter interfaces resp. Error handling is done in the same way as defined for the StreamWriter/Readers. @@ -865,7 +856,7 @@ ### Shortcuts -def open(filename, mode='rb', encoding=None, errors='strict', buffering=1): +def open(filename, mode='r', encoding=None, errors='strict', buffering=1): """ Open an encoded file using the given mode and return a wrapped version providing transparent encoding/decoding. @@ -875,10 +866,8 @@ codecs. Output is also codec dependent and will usually be Unicode as well. - Files are always opened in binary mode, even if no binary mode - was specified. This is done to avoid data loss due to encodings - using 8-bit values. The default file mode is 'rb' meaning to - open the file in binary read mode. 
+ Underlying encoded files are always opened in binary mode. + The default file mode is 'r', meaning to open the file in read mode. encoding specifies the encoding which is to be used for the file. @@ -914,13 +903,13 @@ """ Return a wrapped version of file which provides transparent encoding translation. - Strings written to the wrapped file are interpreted according - to the given data_encoding and then written to the original - file as string using file_encoding. The intermediate encoding + Data written to the wrapped file is decoded according + to the given data_encoding and then encoded to the underlying + file using file_encoding. The intermediate data type will usually be Unicode but depends on the specified codecs. - Strings are read from the file using file_encoding and then - passed back to the caller as string using data_encoding. + Bytes read from the file are decoded using file_encoding and then + passed back to the caller encoded using data_encoding. If file_encoding is not given, it defaults to data_encoding. diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -1140,6 +1140,8 @@ # Python used to crash on this at exit because of a refcount # bug in _codecsmodule.c + self.assertTrue(f.closed) + # From RFC 3492 punycode_testcases = [ # A Arabic (Egyptian): @@ -1592,6 +1594,16 @@ self.assertEqual(encoder.encode("ample.org."), b"xn--xample-9ta.org.") self.assertEqual(encoder.encode("", True), b"") + def test_errors(self): + """Only supports "strict" error handler""" + "python.org".encode("idna", "strict") + b"python.org".decode("idna", "strict") + for errors in ("ignore", "replace", "backslashreplace", + "surrogateescape"): + self.assertRaises(Exception, "python.org".encode, "idna", errors) + self.assertRaises(Exception, + b"python.org".decode, "idna", errors) + class CodecsModuleTest(unittest.TestCase): def test_decode(self): @@ -1682,6 +1694,24 @@ for api in codecs.__all__: getattr(codecs, api) + def test_open(self): + self.addCleanup(support.unlink, support.TESTFN) + for mode in ('w', 'r', 'r+', 'w+', 'a', 'a+'): + with self.subTest(mode), \ + codecs.open(support.TESTFN, mode, 'ascii') as file: + self.assertIsInstance(file, codecs.StreamReaderWriter) + + def test_undefined(self): + self.assertRaises(UnicodeError, codecs.encode, 'abc', 'undefined') + self.assertRaises(UnicodeError, codecs.decode, b'abc', 'undefined') + self.assertRaises(UnicodeError, codecs.encode, '', 'undefined') + self.assertRaises(UnicodeError, codecs.decode, b'', 'undefined') + for errors in ('strict', 'ignore', 'replace', 'backslashreplace'): + self.assertRaises(UnicodeError, + codecs.encode, 'abc', 'undefined', errors) + self.assertRaises(UnicodeError, + codecs.decode, b'abc', 'undefined', errors) + class StreamReaderTest(unittest.TestCase): def setUp(self): @@ -1815,13 +1845,10 @@ # "undefined" # The following encodings don't work in stateful mode -broken_unicode_with_streams = [ +broken_unicode_with_stateful = [ "punycode", "unicode_internal" ] -broken_incremental_coders = broken_unicode_with_streams + [ - "idna", -] class BasicUnicodeTest(unittest.TestCase, MixInCheckStateHandling): def test_basics(self): @@ -1841,7 +1868,7 @@ (chars, size) = codecs.getdecoder(encoding)(b) self.assertEqual(chars, s, "encoding=%r" % encoding) - if encoding not in broken_unicode_with_streams: + if encoding not in broken_unicode_with_stateful: # check stream reader/writer q = Queue(b"") writer = codecs.getwriter(encoding)(q) @@ -1859,7 +1886,7 @@ 
decodedresult += reader.read() self.assertEqual(decodedresult, s, "encoding=%r" % encoding) - if encoding not in broken_incremental_coders: + if encoding not in broken_unicode_with_stateful: # check incremental decoder/encoder and iterencode()/iterdecode() try: encoder = codecs.getincrementalencoder(encoding)() @@ -1908,7 +1935,7 @@ from _testcapi import codec_incrementalencoder, codec_incrementaldecoder s = "abc123" # all codecs should be able to encode these for encoding in all_unicode_encodings: - if encoding not in broken_incremental_coders: + if encoding not in broken_unicode_with_stateful: # check incremental decoder/encoder (fetched via the C API) try: cencoder = codec_incrementalencoder(encoding) @@ -1948,7 +1975,7 @@ for encoding in all_unicode_encodings: if encoding == "idna": # FIXME: See SF bug #1163178 continue - if encoding in broken_unicode_with_streams: + if encoding in broken_unicode_with_stateful: continue reader = codecs.getreader(encoding)(io.BytesIO(s.encode(encoding))) for t in range(5): @@ -1981,7 +2008,7 @@ # Check that getstate() and setstate() handle the state properly u = "abc123" for encoding in all_unicode_encodings: - if encoding not in broken_incremental_coders: + if encoding not in broken_unicode_with_stateful: self.check_state_handling_decode(encoding, u, u.encode(encoding)) self.check_state_handling_encode(encoding, u, u.encode(encoding)) @@ -2185,6 +2212,7 @@ f = io.BytesIO(b"\xc3\xbc") with codecs.EncodedFile(f, "latin-1", "utf-8") as ef: self.assertEqual(ef.read(), b"\xfc") + self.assertTrue(f.closed) def test_streamreaderwriter(self): f = io.BytesIO(b"\xc3\xbc") diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -1441,6 +1441,10 @@ Documentation ------------- +- Issue #19548: Update the codecs module documentation to better cover the + distinction between text encodings and other codecs, together with other + clarifications. Patch by Martin Panter. + - Issue #22394: Doc/Makefile now supports ``make venv PYTHON=../python`` to create a venv for generating the documentation, e.g., ``make html PYTHON=venv/bin/python3``. @@ -1477,6 +1481,10 @@ Tests ----- +- Issue #19548: Added some additional checks to test_codecs to ensure that + statements in the updated documentation remain accurate. Patch by Martin + Panter. + - Issue #22838: All test_re tests now work with unittest test discovery. - Issue #22173: Update lib2to3 tests to use unittest test discovery. diff --git a/Modules/_codecsmodule.c b/Modules/_codecsmodule.c --- a/Modules/_codecsmodule.c +++ b/Modules/_codecsmodule.c @@ -54,9 +54,9 @@ "register(search_function)\n\ \n\ Register a codec search function. 
Search functions are expected to take\n\ -one argument, the encoding name in all lower case letters, and return\n\ -a tuple of functions (encoder, decoder, stream_reader, stream_writer)\n\ -(or a CodecInfo object)."); +one argument, the encoding name in all lower case letters, and either\n\ +return None, or a tuple of functions (encoder, decoder, stream_reader,\n\ +stream_writer) (or a CodecInfo object)."); static PyObject *codec_register(PyObject *self, PyObject *search_function) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 15:38:52 2015 From: python-checkins at python.org (nick.coghlan) Date: Tue, 06 Jan 2015 14:38:52 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgMTk1NDg6?= =?utf-8?q?_update_codecs_module_documentation?= Message-ID: <20150106143716.22405.91104@psf.io> https://hg.python.org/cpython/rev/0646eee8296a changeset: 94053:0646eee8296a branch: 3.4 parent: 94050:7f82f50fdad0 user: Nick Coghlan date: Wed Jan 07 00:22:00 2015 +1000 summary: Issue 19548: update codecs module documentation - clarified the distinction between text encodings and other codecs - clarified relationship with builtin open and the io module - consolidated documentation of error handlers into one section - clarified type constraints of some behaviours - added tests for some of the new statements in the docs files: Doc/glossary.rst | 5 +- Doc/library/codecs.rst | 648 +++++++++++++------------ Doc/library/functions.rst | 8 +- Doc/library/stdtypes.rst | 4 +- Doc/library/tarfile.rst | 2 +- Lib/codecs.py | 71 +- Lib/test/test_codecs.py | 46 +- Misc/NEWS | 8 + Modules/_codecsmodule.c | 6 +- 9 files changed, 426 insertions(+), 372 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -820,10 +820,13 @@ :meth:`~collections.somenamedtuple._asdict`. Examples of struct sequences include :data:`sys.float_info` and the return value of :func:`os.stat`. + text encoding + A codec which encodes Unicode strings to bytes. + text file A :term:`file object` able to read and write :class:`str` objects. Often, a text file actually accesses a byte-oriented datastream - and handles the text encoding automatically. + and handles the :term:`text encoding` automatically. .. seealso:: A :term:`binary file` reads and write :class:`bytes` objects. diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -17,10 +17,17 @@ pair: stackable; streams This module defines base classes for standard Python codecs (encoders and -decoders) and provides access to the internal Python codec registry which -manages the codec and error handling lookup process. +decoders) and provides access to the internal Python codec registry, which +manages the codec and error handling lookup process. Most standard codecs +are :term:`text encodings `, which encode text to bytes, +but there are also codecs provided that encode text to text, and bytes to +bytes. Custom codecs may encode and decode between arbitrary types, but some +module features are restricted to use specifically with +:term:`text encodings `, or with codecs that encode to +:class:`bytes`. -It defines the following functions: +The module defines the following functions for encoding and decoding with +any codec: .. function:: encode(obj, [encoding[, errors]]) @@ -28,7 +35,7 @@ encoding is ``utf-8``. *Errors* may be given to set the desired error handling scheme. 
The - default error handler is ``strict`` meaning that encoding errors raise + default error handler is ``'strict'`` meaning that encoding errors raise :exc:`ValueError` (or a more codec specific subclass, such as :exc:`UnicodeEncodeError`). Refer to :ref:`codec-base-classes` for more information on codec error handling. @@ -39,90 +46,63 @@ encoding is ``utf-8``. *Errors* may be given to set the desired error handling scheme. The - default error handler is ``strict`` meaning that decoding errors raise + default error handler is ``'strict'`` meaning that decoding errors raise :exc:`ValueError` (or a more codec specific subclass, such as :exc:`UnicodeDecodeError`). Refer to :ref:`codec-base-classes` for more information on codec error handling. -.. function:: register(search_function) - - Register a codec search function. Search functions are expected to take one - argument, the encoding name in all lower case letters, and return a - :class:`CodecInfo` object having the following attributes: - - * ``name`` The name of the encoding; - - * ``encode`` The stateless encoding function; - - * ``decode`` The stateless decoding function; - - * ``incrementalencoder`` An incremental encoder class or factory function; - - * ``incrementaldecoder`` An incremental decoder class or factory function; - - * ``streamwriter`` A stream writer class or factory function; - - * ``streamreader`` A stream reader class or factory function. - - The various functions or classes take the following arguments: - - *encode* and *decode*: These must be functions or methods which have the same - interface as the :meth:`~Codec.encode`/:meth:`~Codec.decode` methods of Codec - instances (see :ref:`Codec Interface `). The functions/methods - are expected to work in a stateless mode. - - *incrementalencoder* and *incrementaldecoder*: These have to be factory - functions providing the following interface: - - ``factory(errors='strict')`` - - The factory functions must return objects providing the interfaces defined by - the base classes :class:`IncrementalEncoder` and :class:`IncrementalDecoder`, - respectively. Incremental codecs can maintain state. - - *streamreader* and *streamwriter*: These have to be factory functions providing - the following interface: - - ``factory(stream, errors='strict')`` - - The factory functions must return objects providing the interfaces defined by - the base classes :class:`StreamReader` and :class:`StreamWriter`, respectively. - Stream codecs can maintain state. - - Possible values for errors are - - * ``'strict'``: raise an exception in case of an encoding error - * ``'replace'``: replace malformed data with a suitable replacement marker, - such as ``'?'`` or ``'\ufffd'`` - * ``'ignore'``: ignore malformed data and continue without further notice - * ``'xmlcharrefreplace'``: replace with the appropriate XML character - reference (for encoding only) - * ``'backslashreplace'``: replace with backslashed escape sequences (for - encoding only) - * ``'surrogateescape'``: on decoding, replace with code points in the Unicode - Private Use Area ranging from U+DC80 to U+DCFF. These private code - points will then be turned back into the same bytes when the - ``surrogateescape`` error handler is used when encoding the data. - (See :pep:`383` for more.) - - as well as any other error handling name defined via :func:`register_error`. - - In case a search function cannot find a given encoding, it should return - ``None``. - +The full details for each codec can also be looked up directly: .. 
function:: lookup(encoding) Looks up the codec info in the Python codec registry and returns a - :class:`CodecInfo` object as defined above. + :class:`CodecInfo` object as defined below. Encodings are first looked up in the registry's cache. If not found, the list of registered search functions is scanned. If no :class:`CodecInfo` object is found, a :exc:`LookupError` is raised. Otherwise, the :class:`CodecInfo` object is stored in the cache and returned to the caller. -To simplify access to the various codecs, the module provides these additional -functions which use :func:`lookup` for the codec lookup: +.. class:: CodecInfo(encode, decode, streamreader=None, streamwriter=None, incrementalencoder=None, incrementaldecoder=None, name=None) + + Codec details when looking up the codec registry. The constructor + arguments are stored in attributes of the same name: + + + .. attribute:: name + + The name of the encoding. + + + .. attribute:: encode + decode + + The stateless encoding and decoding functions. These must be + functions or methods which have the same interface as + the :meth:`~Codec.encode` and :meth:`~Codec.decode` methods of Codec + instances (see :ref:`Codec Interface `). + The functions or methods are expected to work in a stateless mode. + + + .. attribute:: incrementalencoder + incrementaldecoder + + Incremental encoder and decoder classes or factory functions. + These have to provide the interface defined by the base classes + :class:`IncrementalEncoder` and :class:`IncrementalDecoder`, + respectively. Incremental codecs can maintain state. + + + .. attribute:: streamwriter + streamreader + + Stream writer and reader classes or factory functions. These have to + provide the interface defined by the base classes + :class:`StreamWriter` and :class:`StreamReader`, respectively. + Stream codecs can maintain state. + +To simplify access to the various codec components, the module provides +these additional functions which use :func:`lookup` for the codec lookup: .. function:: getencoder(encoding) @@ -172,90 +152,43 @@ Raises a :exc:`LookupError` in case the encoding cannot be found. +Custom codecs are made available by registering a suitable codec search +function: -.. function:: register_error(name, error_handler) +.. function:: register(search_function) - Register the error handling function *error_handler* under the name *name*. - *error_handler* will be called during encoding and decoding in case of an error, - when *name* is specified as the errors parameter. - - For encoding *error_handler* will be called with a :exc:`UnicodeEncodeError` - instance, which contains information about the location of the error. The - error handler must either raise this or a different exception or return a - tuple with a replacement for the unencodable part of the input and a position - where encoding should continue. The replacement may be either :class:`str` or - :class:`bytes`. If the replacement is bytes, the encoder will simply copy - them into the output buffer. If the replacement is a string, the encoder will - encode the replacement. Encoding continues on original input at the - specified position. Negative position values will be treated as being - relative to the end of the input string. If the resulting position is out of - bound an :exc:`IndexError` will be raised. 
- - Decoding and translating works similar, except :exc:`UnicodeDecodeError` or - :exc:`UnicodeTranslateError` will be passed to the handler and that the - replacement from the error handler will be put into the output directly. - - -.. function:: lookup_error(name) - - Return the error handler previously registered under the name *name*. - - Raises a :exc:`LookupError` in case the handler cannot be found. - - -.. function:: strict_errors(exception) - - Implements the ``strict`` error handling: each encoding or decoding error - raises a :exc:`UnicodeError`. - - -.. function:: replace_errors(exception) - - Implements the ``replace`` error handling: malformed data is replaced with a - suitable replacement character such as ``'?'`` in bytestrings and - ``'\ufffd'`` in Unicode strings. - - -.. function:: ignore_errors(exception) - - Implements the ``ignore`` error handling: malformed data is ignored and - encoding or decoding is continued without further notice. - - -.. function:: xmlcharrefreplace_errors(exception) - - Implements the ``xmlcharrefreplace`` error handling (for encoding only): the - unencodable character is replaced by an appropriate XML character reference. - - -.. function:: backslashreplace_errors(exception) - - Implements the ``backslashreplace`` error handling (for encoding only): the - unencodable character is replaced by a backslashed escape sequence. - -To simplify working with encoded files or stream, the module also defines these -utility functions: - - -.. function:: open(filename, mode[, encoding[, errors[, buffering]]]) - - Open an encoded file using the given *mode* and return a wrapped version - providing transparent encoding/decoding. The default file mode is ``'r'`` - meaning to open the file in read mode. + Register a codec search function. Search functions are expected to take one + argument, being the encoding name in all lower case letters, and return a + :class:`CodecInfo` object. In case a search function cannot find + a given encoding, it should return ``None``. .. note:: - The wrapped version's methods will accept and return strings only. Bytes - arguments will be rejected. + Search function registration is not currently reversible, + which may cause problems in some cases, such as unit testing or + module reloading. + +While the builtin :func:`open` and the associated :mod:`io` module are the +recommended approach for working with encoded text files, this module +provides additional utility functions and classes that allow the use of a +wider range of codecs when working with binary files: + +.. function:: open(filename, mode='r', encoding=None, errors='strict', buffering=1) + + Open an encoded file using the given *mode* and return an instance of + :class:`StreamReaderWriter`, providing transparent encoding/decoding. + The default file mode is ``'r'``, meaning to open the file in read mode. .. note:: - Files are always opened in binary mode, even if no binary mode was - specified. This is done to avoid data loss due to encodings using 8-bit - values. This means that no automatic conversion of ``b'\n'`` is done - on reading and writing. + Underlying encoded files are always opened in binary mode. + No automatic conversion of ``'\n'`` is done on reading and writing. + The *mode* argument may be any binary mode acceptable to the built-in + :func:`open` function; the ``'b'`` is automatically added. *encoding* specifies the encoding which is to be used for the file. 
+ Any encoding that encodes to and decodes from bytes is allowed, and + the data types supported by the file methods depend on the codec used. *errors* may be given to define the error handling. It defaults to ``'strict'`` which causes a :exc:`ValueError` to be raised in case an encoding error occurs. @@ -266,12 +199,15 @@ .. function:: EncodedFile(file, data_encoding, file_encoding=None, errors='strict') - Return a wrapped version of file which provides transparent encoding - translation. + Return a :class:`StreamRecoder` instance, a wrapped version of *file* + which provides transparent transcoding. The original file is closed + when the wrapped version is closed. - Bytes written to the wrapped file are interpreted according to the given - *data_encoding* and then written to the original file as bytes using the - *file_encoding*. + Data written to the wrapped file is decoded according to the given + *data_encoding* and then written to the original file as bytes using + *file_encoding*. Bytes read from the original file are decoded + according to *file_encoding*, and the result is encoded + using *data_encoding*. If *file_encoding* is not given, it defaults to *data_encoding*. @@ -283,14 +219,16 @@ .. function:: iterencode(iterator, encoding, errors='strict', **kwargs) Uses an incremental encoder to iteratively encode the input provided by - *iterator*. This function is a :term:`generator`. *errors* (as well as any + *iterator*. This function is a :term:`generator`. + The *errors* argument (as well as any other keyword argument) is passed through to the incremental encoder. .. function:: iterdecode(iterator, encoding, errors='strict', **kwargs) Uses an incremental decoder to iteratively decode the input provided by - *iterator*. This function is a :term:`generator`. *errors* (as well as any + *iterator*. This function is a :term:`generator`. + The *errors* argument (as well as any other keyword argument) is passed through to the incremental decoder. @@ -309,9 +247,10 @@ BOM_UTF32_BE BOM_UTF32_LE - These constants define various encodings of the Unicode byte order mark (BOM) - used in UTF-16 and UTF-32 data streams to indicate the byte order used in the - stream or file and in UTF-8 as a Unicode signature. :const:`BOM_UTF16` is either + These constants define various byte sequences, + being Unicode byte order marks (BOMs) for several encodings. They are + used in UTF-16 and UTF-32 data streams to indicate the byte order used, + and in UTF-8 as a Unicode signature. :const:`BOM_UTF16` is either :const:`BOM_UTF16_BE` or :const:`BOM_UTF16_LE` depending on the platform's native byte order, :const:`BOM` is an alias for :const:`BOM_UTF16`, :const:`BOM_LE` for :const:`BOM_UTF16_LE` and :const:`BOM_BE` for @@ -325,20 +264,25 @@ ------------------ The :mod:`codecs` module defines a set of base classes which define the -interface and can also be used to easily write your own codecs for use in -Python. +interfaces for working with codec objects, and can also be used as the basis +for custom codec implementations. Each codec has to define four interfaces to make it usable as codec in Python: stateless encoder, stateless decoder, stream reader and stream writer. The stream reader and writers typically reuse the stateless encoder/decoder to -implement the file protocols. +implement the file protocols. Codec authors also need to define how the +codec will handle encoding and decoding errors. -The :class:`Codec` class defines the interface for stateless encoders/decoders. 
-To simplify and standardize error handling, the :meth:`~Codec.encode` and -:meth:`~Codec.decode` methods may implement different error handling schemes by -providing the *errors* string argument. The following string values are defined -and implemented by all standard Python codecs: +.. _error-handlers: + +Error Handlers +^^^^^^^^^^^^^^ + +To simplify and standardize error handling, +codecs may implement different error handling schemes by +accepting the *errors* string argument. The following string values are +defined and implemented by all standard Python codecs: .. tabularcolumns:: |l|L| @@ -346,36 +290,52 @@ | Value | Meaning | +=========================+===============================================+ | ``'strict'`` | Raise :exc:`UnicodeError` (or a subclass); | -| | this is the default. | +| | this is the default. Implemented in | +| | :func:`strict_errors`. | +-------------------------+-----------------------------------------------+ -| ``'ignore'`` | Ignore the character and continue with the | -| | next. | +| ``'ignore'`` | Ignore the malformed data and continue | +| | without further notice. Implemented in | +| | :func:`ignore_errors`. | +-------------------------+-----------------------------------------------+ + +The following error handlers are only applicable to +:term:`text encodings `: + ++-------------------------+-----------------------------------------------+ +| Value | Meaning | ++=========================+===============================================+ | ``'replace'`` | Replace with a suitable replacement | -| | character; Python will use the official | -| | U+FFFD REPLACEMENT CHARACTER for the built-in | -| | Unicode codecs on decoding and '?' on | -| | encoding. | +| | marker; Python will use the official | +| | ``U+FFFD`` REPLACEMENT CHARACTER for the | +| | built-in codecs on decoding, and '?' on | +| | encoding. Implemented in | +| | :func:`replace_errors`. | +-------------------------+-----------------------------------------------+ | ``'xmlcharrefreplace'`` | Replace with the appropriate XML character | -| | reference (only for encoding). | +| | reference (only for encoding). Implemented | +| | in :func:`xmlcharrefreplace_errors`. | +-------------------------+-----------------------------------------------+ | ``'backslashreplace'`` | Replace with backslashed escape sequences | -| | (only for encoding). | +| | (only for encoding). Implemented in | +| | :func:`backslashreplace_errors`. | +-------------------------+-----------------------------------------------+ -| ``'surrogateescape'`` | Replace byte with surrogate U+DCxx, as defined| -| | in :pep:`383`. | +| ``'surrogateescape'`` | On decoding, replace byte with individual | +| | surrogate code ranging from ``U+DC80`` to | +| | ``U+DCFF``. This code will then be turned | +| | back into the same byte when the | +| | ``'surrogateescape'`` error handler is used | +| | when encoding the data. (See :pep:`383` for | +| | more.) 
| +-------------------------+-----------------------------------------------+ -In addition, the following error handlers are specific to Unicode encoding -schemes: +In addition, the following error handler is specific to the given codecs: +-------------------+------------------------+-------------------------------------------+ -| Value | Codec | Meaning | +| Value | Codecs | Meaning | +===================+========================+===========================================+ |``'surrogatepass'``| utf-8, utf-16, utf-32, | Allow encoding and decoding of surrogate | -| | utf-16-be, utf-16-le, | codes in all the Unicode encoding schemes.| -| | utf-32-be, utf-32-le | | +| | utf-16-be, utf-16-le, | codes. These codecs normally treat the | +| | utf-32-be, utf-32-le | presence of surrogates as an error. | +-------------------+------------------------+-------------------------------------------+ .. versionadded:: 3.1 @@ -384,26 +344,96 @@ .. versionchanged:: 3.4 The ``'surrogatepass'`` error handlers now works with utf-16\* and utf-32\* codecs. -The set of allowed values can be extended via :meth:`register_error`. +The set of allowed values can be extended by registering a new named error +handler: + +.. function:: register_error(name, error_handler) + + Register the error handling function *error_handler* under the name *name*. + The *error_handler* argument will be called during encoding and decoding + in case of an error, when *name* is specified as the errors parameter. + + For encoding, *error_handler* will be called with a :exc:`UnicodeEncodeError` + instance, which contains information about the location of the error. The + error handler must either raise this or a different exception, or return a + tuple with a replacement for the unencodable part of the input and a position + where encoding should continue. The replacement may be either :class:`str` or + :class:`bytes`. If the replacement is bytes, the encoder will simply copy + them into the output buffer. If the replacement is a string, the encoder will + encode the replacement. Encoding continues on original input at the + specified position. Negative position values will be treated as being + relative to the end of the input string. If the resulting position is out of + bound an :exc:`IndexError` will be raised. + + Decoding and translating works similarly, except :exc:`UnicodeDecodeError` or + :exc:`UnicodeTranslateError` will be passed to the handler and that the + replacement from the error handler will be put into the output directly. + + +Previously registered error handlers (including the standard error handlers) +can be looked up by name: + +.. function:: lookup_error(name) + + Return the error handler previously registered under the name *name*. + + Raises a :exc:`LookupError` in case the handler cannot be found. + +The following standard error handlers are also made available as module level +functions: + +.. function:: strict_errors(exception) + + Implements the ``'strict'`` error handling: each encoding or + decoding error raises a :exc:`UnicodeError`. + + +.. function:: replace_errors(exception) + + Implements the ``'replace'`` error handling (for :term:`text encodings + ` only): substitutes ``'?'`` for encoding errors + (to be encoded by the codec), and ``'\ufffd'`` (the Unicode replacement + character, ``'?'``) for decoding errors. + + +.. function:: ignore_errors(exception) + + Implements the ``'ignore'`` error handling: malformed data is ignored and + encoding or decoding is continued without further notice. + + +.. 
function:: xmlcharrefreplace_errors(exception) + + Implements the ``'xmlcharrefreplace'`` error handling (for encoding with + :term:`text encodings ` only): the + unencodable character is replaced by an appropriate XML character reference. + + +.. function:: backslashreplace_errors(exception) + + Implements the ``'backslashreplace'`` error handling (for encoding with + :term:`text encodings ` only): the + unencodable character is replaced by a backslashed escape sequence. .. _codec-objects: -Codec Objects -^^^^^^^^^^^^^ +Stateless Encoding and Decoding +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The :class:`Codec` class defines these methods which also define the function -interfaces of the stateless encoder and decoder: +The base :class:`Codec` class defines these methods which also define the +function interfaces of the stateless encoder and decoder: .. method:: Codec.encode(input[, errors]) Encodes the object *input* and returns a tuple (output object, length consumed). - Encoding converts a string object to a bytes object using a particular + For instance, :term:`text encoding` converts + a string object to a bytes object using a particular character set encoding (e.g., ``cp1252`` or ``iso-8859-1``). - *errors* defines the error handling to apply. It defaults to ``'strict'`` - handling. + The *errors* argument defines the error handling to apply. + It defaults to ``'strict'`` handling. The method may not store state in the :class:`Codec` instance. Use :class:`StreamCodec` for codecs which have to keep state in order to make @@ -416,14 +446,16 @@ .. method:: Codec.decode(input[, errors]) Decodes the object *input* and returns a tuple (output object, length - consumed). Decoding converts a bytes object encoded using a particular + consumed). For instance, for a :term:`text encoding`, decoding converts + a bytes object encoded using a particular character set encoding to a string object. - *input* must be a bytes object or one which provides the read-only character + For text encodings and bytes-to-bytes codecs, + *input* must be a bytes object or one which provides the read-only buffer interface -- for example, buffer objects and memory mapped files. - *errors* defines the error handling to apply. It defaults to ``'strict'`` - handling. + The *errors* argument defines the error handling to apply. + It defaults to ``'strict'`` handling. The method may not store state in the :class:`Codec` instance. Use :class:`StreamCodec` for codecs which have to keep state in order to make @@ -432,6 +464,10 @@ The decoder must be able to handle zero length input and return an empty object of the output object type in this situation. + +Incremental Encoding and Decoding +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + The :class:`IncrementalEncoder` and :class:`IncrementalDecoder` classes provide the basic interface for incremental encoding and decoding. Encoding/decoding the input isn't done with one call to the stateless encoder/decoder function, but @@ -449,14 +485,14 @@ .. _incremental-encoder-objects: IncrementalEncoder Objects -^^^^^^^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~~~~~~~ The :class:`IncrementalEncoder` class is used for encoding an input in multiple steps. It defines the following methods which every incremental encoder must define in order to be compatible with the Python codec registry. -.. class:: IncrementalEncoder([errors]) +.. class:: IncrementalEncoder(errors='strict') Constructor for an :class:`IncrementalEncoder` instance. @@ -465,26 +501,14 @@ the Python codec registry. 
The :class:`IncrementalEncoder` may implement different error handling schemes - by providing the *errors* keyword argument. These parameters are predefined: - - * ``'strict'`` Raise :exc:`ValueError` (or a subclass); this is the default. - - * ``'ignore'`` Ignore the character and continue with the next. - - * ``'replace'`` Replace with a suitable replacement character - - * ``'xmlcharrefreplace'`` Replace with the appropriate XML character reference - - * ``'backslashreplace'`` Replace with backslashed escape sequences. + by providing the *errors* keyword argument. See :ref:`error-handlers` for + possible values. The *errors* argument will be assigned to an attribute of the same name. Assigning to this attribute makes it possible to switch between different error handling strategies during the lifetime of the :class:`IncrementalEncoder` object. - The set of allowed values for the *errors* argument can be extended with - :func:`register_error`. - .. method:: encode(object[, final]) @@ -496,7 +520,8 @@ .. method:: reset() Reset the encoder to the initial state. The output is discarded: call - ``.encode('', final=True)`` to reset the encoder and to get the output. + ``.encode(object, final=True)``, passing an empty byte or text string + if necessary, to reset the encoder and to get the output. .. method:: IncrementalEncoder.getstate() @@ -517,14 +542,14 @@ .. _incremental-decoder-objects: IncrementalDecoder Objects -^^^^^^^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~~~~~~~ The :class:`IncrementalDecoder` class is used for decoding an input in multiple steps. It defines the following methods which every incremental decoder must define in order to be compatible with the Python codec registry. -.. class:: IncrementalDecoder([errors]) +.. class:: IncrementalDecoder(errors='strict') Constructor for an :class:`IncrementalDecoder` instance. @@ -533,22 +558,14 @@ the Python codec registry. The :class:`IncrementalDecoder` may implement different error handling schemes - by providing the *errors* keyword argument. These parameters are predefined: - - * ``'strict'`` Raise :exc:`ValueError` (or a subclass); this is the default. - - * ``'ignore'`` Ignore the character and continue with the next. - - * ``'replace'`` Replace with a suitable replacement character. + by providing the *errors* keyword argument. See :ref:`error-handlers` for + possible values. The *errors* argument will be assigned to an attribute of the same name. Assigning to this attribute makes it possible to switch between different error handling strategies during the lifetime of the :class:`IncrementalDecoder` object. - The set of allowed values for the *errors* argument can be extended with - :func:`register_error`. - .. method:: decode(object[, final]) @@ -587,6 +604,10 @@ returned by :meth:`getstate`. +Stream Encoding and Decoding +^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + + The :class:`StreamWriter` and :class:`StreamReader` classes provide generic working interfaces which can be used to implement new encoding submodules very easily. See :mod:`encodings.utf_8` for an example of how this is done. @@ -595,14 +616,14 @@ .. _stream-writer-objects: StreamWriter Objects -^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~ The :class:`StreamWriter` class is a subclass of :class:`Codec` and defines the following methods which every stream writer must define in order to be compatible with the Python codec registry. -.. class:: StreamWriter(stream[, errors]) +.. class:: StreamWriter(stream, errors='strict') Constructor for a :class:`StreamWriter` instance. 
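A small sketch of the incremental decoding workflow described above, feeding a UTF-8 byte stream to an incremental decoder in chunks whose boundaries are chosen arbitrarily for illustration::

    import codecs

    decoder = codecs.getincrementaldecoder("utf-8")("strict")
    chunks = [b"gr\xc3", b"\xbc\xc3\x9f"]     # multi-byte character split across chunks

    text = "".join(decoder.decode(chunk) for chunk in chunks)
    text += decoder.decode(b"", final=True)   # flush; raises if the input was incomplete
    print(text)                               # 'gr\xfc\xdf'

The incomplete trailing byte of the first chunk is buffered by the decoder and only produces output once the rest of the character arrives.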
@@ -610,29 +631,17 @@ additional keyword arguments, but only the ones defined here are used by the Python codec registry. - *stream* must be a file-like object open for writing binary data. + The *stream* argument must be a file-like object open for writing + text or binary data, as appropriate for the specific codec. The :class:`StreamWriter` may implement different error handling schemes by - providing the *errors* keyword argument. These parameters are predefined: - - * ``'strict'`` Raise :exc:`ValueError` (or a subclass); this is the default. - - * ``'ignore'`` Ignore the character and continue with the next. - - * ``'replace'`` Replace with a suitable replacement character - - * ``'xmlcharrefreplace'`` Replace with the appropriate XML character reference - - * ``'backslashreplace'`` Replace with backslashed escape sequences. + providing the *errors* keyword argument. See :ref:`error-handlers` for + the standard error handlers the underlying stream codec may support. The *errors* argument will be assigned to an attribute of the same name. Assigning to this attribute makes it possible to switch between different error handling strategies during the lifetime of the :class:`StreamWriter` object. - The set of allowed values for the *errors* argument can be extended with - :func:`register_error`. - - .. method:: write(object) Writes the object's contents encoded to the stream. @@ -641,7 +650,8 @@ .. method:: writelines(list) Writes the concatenated list of strings to the stream (possibly by reusing - the :meth:`write` method). + the :meth:`write` method). The standard bytes-to-bytes codecs + do not support this method. .. method:: reset() @@ -660,14 +670,14 @@ .. _stream-reader-objects: StreamReader Objects -^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~ The :class:`StreamReader` class is a subclass of :class:`Codec` and defines the following methods which every stream reader must define in order to be compatible with the Python codec registry. -.. class:: StreamReader(stream[, errors]) +.. class:: StreamReader(stream, errors='strict') Constructor for a :class:`StreamReader` instance. @@ -675,16 +685,12 @@ additional keyword arguments, but only the ones defined here are used by the Python codec registry. - *stream* must be a file-like object open for reading (binary) data. + The *stream* argument must be a file-like object open for reading + text or binary data, as appropriate for the specific codec. The :class:`StreamReader` may implement different error handling schemes by - providing the *errors* keyword argument. These parameters are defined: - - * ``'strict'`` Raise :exc:`ValueError` (or a subclass); this is the default. - - * ``'ignore'`` Ignore the character and continue with the next. - - * ``'replace'`` Replace with a suitable replacement character. + providing the *errors* keyword argument. See :ref:`error-handlers` for + the standard error handlers the underlying stream codec may support. The *errors* argument will be assigned to an attribute of the same name. Assigning to this attribute makes it possible to switch between different error @@ -698,17 +704,20 @@ Decodes data from the stream and returns the resulting object. - *chars* indicates the number of characters to read from the - stream. :func:`read` will never return more than *chars* characters, but - it might return less, if there are not enough characters available. + The *chars* argument indicates the number of decoded + code points or bytes to return. 
The :func:`read` method will + never return more data than requested, but it might return less, + if there is not enough available. - *size* indicates the approximate maximum number of bytes to read from the - stream for decoding purposes. The decoder can modify this setting as + The *size* argument indicates the approximate maximum + number of encoded bytes or code points to read + for decoding. The decoder can modify this setting as appropriate. The default value -1 indicates to read and decode as much as - possible. *size* is intended to prevent having to decode huge files in - one step. + possible. This parameter is intended to + prevent having to decode huge files in one step. - *firstline* indicates that it would be sufficient to only return the first + The *firstline* flag indicates that + it would be sufficient to only return the first line, if there are decoding errors on later lines. The method should use a greedy read strategy meaning that it should read @@ -751,17 +760,13 @@ In addition to the above methods, the :class:`StreamReader` must also inherit all other methods and attributes from the underlying stream. -The next two base classes are included for convenience. They are not needed by -the codec registry, but may provide useful in practice. - - .. _stream-reader-writer: StreamReaderWriter Objects -^^^^^^^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~~~~~~~ -The :class:`StreamReaderWriter` allows wrapping streams which work in both read -and write modes. +The :class:`StreamReaderWriter` is a convenience class that allows wrapping +streams which work in both read and write modes. The design is such that one can use the factory functions returned by the :func:`lookup` function to construct the instance. @@ -782,9 +787,9 @@ .. _stream-recoder-objects: StreamRecoder Objects -^^^^^^^^^^^^^^^^^^^^^ +~~~~~~~~~~~~~~~~~~~~~ -The :class:`StreamRecoder` provide a frontend - backend view of encoding data +The :class:`StreamRecoder` translates data from one encoding to another, which is sometimes useful when dealing with different encoding environments. The design is such that one can use the factory functions returned by the @@ -794,22 +799,20 @@ .. class:: StreamRecoder(stream, encode, decode, Reader, Writer, errors) Creates a :class:`StreamRecoder` instance which implements a two-way conversion: - *encode* and *decode* work on the frontend (the input to :meth:`read` and output - of :meth:`write`) while *Reader* and *Writer* work on the backend (reading and - writing to the stream). + *encode* and *decode* work on the frontend (the data visible to + code calling :meth:`read` and :meth:`write`), while *Reader* and *Writer* + work on the backend (the data in *stream*). - You can use these objects to do transparent direct recodings from e.g. Latin-1 + You can use these objects to do transparent transcodings from e.g. Latin-1 to UTF-8 and back. - *stream* must be a file-like object. + The *stream* argument must be a file-like object. - *encode*, *decode* must adhere to the :class:`Codec` interface. *Reader*, + The *encode* and *decode* arguments must + adhere to the :class:`Codec` interface. *Reader* and *Writer* must be factory functions or classes providing objects of the :class:`StreamReader` and :class:`StreamWriter` interface respectively. - *encode* and *decode* are needed for the frontend translation, *Reader* and - *Writer* for the backend translation. - Error handling is done in the same way as defined for the stream readers and writers.
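A minimal sketch of the transcoding behaviour described above, using :func:`codecs.EncodedFile` (which returns a :class:`StreamRecoder`); the sample data is arbitrary::

    import codecs, io

    backing = io.BytesIO(b"gr\xfc\xdf")            # Latin-1 encoded bytes
    wrapper = codecs.EncodedFile(backing, data_encoding="utf-8",
                                 file_encoding="latin-1")

    print(wrapper.read())     # b'gr\xc3\xbc\xc3\x9f' -- the same text, re-encoded as UTF-8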
@@ -824,20 +827,23 @@ Encodings and Unicode --------------------- -Strings are stored internally as sequences of codepoints in range ``0 - 10FFFF`` -(see :pep:`393` for more details about the implementation). -Once a string object is used outside of CPU and memory, CPU endianness -and how these arrays are stored as bytes become an issue. Transforming a -string object into a sequence of bytes is called encoding and recreating the -string object from the sequence of bytes is known as decoding. There are many -different methods for how this transformation can be done (these methods are -also called encodings). The simplest method is to map the codepoints 0-255 to -the bytes ``0x0``-``0xff``. This means that a string object that contains -codepoints above ``U+00FF`` can't be encoded with this method (which is called -``'latin-1'`` or ``'iso-8859-1'``). :func:`str.encode` will raise a -:exc:`UnicodeEncodeError` that looks like this: ``UnicodeEncodeError: 'latin-1' -codec can't encode character '\u1234' in position 3: ordinal not in -range(256)``. +Strings are stored internally as sequences of codepoints in +range ``0x0``-``0x10FFFF``. (See :pep:`393` for +more details about the implementation.) +Once a string object is used outside of CPU and memory, endianness +and how these arrays are stored as bytes become an issue. As with other +codecs, serialising a string into a sequence of bytes is known as *encoding*, +and recreating the string from the sequence of bytes is known as *decoding*. +There are a variety of different text serialisation codecs, which are +collectively referred to as :term:`text encodings <text encoding>`. + +The simplest text encoding (called ``'latin-1'`` or ``'iso-8859-1'``) maps +the codepoints 0-255 to the bytes ``0x0``-``0xff``, which means that a string +object that contains codepoints above ``U+00FF`` can't be encoded with this +codec. Doing so will raise a :exc:`UnicodeEncodeError` that looks +like the following (although the details of the error message may differ): +``UnicodeEncodeError: 'latin-1' codec can't encode character '\u1234' in +position 3: ordinal not in range(256)``. There's another group of encodings (the so called charmap encodings) that choose a different subset of all Unicode code points and how these codepoints are @@ -1184,7 +1190,8 @@ .. versionchanged:: 3.4 The utf-16\* and utf-32\* encoders no longer allow surrogate code points - (U+D800--U+DFFF) to be encoded. The utf-32\* decoders no longer decode + (``U+D800``--``U+DFFF``) to be encoded. + The utf-32\* decoders no longer decode byte sequences that correspond to surrogate code points. @@ -1212,7 +1219,9 @@ +====================+=========+===========================+ | idna | | Implements :rfc:`3490`, | | | | see also | -| | | :mod:`encodings.idna` | +| | | :mod:`encodings.idna`. | +| | | Only ``errors='strict'`` | +| | | is supported. | +--------------------+---------+---------------------------+ | mbcs | dbcs | Windows only: Encode | | | | operand according to the | @@ -1220,31 +1229,44 @@ +--------------------+---------+---------------------------+ | palmos | | Encoding of PalmOS 3.5 | +--------------------+---------+---------------------------+ -| punycode | | Implements :rfc:`3492` | +| punycode | | Implements :rfc:`3492`. | +| | | Stateful codecs are not | +| | | supported.
| +--------------------+---------+---------------------------+ -| raw_unicode_escape | | Produce a string that is | -| | | suitable as raw Unicode | -| | | literal in Python source | -| | | code | +| raw_unicode_escape | | Latin-1 encoding with | +| | | ``\uXXXX`` and | +| | | ``\UXXXXXXXX`` for other | +| | | code points. Existing | +| | | backslashes are not | +| | | escaped in any way. | +| | | It is used in the Python | +| | | pickle protocol. | +--------------------+---------+---------------------------+ | undefined | | Raise an exception for | -| | | all conversions. Can be | -| | | used as the system | -| | | encoding if no automatic | -| | | coercion between byte and | -| | | Unicode strings is | -| | | desired. | +| | | all conversions, even | +| | | empty strings. The error | +| | | handler is ignored. | +--------------------+---------+---------------------------+ -| unicode_escape | | Produce a string that is | -| | | suitable as Unicode | -| | | literal in Python source | -| | | code | +| unicode_escape | | Encoding suitable as the | +| | | contents of a Unicode | +| | | literal in ASCII-encoded | +| | | Python source code, | +| | | except that quotes are | +| | | not escaped. Decodes from | +| | | Latin-1 source code. | +| | | Beware that Python source | +| | | code actually uses UTF-8 | +| | | by default. | +--------------------+---------+---------------------------+ | unicode_internal | | Return the internal | | | | representation of the | -| | | operand | +| | | operand. Stateful codecs | +| | | are not supported. | | | | | | | | .. deprecated:: 3.3 | +| | | This representation is | +| | | obsoleted by | +| | | :pep:`393`. | +--------------------+---------+---------------------------+ .. _binary-transforms: @@ -1253,7 +1275,8 @@ ^^^^^^^^^^^^^^^^^ The following codecs provide binary transforms: :term:`bytes-like object` -to :class:`bytes` mappings. +to :class:`bytes` mappings. They are not supported by :meth:`bytes.decode` +(which only produces :class:`str` output). .. tabularcolumns:: |l|L|L|L| @@ -1308,7 +1331,8 @@ ^^^^^^^^^^^^^^^ The following codec provides a text transform: a :class:`str` to :class:`str` -mapping. +mapping. It is not supported by :meth:`str.encode` (which only produces +:class:`bytes` output). .. tabularcolumns:: |l|l|L| diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -939,15 +939,17 @@ *encoding* is the name of the encoding used to decode or encode the file. This should only be used in text mode. The default encoding is platform dependent (whatever :func:`locale.getpreferredencoding` returns), but any - encoding supported by Python can be used. See the :mod:`codecs` module for + :term:`text encoding` supported by Python + can be used. See the :mod:`codecs` module for the list of supported encodings. *errors* is an optional string that specifies how encoding and decoding errors are to be handled--this cannot be used in binary mode. - A variety of standard error handlers are available, though any + A variety of standard error handlers are available + (listed under :ref:`error-handlers`), though any error handling name that has been registered with :func:`codecs.register_error` is also valid. The standard names - are: + include: * ``'strict'`` to raise a :exc:`ValueError` exception if there is an encoding error. 
The default value of ``None`` has the same diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -1512,7 +1512,7 @@ a :exc:`UnicodeError`. Other possible values are ``'ignore'``, ``'replace'``, ``'xmlcharrefreplace'``, ``'backslashreplace'`` and any other name registered via - :func:`codecs.register_error`, see section :ref:`codec-base-classes`. For a + :func:`codecs.register_error`, see section :ref:`error-handlers`. For a list of possible encodings, see section :ref:`standard-encodings`. .. versionchanged:: 3.1 @@ -2384,7 +2384,7 @@ error handling scheme. The default for *errors* is ``'strict'``, meaning that encoding errors raise a :exc:`UnicodeError`. Other possible values are ``'ignore'``, ``'replace'`` and any other name registered via - :func:`codecs.register_error`, see section :ref:`codec-base-classes`. For a + :func:`codecs.register_error`, see section :ref:`error-handlers`. For a list of possible encodings, see section :ref:`standard-encodings`. .. note:: diff --git a/Doc/library/tarfile.rst b/Doc/library/tarfile.rst --- a/Doc/library/tarfile.rst +++ b/Doc/library/tarfile.rst @@ -794,7 +794,7 @@ appropriately, this conversion may fail. The *errors* argument defines how characters are treated that cannot be -converted. Possible values are listed in section :ref:`codec-base-classes`. +converted. Possible values are listed in section :ref:`error-handlers`. The default scheme is ``'surrogateescape'`` which Python also uses for its file system calls, see :ref:`os-filenames`. diff --git a/Lib/codecs.py b/Lib/codecs.py --- a/Lib/codecs.py +++ b/Lib/codecs.py @@ -346,8 +346,7 @@ """ Creates a StreamWriter instance. - stream must be a file-like object open for writing - (binary) data. + stream must be a file-like object open for writing. The StreamWriter may use different error handling schemes by providing the errors keyword argument. These @@ -421,8 +420,7 @@ """ Creates a StreamReader instance. - stream must be a file-like object open for reading - (binary) data. + stream must be a file-like object open for reading. The StreamReader may use different error handling schemes by providing the errors keyword argument. These @@ -450,13 +448,12 @@ """ Decodes data from the stream self.stream and returns the resulting object. - chars indicates the number of characters to read from the - stream. read() will never return more than chars - characters, but it might return less, if there are not enough - characters available. + chars indicates the number of decoded code points or bytes to + return. read() will never return more data than requested, + but it might return less, if there is not enough available. - size indicates the approximate maximum number of bytes to - read from the stream for decoding purposes. The decoder + size indicates the approximate maximum number of decoded + bytes or code points to read for decoding. The decoder can modify this setting as appropriate. The default value -1 indicates to read and decode as much as possible. size is intended to prevent having to decode huge files in one @@ -467,7 +464,7 @@ will be returned, the rest of the input will be kept until the next call to read(). - The method should use a greedy read strategy meaning that + The method should use a greedy read strategy, meaning that it should read as much data as is allowed within the definition of the encoding and the given size, e.g. 
if optional encoding endings or state markers are available @@ -602,7 +599,7 @@ def readlines(self, sizehint=None, keepends=True): """ Read all lines available on the input stream - and return them as list of lines. + and return them as a list. Line breaks are implemented using the codec's decoder method and are included in the list entries. @@ -750,19 +747,18 @@ class StreamRecoder: - """ StreamRecoder instances provide a frontend - backend - view of encoding data. + """ StreamRecoder instances translate data from one encoding to another. They use the complete set of APIs returned by the codecs.lookup() function to implement their task. - Data written to the stream is first decoded into an - intermediate format (which is dependent on the given codec - combination) and then written to the stream using an instance - of the provided Writer class. + Data written to the StreamRecoder is first decoded into an + intermediate format (depending on the "decode" codec) and then + written to the underlying stream using an instance of the provided + Writer class. - In the other direction, data is read from the stream using a - Reader instance and then return encoded data to the caller. + In the other direction, data is read from the underlying stream using + a Reader instance and then encoded and returned to the caller. """ # Optional attributes set by the file wrappers below @@ -774,22 +770,17 @@ """ Creates a StreamRecoder instance which implements a two-way conversion: encode and decode work on the frontend (the - input to .read() and output of .write()) while - Reader and Writer work on the backend (reading and - writing to the stream). + data visible to .read() and .write()) while Reader and Writer + work on the backend (the data in stream). - You can use these objects to do transparent direct - recodings from e.g. latin-1 to utf-8 and back. + You can use these objects to do transparent + transcodings from e.g. latin-1 to utf-8 and back. stream must be a file-like object. - encode, decode must adhere to the Codec interface, Reader, + encode and decode must adhere to the Codec interface; Reader and Writer must be factory functions or classes providing the - StreamReader, StreamWriter interface resp. - - encode and decode are needed for the frontend translation, - Reader and Writer for the backend translation. Unicode is - used as intermediate encoding. + StreamReader and StreamWriter interfaces resp. Error handling is done in the same way as defined for the StreamWriter/Readers. @@ -864,7 +855,7 @@ ### Shortcuts -def open(filename, mode='rb', encoding=None, errors='strict', buffering=1): +def open(filename, mode='r', encoding=None, errors='strict', buffering=1): """ Open an encoded file using the given mode and return a wrapped version providing transparent encoding/decoding. @@ -874,10 +865,8 @@ codecs. Output is also codec dependent and will usually be Unicode as well. - Files are always opened in binary mode, even if no binary mode - was specified. This is done to avoid data loss due to encodings - using 8-bit values. The default file mode is 'rb' meaning to - open the file in binary read mode. + Underlying encoded files are always opened in binary mode. + The default file mode is 'r', meaning to open the file in read mode. encoding specifies the encoding which is to be used for the file. @@ -913,13 +902,13 @@ """ Return a wrapped version of file which provides transparent encoding translation. 
- Strings written to the wrapped file are interpreted according - to the given data_encoding and then written to the original - file as string using file_encoding. The intermediate encoding + Data written to the wrapped file is decoded according + to the given data_encoding and then encoded to the underlying + file using file_encoding. The intermediate data type will usually be Unicode but depends on the specified codecs. - Strings are read from the file using file_encoding and then - passed back to the caller as string using data_encoding. + Bytes read from the file are decoded using file_encoding and then + passed back to the caller encoded using data_encoding. If file_encoding is not given, it defaults to data_encoding. diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -1139,6 +1139,8 @@ # Python used to crash on this at exit because of a refcount # bug in _codecsmodule.c + self.assertTrue(f.closed) + # From RFC 3492 punycode_testcases = [ # A Arabic (Egyptian): @@ -1591,6 +1593,16 @@ self.assertEqual(encoder.encode("ample.org."), b"xn--xample-9ta.org.") self.assertEqual(encoder.encode("", True), b"") + def test_errors(self): + """Only supports "strict" error handler""" + "python.org".encode("idna", "strict") + b"python.org".decode("idna", "strict") + for errors in ("ignore", "replace", "backslashreplace", + "surrogateescape"): + self.assertRaises(Exception, "python.org".encode, "idna", errors) + self.assertRaises(Exception, + b"python.org".decode, "idna", errors) + class CodecsModuleTest(unittest.TestCase): def test_decode(self): @@ -1668,6 +1680,24 @@ for api in codecs.__all__: getattr(codecs, api) + def test_open(self): + self.addCleanup(support.unlink, support.TESTFN) + for mode in ('w', 'r', 'r+', 'w+', 'a', 'a+'): + with self.subTest(mode), \ + codecs.open(support.TESTFN, mode, 'ascii') as file: + self.assertIsInstance(file, codecs.StreamReaderWriter) + + def test_undefined(self): + self.assertRaises(UnicodeError, codecs.encode, 'abc', 'undefined') + self.assertRaises(UnicodeError, codecs.decode, b'abc', 'undefined') + self.assertRaises(UnicodeError, codecs.encode, '', 'undefined') + self.assertRaises(UnicodeError, codecs.decode, b'', 'undefined') + for errors in ('strict', 'ignore', 'replace', 'backslashreplace'): + self.assertRaises(UnicodeError, + codecs.encode, 'abc', 'undefined', errors) + self.assertRaises(UnicodeError, + codecs.decode, b'abc', 'undefined', errors) + class StreamReaderTest(unittest.TestCase): def setUp(self): @@ -1801,13 +1831,10 @@ # "undefined" # The following encodings don't work in stateful mode -broken_unicode_with_streams = [ +broken_unicode_with_stateful = [ "punycode", "unicode_internal" ] -broken_incremental_coders = broken_unicode_with_streams + [ - "idna", -] class BasicUnicodeTest(unittest.TestCase, MixInCheckStateHandling): def test_basics(self): @@ -1827,7 +1854,7 @@ (chars, size) = codecs.getdecoder(encoding)(b) self.assertEqual(chars, s, "encoding=%r" % encoding) - if encoding not in broken_unicode_with_streams: + if encoding not in broken_unicode_with_stateful: # check stream reader/writer q = Queue(b"") writer = codecs.getwriter(encoding)(q) @@ -1845,7 +1872,7 @@ decodedresult += reader.read() self.assertEqual(decodedresult, s, "encoding=%r" % encoding) - if encoding not in broken_incremental_coders: + if encoding not in broken_unicode_with_stateful: # check incremental decoder/encoder and iterencode()/iterdecode() try: encoder = 
codecs.getincrementalencoder(encoding)() @@ -1894,7 +1921,7 @@ from _testcapi import codec_incrementalencoder, codec_incrementaldecoder s = "abc123" # all codecs should be able to encode these for encoding in all_unicode_encodings: - if encoding not in broken_incremental_coders: + if encoding not in broken_unicode_with_stateful: # check incremental decoder/encoder (fetched via the C API) try: cencoder = codec_incrementalencoder(encoding) @@ -1934,7 +1961,7 @@ for encoding in all_unicode_encodings: if encoding == "idna": # FIXME: See SF bug #1163178 continue - if encoding in broken_unicode_with_streams: + if encoding in broken_unicode_with_stateful: continue reader = codecs.getreader(encoding)(io.BytesIO(s.encode(encoding))) for t in range(5): @@ -1967,7 +1994,7 @@ # Check that getstate() and setstate() handle the state properly u = "abc123" for encoding in all_unicode_encodings: - if encoding not in broken_incremental_coders: + if encoding not in broken_unicode_with_stateful: self.check_state_handling_decode(encoding, u, u.encode(encoding)) self.check_state_handling_encode(encoding, u, u.encode(encoding)) @@ -2171,6 +2198,7 @@ f = io.BytesIO(b"\xc3\xbc") with codecs.EncodedFile(f, "latin-1", "utf-8") as ef: self.assertEqual(ef.read(), b"\xfc") + self.assertTrue(f.closed) def test_streamreaderwriter(self): f = io.BytesIO(b"\xc3\xbc") diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -265,6 +265,10 @@ Tests ----- +- Issue #19548: Added some additional checks to test_codecs to ensure that + statements in the updated documentation remain accurate. Patch by Martin + Panter. + - Issue #22838: All test_re tests now work with unittest test discovery. - Issue #22173: Update lib2to3 tests to use unittest test discovery. @@ -297,6 +301,10 @@ Documentation ------------- +- Issue #19548: Update the codecs module documentation to better cover the + distinction between text encodings and other codecs, together with other + clarifications. Patch by Martin Panter. + - Issue #22914: Update the Python 2/3 porting HOWTO to describe a more automated approach. diff --git a/Modules/_codecsmodule.c b/Modules/_codecsmodule.c --- a/Modules/_codecsmodule.c +++ b/Modules/_codecsmodule.c @@ -54,9 +54,9 @@ "register(search_function)\n\ \n\ Register a codec search function. Search functions are expected to take\n\ -one argument, the encoding name in all lower case letters, and return\n\ -a tuple of functions (encoder, decoder, stream_reader, stream_writer)\n\ -(or a CodecInfo object)."); +one argument, the encoding name in all lower case letters, and either\n\ +return None, or a tuple of functions (encoder, decoder, stream_reader,\n\ +stream_writer) (or a CodecInfo object)."); static PyObject *codec_register(PyObject *self, PyObject *search_function) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 17:27:16 2015 From: python-checkins at python.org (berker.peksag) Date: Tue, 06 Jan 2015 16:27:16 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_a_markup_e?= =?utf-8?q?rror_in_the_argparse_documentation=2E?= Message-ID: <20150106162648.72555.74400@psf.io> https://hg.python.org/cpython/rev/35b5ff543d4b changeset: 94055:35b5ff543d4b branch: 2.7 parent: 94049:eddcb6671a48 user: Berker Peksag date: Tue Jan 06 18:29:04 2015 +0200 summary: Fix a markup error in the argparse documentation. Reported by Jason Sachs on docs at . 
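The amended ``register()`` docstring above is easy to exercise interactively: a search function receives the normalized (lower-cased) encoding name and must return either ``None`` or a ``codecs.CodecInfo`` object. A minimal sketch, assuming nothing beyond the standard library; the ``latin1-alias`` name is invented purely for illustration:

    import codecs

    def search(name):
        # Called with the normalized encoding name; return None for any
        # name this search function does not recognise.
        if name == "latin1-alias":
            return codecs.lookup("latin-1")   # lookup() returns a CodecInfo
        return None

    codecs.register(search)
    print("caf\xe9".encode("latin1-alias"))   # b'caf\xe9'

The tests added to test_codecs.py above pin down related corner cases in the same spirit: the ``idna`` codec accepts only ``errors='strict'``, and the ``undefined`` codec raises ``UnicodeError`` for every conversion, even an empty string.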
files: Doc/library/argparse.rst | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Doc/library/argparse.rst b/Doc/library/argparse.rst --- a/Doc/library/argparse.rst +++ b/Doc/library/argparse.rst @@ -1216,8 +1216,8 @@ which processes arguments from the command-line. Any object which follows this API may be passed as the ``action`` parameter to :meth:`add_argument`. -.. class:: Action(option_strings, dest, nargs=None, const=None, default=None, - type=None, choices=None, required=False, help=None, +.. class:: Action(option_strings, dest, nargs=None, const=None, default=None, \ + type=None, choices=None, required=False, help=None, \ metavar=None) Action objects are used by an ArgumentParser to represent the information needed -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 17:33:50 2015 From: python-checkins at python.org (berker.peksag) Date: Tue, 06 Jan 2015 16:33:50 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Silence_a_Sphi?= =?utf-8?q?nx_warning_in_ftplib=2Erst=2E?= Message-ID: <20150106163346.11571.53580@psf.io> https://hg.python.org/cpython/rev/67224c88144e changeset: 94056:67224c88144e branch: 2.7 user: Berker Peksag date: Tue Jan 06 18:36:02 2015 +0200 summary: Silence a Sphinx warning in ftplib.rst. ftplib.rst:61: WARNING: Block quote ends without a blank line; unexpect ed unindent. files: Doc/library/ftplib.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/ftplib.rst b/Doc/library/ftplib.rst --- a/Doc/library/ftplib.rst +++ b/Doc/library/ftplib.rst @@ -57,7 +57,7 @@ .. class:: FTP_TLS([host[, user[, passwd[, acct[, keyfile[, certfile[, context[, timeout]]]]]]]]) - A :class:`FTP` subclass which adds TLS support to FTP as described in + A :class:`FTP` subclass which adds TLS support to FTP as described in :rfc:`4217`. Connect as usual to port 21 implicitly securing the FTP control connection before authenticating. Securing the data connection requires the user to -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 6 22:54:28 2015 From: python-checkins at python.org (georg.brandl) Date: Tue, 06 Jan 2015 21:54:28 +0000 Subject: [Python-checkins] =?utf-8?q?release=3A_comment_update?= Message-ID: <20150106215425.8751.16553@psf.io> https://hg.python.org/release/rev/d21e7f948b8b changeset: 92:d21e7f948b8b user: Georg Brandl date: Tue Jan 06 22:54:14 2015 +0100 summary: comment update files: add-to-pydotorg.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/add-to-pydotorg.py b/add-to-pydotorg.py --- a/add-to-pydotorg.py +++ b/add-to-pydotorg.py @@ -3,11 +3,11 @@ Script to add ReleaseFile objects for Python releases on the new pydotorg. To use (RELEASE is something like 3.3.5rc2): -* Copy this script to dinsdale (it needs access to all the release files). +* Copy this script to dl-files (it needs access to all the release files). You could also download all files, then you need to adapt the "ftp_root" string below. -* Make sure all download files are in place in the correct /data/ftp.python.org +* Make sure all download files are in place in the correct /srv/www.python.org subdirectory. 
* Create a new Release object via the Django admin (adding via API is -- Repository URL: https://hg.python.org/release From python-checkins at python.org Wed Jan 7 04:15:03 2015 From: python-checkins at python.org (nick.coghlan) Date: Wed, 07 Jan 2015 03:15:03 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319548=3A_clean_up?= =?utf-8?q?_merge_issues_in_codecs_docs?= Message-ID: <20150107031458.8767.41129@psf.io> https://hg.python.org/cpython/rev/20a5a56ce090 changeset: 94057:20a5a56ce090 parent: 94054:4d00d0109147 user: Nick Coghlan date: Wed Jan 07 13:14:47 2015 +1000 summary: Issue #19548: clean up merge issues in codecs docs Patch by Martin Panter to clean up some problems with the merge of the codecs docs changes from Python 3.4. files: Doc/library/codecs.rst | 8 +++++--- 1 files changed, 5 insertions(+), 3 deletions(-) diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -256,7 +256,6 @@ encodings. -.. _surrogateescape: .. _codec-base-classes: Codec Base Classes @@ -273,6 +272,7 @@ codec will handle encoding and decoding errors. +.. _surrogateescape: .. _error-handlers: Error Handlers @@ -319,7 +319,8 @@ | | :func:`backslashreplace_errors`. | +-------------------------+-----------------------------------------------+ | ``'namereplace'`` | Replace with ``\N{...}`` escape sequences | -| | (only for encoding). | +| | (only for encoding). Implemented in | +| | :func:`namereplace_errors`. | +-------------------------+-----------------------------------------------+ | ``'surrogateescape'`` | On decoding, replace byte with individual | | | surrogate code ranging from ``U+DC80`` to | @@ -422,7 +423,8 @@ .. function:: namereplace_errors(exception) - Implements the ``namereplace`` error handling (for encoding only): the + Implements the ``'namereplace'`` error handling (for encoding with + :term:`text encodings ` only): the unencodable character is replaced by a ``\N{...}`` escape sequence. .. versionadded:: 3.5 -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 07:16:18 2015 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 07 Jan 2015 06:16:18 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Minor_speed-up=2E__Use_loc?= =?utf-8?q?al_variable_instead_of_a_global_lookup=2E?= Message-ID: <20150107061616.11571.19695@psf.io> https://hg.python.org/cpython/rev/98376cf9133d changeset: 94058:98376cf9133d user: Raymond Hettinger date: Tue Jan 06 22:16:10 2015 -0800 summary: Minor speed-up. Use local variable instead of a global lookup. files: Lib/functools.py | 20 ++++++++++---------- 1 files changed, 10 insertions(+), 10 deletions(-) diff --git a/Lib/functools.py b/Lib/functools.py --- a/Lib/functools.py +++ b/Lib/functools.py @@ -98,7 +98,7 @@ 'Return a > b. Computed by @total_ordering from (not a < b) and (a != b).' op_result = self.__lt__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return not op_result and self != other def _le_from_lt(self, other): @@ -110,35 +110,35 @@ 'Return a >= b. Computed by @total_ordering from (not a < b).' op_result = self.__lt__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return not op_result def _ge_from_le(self, other): 'Return a >= b. Computed by @total_ordering from (not a <= b) or (a == b).' 
op_result = self.__le__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return not op_result or self == other def _lt_from_le(self, other): 'Return a < b. Computed by @total_ordering from (a <= b) and (a != b).' op_result = self.__le__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return op_result and self != other def _gt_from_le(self, other): 'Return a > b. Computed by @total_ordering from (not a <= b).' op_result = self.__le__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return not op_result def _lt_from_gt(self, other): 'Return a < b. Computed by @total_ordering from (not a > b) and (a != b).' op_result = self.__gt__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return not op_result and self != other def _ge_from_gt(self, other): @@ -150,28 +150,28 @@ 'Return a <= b. Computed by @total_ordering from (not a > b).' op_result = self.__gt__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return not op_result def _le_from_ge(self, other): 'Return a <= b. Computed by @total_ordering from (not a >= b) or (a == b).' op_result = self.__ge__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return not op_result or self == other def _gt_from_ge(self, other): 'Return a > b. Computed by @total_ordering from (a >= b) and (a != b).' op_result = self.__ge__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return op_result and self != other def _lt_from_ge(self, other): 'Return a < b. Computed by @total_ordering from (not a >= b).' op_result = self.__ge__(other) if op_result is NotImplemented: - return NotImplemented + return op_result return not op_result def total_ordering(cls): -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Wed Jan 7 09:39:40 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 07 Jan 2015 09:39:40 +0100 Subject: [Python-checkins] Daily reference leaks (4d00d0109147): sum=9 Message-ID: results for 4d00d0109147 on branch "default" -------------------------------------------- test_asyncio leaked [0, 3, 3] memory blocks, sum=6 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogDf86iG', '-x'] From python-checkins at python.org Wed Jan 7 18:14:57 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 07 Jan 2015 17:14:57 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_expose_the_client=27s_ciph?= =?utf-8?q?er_suites_from_the_handshake_=28closes_=2323186=29?= Message-ID: <20150107171447.8767.17724@psf.io> https://hg.python.org/cpython/rev/bc34fcccaca3 changeset: 94059:bc34fcccaca3 user: Benjamin Peterson date: Wed Jan 07 11:14:26 2015 -0600 summary: expose the client's cipher suites from the handshake (closes #23186) files: Doc/library/ssl.rst | 12 +++++ Lib/ssl.py | 10 ++++ Lib/test/test_ssl.py | 17 +++++++ Misc/NEWS | 4 + Modules/_ssl.c | 72 ++++++++++++++++++++++--------- 5 files changed, 94 insertions(+), 21 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -925,6 +925,17 @@ version of the SSL protocol that defines its use, and the number of secret bits being used. If no connection has been established, returns ``None``. +.. 
method:: SSLSocket.shared_ciphers() + + Return the list of ciphers shared by the client during the handshake. Each + entry of the returned list is a three-value tuple containing the name of the + cipher, the version of the SSL protocol that defines its use, and the number + of secret bits the cipher uses. :meth:`~SSLSocket.shared_ciphers` returns + ``None`` if no connection has been established or the socket is a client + socket. + + .. versionadded:: 3.5 + .. method:: SSLSocket.compression() Return the compression algorithm being used as a string, or ``None`` @@ -1784,6 +1795,7 @@ - :meth:`~SSLSocket.getpeercert` - :meth:`~SSLSocket.selected_npn_protocol` - :meth:`~SSLSocket.cipher` + - :meth:`~SSLSocket.shared_ciphers` - :meth:`~SSLSocket.compression` - :meth:`~SSLSocket.pending` - :meth:`~SSLSocket.do_handshake` diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -572,6 +572,10 @@ ssl_version, secret_bits)``.""" return self._sslobj.cipher() + def shared_ciphers(self): + """Return the ciphers shared by the client during the handshake.""" + return self._sslobj.shared_ciphers() + def compression(self): """Return the current compression algorithm in use, or ``None`` if compression was not negotiated or not supported by one of the peers.""" @@ -784,6 +788,12 @@ else: return self._sslobj.cipher() + def shared_ciphers(self): + self._checkClosed() + if not self._sslobj: + return None + return self._sslobj.shared_ciphers() + def compression(self): self._checkClosed() if not self._sslobj: diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -1698,11 +1698,13 @@ sslobj = ctx.wrap_bio(incoming, outgoing, False, 'svn.python.org') self.assertIs(sslobj._sslobj.owner, sslobj) self.assertIsNone(sslobj.cipher()) + self.assertIsNone(sslobj.shared_ciphers()) self.assertRaises(ValueError, sslobj.getpeercert) if 'tls-unique' in ssl.CHANNEL_BINDING_TYPES: self.assertIsNone(sslobj.get_channel_binding('tls-unique')) self.ssl_io_loop(sock, incoming, outgoing, sslobj.do_handshake) self.assertTrue(sslobj.cipher()) + self.assertIsNone(sslobj.shared_ciphers()) self.assertTrue(sslobj.getpeercert()) if 'tls-unique' in ssl.CHANNEL_BINDING_TYPES: self.assertTrue(sslobj.get_channel_binding('tls-unique')) @@ -1776,6 +1778,7 @@ self.close() return False else: + self.server.shared_ciphers.append(self.sslconn.shared_ciphers()) if self.server.context.verify_mode == ssl.CERT_REQUIRED: cert = self.sslconn.getpeercert() if support.verbose and self.server.chatty: @@ -1891,6 +1894,7 @@ self.flag = None self.active = False self.selected_protocols = [] + self.shared_ciphers = [] self.conn_errors = [] threading.Thread.__init__(self) self.daemon = True @@ -2121,6 +2125,7 @@ }) s.close() stats['server_npn_protocols'] = server.selected_protocols + stats['server_shared_ciphers'] = server.shared_ciphers return stats def try_protocol_combo(server_protocol, client_protocol, expect_success, @@ -3157,6 +3162,18 @@ self.assertEqual(cm.exception.reason, 'TLSV1_ALERT_INTERNAL_ERROR') self.assertIn("TypeError", stderr.getvalue()) + def test_shared_ciphers(self): + server_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + client_context.set_ciphers("3DES") + server_context.set_ciphers("3DES:AES") + stats = server_params_test(client_context, server_context) + ciphers = stats['server_shared_ciphers'][0] + self.assertGreater(len(ciphers), 0) + for name, tls_version, bits in ciphers: + self.assertIn("DES-CBC3-", name) 
+ self.assertEqual(bits, 112) + def test_read_write_after_close_raises_valuerror(self): context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) context.verify_mode = ssl.CERT_REQUIRED diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -199,6 +199,10 @@ Library ------- +- Issue #23186: Add ssl.SSLObject.shared_ciphers() and + ssl.SSLSocket.shared_ciphers() to fetch the client's list ciphers sent at + handshake. + - Issue #23143: Remove compatibility with OpenSSLs older than 0.9.8. - Issue #23132: Improve performance and introspection support of comparison diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -1360,54 +1360,83 @@ peer certificate, or None if no certificate was provided. This will\n\ return the certificate even if it wasn't validated."); -static PyObject *PySSL_cipher (PySSLSocket *self) { - - PyObject *retval, *v; - const SSL_CIPHER *current; - char *cipher_name; - char *cipher_protocol; - - if (self->ssl == NULL) - Py_RETURN_NONE; - current = SSL_get_current_cipher(self->ssl); - if (current == NULL) - Py_RETURN_NONE; - - retval = PyTuple_New(3); +static PyObject * +cipher_to_tuple(const SSL_CIPHER *cipher) +{ + const char *cipher_name, *cipher_protocol; + PyObject *v, *retval = PyTuple_New(3); if (retval == NULL) return NULL; - cipher_name = (char *) SSL_CIPHER_get_name(current); + cipher_name = SSL_CIPHER_get_name(cipher); if (cipher_name == NULL) { Py_INCREF(Py_None); PyTuple_SET_ITEM(retval, 0, Py_None); } else { v = PyUnicode_FromString(cipher_name); if (v == NULL) - goto fail0; + goto fail; PyTuple_SET_ITEM(retval, 0, v); } - cipher_protocol = (char *) SSL_CIPHER_get_version(current); + + cipher_protocol = SSL_CIPHER_get_version(cipher); if (cipher_protocol == NULL) { Py_INCREF(Py_None); PyTuple_SET_ITEM(retval, 1, Py_None); } else { v = PyUnicode_FromString(cipher_protocol); if (v == NULL) - goto fail0; + goto fail; PyTuple_SET_ITEM(retval, 1, v); } - v = PyLong_FromLong(SSL_CIPHER_get_bits(current, NULL)); + + v = PyLong_FromLong(SSL_CIPHER_get_bits(cipher, NULL)); if (v == NULL) - goto fail0; + goto fail; PyTuple_SET_ITEM(retval, 2, v); + return retval; - fail0: + fail: Py_DECREF(retval); return NULL; } +static PyObject *PySSL_shared_ciphers(PySSLSocket *self) +{ + STACK_OF(SSL_CIPHER) *ciphers; + int i; + PyObject *res; + + if (!self->ssl->session || !self->ssl->session->ciphers) + Py_RETURN_NONE; + ciphers = self->ssl->session->ciphers; + res = PyList_New(sk_SSL_CIPHER_num(ciphers)); + if (!res) + return NULL; + for (i = 0; i < sk_SSL_CIPHER_num(ciphers); i++) { + PyObject *tup = cipher_to_tuple(sk_SSL_CIPHER_value(ciphers, i)); + if (!tup) { + Py_DECREF(res); + return NULL; + } + PyList_SET_ITEM(res, i, tup); + } + return res; +} + +static PyObject *PySSL_cipher (PySSLSocket *self) +{ + const SSL_CIPHER *current; + + if (self->ssl == NULL) + Py_RETURN_NONE; + current = SSL_get_current_cipher(self->ssl); + if (current == NULL) + Py_RETURN_NONE; + return cipher_to_tuple(current); +} + static PyObject *PySSL_version(PySSLSocket *self) { const char *version; @@ -2019,6 +2048,7 @@ {"peer_certificate", (PyCFunction)PySSL_peercert, METH_VARARGS, PySSL_peercert_doc}, {"cipher", (PyCFunction)PySSL_cipher, METH_NOARGS}, + {"shared_ciphers", (PyCFunction)PySSL_shared_ciphers, METH_NOARGS}, {"version", (PyCFunction)PySSL_version, METH_NOARGS}, #ifdef OPENSSL_NPN_NEGOTIATED {"selected_npn_protocol", (PyCFunction)PySSL_selected_npn_protocol, METH_NOARGS}, -- Repository URL: https://hg.python.org/cpython From python-checkins at 
python.org Wed Jan 7 18:32:25 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 07 Jan 2015 17:32:25 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_explain_None_can_be_return?= =?utf-8?q?ed?= Message-ID: <20150107173205.22419.96082@psf.io> https://hg.python.org/cpython/rev/7c208844f560 changeset: 94060:7c208844f560 user: Benjamin Peterson date: Wed Jan 07 11:26:50 2015 -0600 summary: explain None can be returned files: Lib/ssl.py | 4 +++- 1 files changed, 3 insertions(+), 1 deletions(-) diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -573,7 +573,9 @@ return self._sslobj.cipher() def shared_ciphers(self): - """Return the ciphers shared by the client during the handshake.""" + """Return the a list of ciphers shared by the client during the + handshake or None if this is not a valid server connection. + """ return self._sslobj.shared_ciphers() def compression(self): -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 18:32:25 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 07 Jan 2015 17:32:25 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_use_SSL=5Fget=5Fsession?= Message-ID: <20150107173205.72581.21726@psf.io> https://hg.python.org/cpython/rev/dd41bc9c5f60 changeset: 94061:dd41bc9c5f60 user: Benjamin Peterson date: Wed Jan 07 11:32:00 2015 -0600 summary: use SSL_get_session files: Modules/_ssl.c | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -1404,13 +1404,14 @@ static PyObject *PySSL_shared_ciphers(PySSLSocket *self) { + SSL_SESSION *sess = SSL_get_session(self->ssl); STACK_OF(SSL_CIPHER) *ciphers; int i; PyObject *res; - if (!self->ssl->session || !self->ssl->session->ciphers) + if (!sess || !sess->ciphers) Py_RETURN_NONE; - ciphers = self->ssl->session->ciphers; + ciphers = sess->ciphers; res = PyList_New(sk_SSL_CIPHER_num(ciphers)); if (!res) return NULL; -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 18:35:27 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 07 Jan 2015 17:35:27 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_remove_apparently_wrong_as?= =?utf-8?q?sertion_about_des_bit_size?= Message-ID: <20150107173355.72575.75267@psf.io> https://hg.python.org/cpython/rev/9c68831ff153 changeset: 94062:9c68831ff153 user: Benjamin Peterson date: Wed Jan 07 11:33:51 2015 -0600 summary: remove apparently wrong assertion about des bit size files: Lib/test/test_ssl.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3172,7 +3172,6 @@ self.assertGreater(len(ciphers), 0) for name, tls_version, bits in ciphers: self.assertIn("DES-CBC3-", name) - self.assertEqual(bits, 112) def test_read_write_after_close_raises_valuerror(self): context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 18:42:54 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 07 Jan 2015 17:42:54 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_force_test_server_to_speak?= =?utf-8?q?_tlsv1?= Message-ID: <20150107174243.22417.97506@psf.io> https://hg.python.org/cpython/rev/d2fbefb1c818 changeset: 94063:d2fbefb1c818 user: Benjamin Peterson date: Wed Jan 07 11:42:38 2015 -0600 summary: force test server 
to speak tlsv1 files: Lib/test/test_ssl.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3163,7 +3163,7 @@ self.assertIn("TypeError", stderr.getvalue()) def test_shared_ciphers(self): - server_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) client_context.set_ciphers("3DES") server_context.set_ciphers("3DES:AES") -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 19:59:36 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 07 Jan 2015 18:59:36 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_include_some_more_ciphers?= Message-ID: <20150107185926.72551.12533@psf.io> https://hg.python.org/cpython/rev/15ce5fee0508 changeset: 94064:15ce5fee0508 user: Benjamin Peterson date: Wed Jan 07 12:59:20 2015 -0600 summary: include some more ciphers files: Lib/test/test_ssl.py | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3165,13 +3165,13 @@ def test_shared_ciphers(self): server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - client_context.set_ciphers("3DES") - server_context.set_ciphers("3DES:AES") + client_context.set_ciphers("3DES:DES") + server_context.set_ciphers("3DES:DES:AES") stats = server_params_test(client_context, server_context) ciphers = stats['server_shared_ciphers'][0] self.assertGreater(len(ciphers), 0) for name, tls_version, bits in ciphers: - self.assertIn("DES-CBC3-", name) + self.assertIn("DES", name.split("-")) def test_read_write_after_close_raises_valuerror(self): context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 20:15:56 2015 From: python-checkins at python.org (berker.peksag) Date: Wed, 07 Jan 2015 19:15:56 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIwNDg3?= =?utf-8?q?=3A_Clarify_meaning_of_=22side_effect=22_in_the_magic_mock_docu?= =?utf-8?q?mentation=2E?= Message-ID: <20150107191544.125888.92064@psf.io> https://hg.python.org/cpython/rev/230a1bfb0f59 changeset: 94065:230a1bfb0f59 branch: 3.4 parent: 94053:0646eee8296a user: Berker Peksag date: Wed Jan 07 21:15:02 2015 +0200 summary: Issue #20487: Clarify meaning of "side effect" in the magic mock documentation. Patch by A.M. Kuchling. files: Doc/library/unittest.mock.rst | 7 ++++--- 1 files changed, 4 insertions(+), 3 deletions(-) diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -1678,9 +1678,10 @@ >>> object() in mock False -The two equality method, :meth:`__eq__` and :meth:`__ne__`, are special. -They do the default equality comparison on identity, using a side -effect, unless you change their return value to return something else: +The two equality methods, :meth:`__eq__` and :meth:`__ne__`, are special. 
+They do the default equality comparison on identity, using the +:attr:`~Mock.side_effect` attribute, unless you change their return value to +return something else:: >>> MagicMock() == 3 False -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 20:15:56 2015 From: python-checkins at python.org (berker.peksag) Date: Wed, 07 Jan 2015 19:15:56 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2320487=3A_Clarify_meaning_of_=22side_effect=22_i?= =?utf-8?q?n_the_magic_mock_documentation=2E?= Message-ID: <20150107191545.72573.65161@psf.io> https://hg.python.org/cpython/rev/3cf91d2aeab3 changeset: 94066:3cf91d2aeab3 parent: 94064:15ce5fee0508 parent: 94065:230a1bfb0f59 user: Berker Peksag date: Wed Jan 07 21:15:33 2015 +0200 summary: Issue #20487: Clarify meaning of "side effect" in the magic mock documentation. Patch by A.M. Kuchling. files: Doc/library/unittest.mock.rst | 7 ++++--- 1 files changed, 4 insertions(+), 3 deletions(-) diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -1720,9 +1720,10 @@ >>> object() in mock False -The two equality method, :meth:`__eq__` and :meth:`__ne__`, are special. -They do the default equality comparison on identity, using a side -effect, unless you change their return value to return something else: +The two equality methods, :meth:`__eq__` and :meth:`__ne__`, are special. +They do the default equality comparison on identity, using the +:attr:`~Mock.side_effect` attribute, unless you change their return value to +return something else:: >>> MagicMock() == 3 False -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 20:29:27 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 07 Jan 2015 19:29:27 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_everyone_should_support_AE?= =?utf-8?q?S_ciphers?= Message-ID: <20150107192924.72559.61214@psf.io> https://hg.python.org/cpython/rev/7a9a2a9c2e2b changeset: 94067:7a9a2a9c2e2b user: Benjamin Peterson date: Wed Jan 07 13:28:40 2015 -0600 summary: everyone should support AES ciphers files: Lib/test/test_ssl.py | 7 ++++--- 1 files changed, 4 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3165,13 +3165,14 @@ def test_shared_ciphers(self): server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - client_context.set_ciphers("3DES:DES") - server_context.set_ciphers("3DES:DES:AES") + client_context.set_ciphers("AES128") + server_context.set_ciphers("AES128:AES256") stats = server_params_test(client_context, server_context) ciphers = stats['server_shared_ciphers'][0] self.assertGreater(len(ciphers), 0) for name, tls_version, bits in ciphers: - self.assertIn("DES", name.split("-")) + self.assertIn("AES128", name.split("-")) + self.assertEqual(bits, 128) def test_read_write_after_close_raises_valuerror(self): context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 21:21:32 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 07 Jan 2015 20:21:32 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_rc4_is_a_long_time_favorit?= =?utf-8?q?e?= Message-ID: <20150107202128.8745.8287@psf.io> 
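The churn in these test_ssl commits revolves around OpenSSL's cipher-list syntax, which is what ``SSLContext.set_ciphers()`` accepts: colon-separated cipher names or aliases such as ``AES`` or ``RC4``. The call succeeds as long as at least one suite matches the list and raises ``ssl.SSLError`` otherwise, which is why the tests keep adjusting the client and server lists until a commonly available suite is guaranteed. A small standalone sketch; which aliases resolve to actual suites depends on the OpenSSL build Python is linked against:

    import ssl

    ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)

    # Accepted: "AES" matches at least one suite on any common OpenSSL build.
    ctx.set_ciphers("AES:RC4")

    try:
        ctx.set_ciphers("THIS-MATCHES-NOTHING")   # made-up token, matches no suite
    except ssl.SSLError as exc:
        print("rejected:", exc)                   # e.g. "No cipher can be selected."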
https://hg.python.org/cpython/rev/803898b6bee3 changeset: 94068:803898b6bee3 user: Benjamin Peterson date: Wed Jan 07 14:21:22 2015 -0600 summary: rc4 is a long time favorite files: Lib/test/test_ssl.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3165,8 +3165,8 @@ def test_shared_ciphers(self): server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - client_context.set_ciphers("AES128") - server_context.set_ciphers("AES128:AES256") + client_context.set_ciphers("RC4") + server_context.set_ciphers("AES:RC4") stats = server_params_test(client_context, server_context) ciphers = stats['server_shared_ciphers'][0] self.assertGreater(len(ciphers), 0) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 7 21:30:00 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 07 Jan 2015 20:30:00 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_fix_assertions_after_ciphe?= =?utf-8?q?rs_were_changed?= Message-ID: <20150107202955.11561.11225@psf.io> https://hg.python.org/cpython/rev/a969b9c53675 changeset: 94069:a969b9c53675 user: Benjamin Peterson date: Wed Jan 07 14:29:45 2015 -0600 summary: fix assertions after ciphers were changed files: Lib/test/test_ssl.py | 3 +-- 1 files changed, 1 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3171,8 +3171,7 @@ ciphers = stats['server_shared_ciphers'][0] self.assertGreater(len(ciphers), 0) for name, tls_version, bits in ciphers: - self.assertIn("AES128", name.split("-")) - self.assertEqual(bits, 128) + self.assertIn("RC4", name.split("-")) def test_read_write_after_close_raises_valuerror(self): context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 03:03:44 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 08 Jan 2015 02:03:44 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_try_using_AES256?= Message-ID: <20150108020331.22419.65654@psf.io> https://hg.python.org/cpython/rev/e42301eb3ce8 changeset: 94070:e42301eb3ce8 user: Benjamin Peterson date: Wed Jan 07 20:03:27 2015 -0600 summary: try using AES256 files: Lib/test/test_ssl.py | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3165,13 +3165,13 @@ def test_shared_ciphers(self): server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - client_context.set_ciphers("RC4") - server_context.set_ciphers("AES:RC4") + client_context.set_ciphers("AES256") + server_context.set_ciphers("RC4:AES") stats = server_params_test(client_context, server_context) ciphers = stats['server_shared_ciphers'][0] self.assertGreater(len(ciphers), 0) for name, tls_version, bits in ciphers: - self.assertIn("RC4", name.split("-")) + self.assertIn("AES256", name.split("-")) def test_read_write_after_close_raises_valuerror(self): context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 03:31:04 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 08 Jan 2015 02:31:04 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_drop_256?= 
Message-ID: <20150108023102.22411.32872@psf.io> https://hg.python.org/cpython/rev/c65c054a605d changeset: 94071:c65c054a605d user: Benjamin Peterson date: Wed Jan 07 20:30:59 2015 -0600 summary: drop 256 files: Lib/test/test_ssl.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3165,13 +3165,13 @@ def test_shared_ciphers(self): server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - client_context.set_ciphers("AES256") + client_context.set_ciphers("AES") server_context.set_ciphers("RC4:AES") stats = server_params_test(client_context, server_context) ciphers = stats['server_shared_ciphers'][0] self.assertGreater(len(ciphers), 0) for name, tls_version, bits in ciphers: - self.assertIn("AES256", name.split("-")) + self.assertIn("AES", name) def test_read_write_after_close_raises_valuerror(self): context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 03:52:46 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 08 Jan 2015 02:52:46 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_reorder_cipher_prefs?= Message-ID: <20150108025243.11591.95302@psf.io> https://hg.python.org/cpython/rev/9968783893e5 changeset: 94072:9968783893e5 user: Benjamin Peterson date: Wed Jan 07 20:52:40 2015 -0600 summary: reorder cipher prefs files: Lib/test/test_ssl.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3166,7 +3166,7 @@ server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) client_context.set_ciphers("AES") - server_context.set_ciphers("RC4:AES") + server_context.set_ciphers("AES:RC4") stats = server_params_test(client_context, server_context) ciphers = stats['server_shared_ciphers'][0] self.assertGreater(len(ciphers), 0) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 04:06:12 2015 From: python-checkins at python.org (nick.coghlan) Date: Thu, 08 Jan 2015 03:06:12 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_474=3A_Updated_forge=2Epy?= =?utf-8?q?thon=2Eorg_proposal?= Message-ID: <20150108030602.125880.91336@psf.io> https://hg.python.org/peps/rev/dadbf78cddcb changeset: 5657:dadbf78cddcb user: Nick Coghlan date: Thu Jan 08 13:05:54 2015 +1000 summary: PEP 474: Updated forge.python.org proposal files: pep-0474.txt | 242 +++++++++++++++++++++++++++++++------- 1 files changed, 197 insertions(+), 45 deletions(-) diff --git a/pep-0474.txt b/pep-0474.txt --- a/pep-0474.txt +++ b/pep-0474.txt @@ -23,18 +23,11 @@ for CPython itself (see PEP 462 in relation to that). -PEP Deferral -============ - -This PEP is deferred largely because I don't currently have time to work on -it. If anyone would like to take it over, let me know. - - Proposal ======== -This PEP proposes that, once the Kallithea project has made an official -release, that a Kallithea instance be deployed as "forge.python.org". +This PEP proposes that an instance of the self-hosted Kallithea code +repository management system be deployed as "forge.python.org". 
Individual repositories (such as the developer guide or the PEPs repository) may then be migrated from the existing hg.python.org infrastructure to the @@ -42,9 +35,24 @@ will need to decide whether to retain a read-only mirror on hg.python.org, or whether to just migrate wholesale to the new location. -This would not be a general purpose hosting site for arbitrary Python -projects, but a more narrowly focused site akin to the existing -hg.python.org. +In addition to supporting read-only mirrors on hg.python.org, +forge.python.org will also aim to support hosting mirrors on popular +proprietary hosting sites like GitHub and BitBucket. The aim will be to +allow users familiar with these sites to submit and discuss pull requests +using their preferred workflow, with forge.python.org automatically bringing +those contributions over to the master repository. + +Given the availability and popularity of commercially backed "free for open +source projects" repository hosting services, this would not be a general +purpose hosting site for arbitrary Python projects. The initial focus will be +specifically on CPython and other repositories currently hosted on +hg.python.org. In the future, this could potentially be expanded to +consolidating other PSF managed repositories that are currently externally +hosted to gain access to a pull request based workflow, such as the +repository for the python.org Django application. As with the initial +migrations, any such future migrations would be considered on a case-by-case +basis, taking into account the preferences of the primary users of each +repository. Rationale @@ -64,36 +72,42 @@ The key requirements proposed for a PSF provided software forge are: -* Must support self-hosting on PSF infrastructure without ongoing fees -* Must support Mercurial (for consistency with existing tooling) -* Must support simple "pull request" style workflows -* Must support online editing for simple changes +* MUST support simple "pull request" style workflows +* MUST support online editing for simple changes +* MUST be backed by an active development organisation (community or + commercial) -Ideally, the chosen solution would also be a fully open source application -written in Python. +Additional recommended requirements that are satisfied by this proposal, +but may be negotiable if a sufficiently compelling alternative is presented: -The requirement for self-hosting without ongoing fees rules out the +* SHOULD support self-hosting on PSF infrastructure without ongoing fees +* SHOULD be a fully open source application written in Python +* SHOULD support Mercurial (for consistency with existing tooling) +* SHOULD support Git (to provide that option to users that prefer it) +* SHOULD allow users of git and Mercurial clients to transparently + collaborate on the same repository +* SHOULD be open to customisation to meet the needs of CPython core + development, including providing a potential path forward for the + proposed migration to a core reviewer model in PEP 462 + + +The preference for self-hosting without ongoing fees rules out the free-as-in-beer providers like GitHub and BitBucket, in addition to the various proprietary source code management offerings. -The requirement for Mercurial support not only rules out GitHub, but also +The preference for Mercurial support not only rules out GitHub, but also other Git-only solutions like GitLab and Gitorious. 
-The requirement for online editing support rules out the Apache +The hard requirement for online editing support rules out the Apache Allura/HgForge combination. -That leaves two main contenders from a technical perspective: +The preference for a fully open source solution rules out RhodeCode. -* `RhodeCode `__ -* `Kallithea SCM `__ +Of the various options considered by the author of this proposal, that +leaves `Kallithea SCM `__ as the proposed +foundation for a forge.python.org service. -The `legal uncertainty -`__ over the -interaction between RhodeCode's current "Business Source" licensing model and -the various GPL components it relies on currently make it unclear whether it -is legally permissible to deploy it. - -By contrast, Kallithea is a full GPLv3 application (derived from the clearly +Kallithea is a full GPLv3 application (derived from the clearly and unambiguously GPLv3 licensed components of RhodeCode) that is being developed under the auspices of the `Software Freedom Conservancy `__. The @@ -106,7 +120,7 @@ Twisted, Gevent, BuildBot and PyPy. -Perceived Benefits +Intended Benefits ================== The primary benefit of deploying Kallithea as forge.python.org is that @@ -122,24 +136,97 @@ installation. -Technical Challenges -==================== +Sustaining Engineering Considerations +===================================== + +Even with its current workflow, CPython itself remains one of the largest +open source projects in the world (in the +`top 2% ` +of projects tracked on OpenHub). Unfortunately, we have been significantly +less effective at encouraging contributions to the projects that make up +CPython's workflow infrastructure, including ensuring that our installations +track upstream, and that wherever feasible, our own customisations are +contributed back to the original project. + +As such, a core component of this proposal is to actively engage with the +upstream Kallithea community to lower the barriers to working with and on +the Kallithea SCM, as well as with the PSF Infrastructure team to ensure +the forge.python.org service integrates cleanly with the PSF's infrastructure +automation. + +This approach aims to provide a number of key benefits: + +* allowing those of us contributing to maintenance of this service to be + as productive as possible in the time we have available +* offering a compelling professional development opportunity to those + volunteers that choose to participate in maintenance of this service +* making the Kallithea project itself more attractive to other potential + users by making it as easy as possible to adopt, deploy and manage +* as a result of the above benefits, attracting sufficient contributors both + in the upstream Kallithea community, and within the CPython infrastructure + community, to allow the forge.python.org service to evolve effectively to + meet changing developer expectations + +Some initial steps have already been taken to address these sustaining +engineering concerns: + +* Tymoteusz Jankowski has been working with Donald Stufft to work out `what + would be involved `__ + in deploying Kallithea using the PSF's Salt based infrastructure automation. +* Graham Dumpleton and I have been working on + `making it easy + +`__ to deploy + demonstration Kallithea instances to the free tier of Red Hat's open source + hosting service, OpenShift Online. 
(See the comments on that post, or the + `quickstart issue tracker + `__ for links to + Graham's follow on work) + +The next major step to be undertaken is to come up with a local development +workflow that allows contributors on Windows, Mac OS X and Linux to run +the Kallithea tests locally, without interfering with the operation of +their own system. The currently planned approach for this is to focus on +Vagrant, which is a popular automated virtual machine management system +specifically aimed at developers running local VMs for testing purposes. +The `Vagrant based development guidelines +`__ +for +OpenShift Origin provide an extended example of the kind of workflow this +approach enables. It's also worth noting that Vagrant is one of the options +for working with a local build of the `main python.org website +`__. + +If these workflow proposals end up working well for Kallithea, they may also +be worth proposing for use by the upstream projects backing other PSF and +CPython infrastructure services, including Roundup, BuildBot, and the main +python.org web site. + + +Technical Concerns and Challenges +================================= + +Introducing a new service into the CPython infrastructure presents a number +of interesting technical concerns and challenges. This section covers several +of the most significant ones. User account management ----------------------- -Ideally we'd be able to offer a single account that spans all python.org -services, including Kallithea, Roundup/Rietveld, PyPI and the back end for -the new python.org site, but actually implementing that would be a distinct -infrastructure project, independent of this PEP. +Ideally we'd like to be able to offer a single account that spans all +python.org services, including Kallithea, Roundup/Rietveld, PyPI and the +back end for the new python.org site, but actually implementing that would +be a distinct infrastructure project, independent of this PEP. -A potentially simpler near term solution would be if Kallithea's user -account management could be integrated with the existing account management -in Roundup, similar to what was done for Rietveld. However, if that also -turns out to be impractical in the near term, we would merely end up with -another identity silo to be integrated at a later date, suggesting that -this doesn't need to be considered a blocker for an initial Kallithea -deployment. +For the initial rollout of forge.python.org, we will likely create yet another +identity silo within the PSF infrastructure. A potentially superior +alternative would be to add support for `python-social-auth +`__ to Kallithea, but actually +doing so would not be a requirement for the initial rollout of the service +(the main technical concern there is that Kallithea is a Pylons application +that has not yet been ported to Pyramid, so integration will require either +adding a Pylons backend to python-social-auth, or else embarking on the +Pyramid migration in Kallithea). Breaking existing SSH access and links for Mercurial repositories @@ -153,6 +240,71 @@ break). +Integration with Roundup +------------------------ + +Kallithea provides configurable issue tracker integration. This will need +to be set up appropriately to integrate with the Roundup issue tracker at +bugs.python.org before the initial rollout of the forge.python.org service. 
+ + +Accepting pull requests on GitHub and BitBucket +----------------------------------------------- + +The initial rollout of forge.python.org would support publication of read-only +mirrors, both on hg.python.org and other services, as that is a relatively +straightforward operation that can be implemented in a commit hook. + +While a highly desirable feature, accepting pull requests on external +services, and mirroring them as submissions to the master repositories on +forge.python.org is a more complex problem, and would likely not be included +as part of the initial rollout of the forge.python.org service. + + +Transparent Git and Mercurial interoperability +---------------------------------------------- + +Kallithea's native support for both Git and Mercurial offers an opportunity +to make it relatively straightforward for developers to use the client +of their choice to interact with repositories hosted on forge.python.org. + +This transparent interoperability does *not* exist yet, but running our own +multi-VCS repository hosting service provides the opportunity to make this +capability a reality, rather than passively waiting for a proprietary +provider to deign to provide a feature that likely isn't in their commercial +interest. There's a significant misalignment of incentives between open +source communities and commercial providers in this particular area, as even +though offering VCS client choice can significantly reduce community friction +by eliminating the need for projects to make autocratic decisions that force +particular tooling choices on potential contributors, top down enforcement +of tool selection (regardless of developer preference) is currently still +the norm in the corporate and other organisational environments that produce +GitHub and Atlassian's paying customers. + +Prior to acceptance, in the absence of transparent interoperability, this PEP +should propose specific recommendations for inclusion in the CPython +developer's guide for using git-hg to create pull requests against +forge.python.org hosted Mercurial repositories. + + +Future Implications for CPython Core Development +================================================ + +The workflow requirements for the main CPython development repository are +significantly more complex than those for the repositories being discussed +in this PEP. These concerns are covered in more detail in PEP 462. + +Given Guido's recommendation to replace Reitveld with a more actively +maintained code review system, my current plan is to rewrite that PEP to +use Kallithea as the proposed glue layer, with enhanced Kallithea pull +requests eventually replacing the current practice of uploading patche files +directly to the issue tracker. + +I've also started working with Pierre Yves-David on a `custom Mercurial +extension `__ +that automates some aspects of the CPython core development workflow. + + Copyright ========= -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Thu Jan 8 04:08:11 2015 From: python-checkins at python.org (nick.coghlan) Date: Thu, 08 Jan 2015 03:08:11 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_I_never_get_Rietveld_right_fi?= =?utf-8?b?cnN0IHRpbWUuLi4=?= Message-ID: <20150108030759.8759.71976@psf.io> https://hg.python.org/peps/rev/b4e78f3d80cd changeset: 5658:b4e78f3d80cd user: Nick Coghlan date: Thu Jan 08 13:07:52 2015 +1000 summary: I never get Rietveld right first time... 
files: pep-0474.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0474.txt b/pep-0474.txt --- a/pep-0474.txt +++ b/pep-0474.txt @@ -294,7 +294,7 @@ significantly more complex than those for the repositories being discussed in this PEP. These concerns are covered in more detail in PEP 462. -Given Guido's recommendation to replace Reitveld with a more actively +Given Guido's recommendation to replace Rietveld with a more actively maintained code review system, my current plan is to rewrite that PEP to use Kallithea as the proposed glue layer, with enhanced Kallithea pull requests eventually replacing the current practice of uploading patche files -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Thu Jan 8 04:24:53 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 08 Jan 2015 03:24:53 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_trying_again?= Message-ID: <20150108032446.125894.99202@psf.io> https://hg.python.org/cpython/rev/4ee1a459540a changeset: 94073:4ee1a459540a user: Benjamin Peterson date: Wed Jan 07 21:21:34 2015 -0600 summary: trying again files: Lib/test/test_ssl.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3165,13 +3165,13 @@ def test_shared_ciphers(self): server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - client_context.set_ciphers("AES") + client_context.set_ciphers("RC4") server_context.set_ciphers("AES:RC4") stats = server_params_test(client_context, server_context) ciphers = stats['server_shared_ciphers'][0] self.assertGreater(len(ciphers), 0) for name, tls_version, bits in ciphers: - self.assertIn("AES", name) + self.assertIn("RC4", name.split("-")) def test_read_write_after_close_raises_valuerror(self): context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 05:12:47 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 08 Jan 2015 04:12:47 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_enable_cert_validation_in_?= =?utf-8?q?test?= Message-ID: <20150108041246.8745.96180@psf.io> https://hg.python.org/cpython/rev/b89c84a4426a changeset: 94074:b89c84a4426a user: Benjamin Peterson date: Wed Jan 07 22:12:43 2015 -0600 summary: enable cert validation in test files: Lib/test/test_ssl.py | 5 ++++- 1 files changed, 4 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -3164,7 +3164,10 @@ def test_shared_ciphers(self): server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) - client_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + server_context.load_cert_chain(SIGNED_CERTFILE) + client_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + client_context.verify_mode = ssl.CERT_REQUIRED + client_context.load_verify_locations(SIGNING_CA) client_context.set_ciphers("RC4") server_context.set_ciphers("AES:RC4") stats = server_params_test(client_context, server_context) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 05:49:35 2015 From: python-checkins at python.org (terry.reedy) Date: Thu, 08 Jan 2015 04:49:35 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_with_3=2E4?= Message-ID: <20150108044934.11589.19426@psf.io> 
https://hg.python.org/cpython/rev/58674a027d10 changeset: 94077:58674a027d10 parent: 94074:b89c84a4426a parent: 94076:364e44a49a31 user: Terry Jan Reedy date: Wed Jan 07 23:49:06 2015 -0500 summary: Merge with 3.4 files: Lib/idlelib/testcode.py | 31 ----------------------------- 1 files changed, 0 insertions(+), 31 deletions(-) diff --git a/Lib/idlelib/testcode.py b/Lib/idlelib/testcode.py deleted file mode 100644 --- a/Lib/idlelib/testcode.py +++ /dev/null @@ -1,31 +0,0 @@ -import string - -def f(): - a = 0 - b = 1 - c = 2 - d = 3 - e = 4 - g() - -def g(): - h() - -def h(): - i() - -def i(): - j() - -def j(): - k() - -def k(): - l() - -l = lambda: test() - -def test(): - string.capwords(1) - -f() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 05:49:35 2015 From: python-checkins at python.org (terry.reedy) Date: Thu, 08 Jan 2015 04:49:35 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzIzMTg0?= =?utf-8?q?=3A_delete_unused_idlelib_file=2E?= Message-ID: <20150108044934.11569.3337@psf.io> https://hg.python.org/cpython/rev/1c3f8d044589 changeset: 94075:1c3f8d044589 branch: 2.7 parent: 94056:67224c88144e user: Terry Jan Reedy date: Wed Jan 07 23:48:28 2015 -0500 summary: Issue #23184: delete unused idlelib file. files: Lib/idlelib/testcode.py | 31 ----------------------------- 1 files changed, 0 insertions(+), 31 deletions(-) diff --git a/Lib/idlelib/testcode.py b/Lib/idlelib/testcode.py deleted file mode 100644 --- a/Lib/idlelib/testcode.py +++ /dev/null @@ -1,31 +0,0 @@ -import string - -def f(): - a = 0 - b = 1 - c = 2 - d = 3 - e = 4 - g() - -def g(): - h() - -def h(): - i() - -def i(): - j() - -def j(): - k() - -def k(): - l() - -l = lambda: test() - -def test(): - string.capwords(1) - -f() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 05:49:35 2015 From: python-checkins at python.org (terry.reedy) Date: Thu, 08 Jan 2015 04:49:35 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMTg0?= =?utf-8?q?=3A_delete_unused_idlelib_file=2E?= Message-ID: <20150108044934.72555.53433@psf.io> https://hg.python.org/cpython/rev/364e44a49a31 changeset: 94076:364e44a49a31 branch: 3.4 parent: 94065:230a1bfb0f59 user: Terry Jan Reedy date: Wed Jan 07 23:48:46 2015 -0500 summary: Issue #23184: delete unused idlelib file. 
files: Lib/idlelib/testcode.py | 31 ----------------------------- 1 files changed, 0 insertions(+), 31 deletions(-) diff --git a/Lib/idlelib/testcode.py b/Lib/idlelib/testcode.py deleted file mode 100644 --- a/Lib/idlelib/testcode.py +++ /dev/null @@ -1,31 +0,0 @@ -import string - -def f(): - a = 0 - b = 1 - c = 2 - d = 3 - e = 4 - g() - -def g(): - h() - -def h(): - i() - -def i(): - j() - -def j(): - k() - -def k(): - l() - -l = lambda: test() - -def test(): - string.capwords(1) - -f() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 07:29:06 2015 From: python-checkins at python.org (nick.coghlan) Date: Thu, 08 Jan 2015 06:29:06 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_474_is_no_longer_deferred?= Message-ID: <20150108062900.8763.85992@psf.io> https://hg.python.org/peps/rev/64d90a1873bf changeset: 5659:64d90a1873bf user: Nick Coghlan date: Thu Jan 08 16:28:51 2015 +1000 summary: PEP 474 is no longer deferred files: pep-0474.txt | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/pep-0474.txt b/pep-0474.txt --- a/pep-0474.txt +++ b/pep-0474.txt @@ -3,11 +3,11 @@ Version: $Revision$ Last-Modified: $Date$ Author: Nick Coghlan -Status: Deferred +Status: Draft Type: Process Content-Type: text/x-rst Created: 19-Jul-2014 -Post-History: 19-Jul-2014 +Post-History: 19-Jul-2014, 08-Jan-2015 Abstract -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Thu Jan 8 07:33:02 2015 From: python-checkins at python.org (nick.coghlan) Date: Thu, 08 Jan 2015 06:33:02 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_462=3A_next_revision_will?= =?utf-8?q?_depend_on_PEP_474?= Message-ID: <20150108063258.125898.91658@psf.io> https://hg.python.org/peps/rev/a1622867aa73 changeset: 5660:a1622867aa73 user: Nick Coghlan date: Thu Jan 08 16:32:49 2015 +1000 summary: PEP 462: next revision will depend on PEP 474 files: pep-0462.txt | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/pep-0462.txt b/pep-0462.txt --- a/pep-0462.txt +++ b/pep-0462.txt @@ -6,6 +6,7 @@ Status: Deferred Type: Process Content-Type: text/x-rst +Requires: 474 Created: 23-Jan-2014 Post-History: 25-Jan-2014, 27-Jan-2014 @@ -25,8 +26,8 @@ PEP Deferral ============ -This PEP is deferred largely because I don't currently have time to work on -it. If anyone would like to take it over, let me know. +This PEP is currently deferred pending updates to redesign it around +the proposal in PEP 474 to create a Kallithea-based forge.python.org service. Rationale for changes to the core development workflow -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Thu Jan 8 07:33:55 2015 From: python-checkins at python.org (nick.coghlan) Date: Thu, 08 Jan 2015 06:33:55 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_474=3A_markup_fixes?= Message-ID: <20150108063342.125896.84081@psf.io> https://hg.python.org/peps/rev/eb2774f9583a changeset: 5661:eb2774f9583a user: Nick Coghlan date: Thu Jan 08 16:33:36 2015 +1000 summary: PEP 474: markup fixes files: pep-0474.txt | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/pep-0474.txt b/pep-0474.txt --- a/pep-0474.txt +++ b/pep-0474.txt @@ -175,8 +175,8 @@ in deploying Kallithea using the PSF's Salt based infrastructure automation. * Graham Dumpleton and I have been working on `making it easy - -`__ to deploy + + `__ to deploy demonstration Kallithea instances to the free tier of Red Hat's open source hosting service, OpenShift Online. 
(See the comments on that post, or the `quickstart issue tracker -- Repository URL: https://hg.python.org/peps From solipsis at pitrou.net Thu Jan 8 10:22:38 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 08 Jan 2015 10:22:38 +0100 Subject: [Python-checkins] Daily reference leaks (e42301eb3ce8): sum=3 Message-ID: results for e42301eb3ce8 on branch "default" -------------------------------------------- test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflog0MAUR9', '-x'] From python-checkins at python.org Thu Jan 8 12:08:46 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 08 Jan 2015 11:08:46 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogX21h?= =?utf-8?q?ke=5Fssl=5Ftransport=3A_make_the_waiter_parameter_optional?= Message-ID: <20150108110841.8739.63020@psf.io> https://hg.python.org/cpython/rev/f12b845a46c0 changeset: 94078:f12b845a46c0 branch: 3.4 parent: 94076:364e44a49a31 user: Victor Stinner date: Thu Jan 08 12:06:36 2015 +0100 summary: asyncio: _make_ssl_transport: make the waiter parameter optional files: Lib/asyncio/base_events.py | 2 +- Lib/asyncio/selector_events.py | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -201,7 +201,7 @@ """Create socket transport.""" raise NotImplementedError - def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter, *, + def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, server_side=False, server_hostname=None, extra=None, server=None): """Create SSL transport.""" diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -55,7 +55,7 @@ return _SelectorSocketTransport(self, sock, protocol, waiter, extra, server) - def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter, *, + def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, server_side=False, server_hostname=None, extra=None, server=None): return _SelectorSslTransport( @@ -165,7 +165,7 @@ else: if sslcontext: self._make_ssl_transport( - conn, protocol_factory(), sslcontext, None, + conn, protocol_factory(), sslcontext, server_side=True, extra={'peername': addr}, server=server) else: self._make_socket_transport( -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 12:08:46 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 08 Jan 2015 11:08:46 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150108110841.8767.74069@psf.io> https://hg.python.org/cpython/rev/a325e2d93c06 changeset: 94079:a325e2d93c06 parent: 94077:58674a027d10 parent: 94078:f12b845a46c0 user: Victor Stinner date: Thu Jan 08 12:07:00 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/base_events.py | 2 +- Lib/asyncio/selector_events.py | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -201,7 +201,7 @@ """Create socket transport.""" raise NotImplementedError - def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter, *, + def 
_make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, server_side=False, server_hostname=None, extra=None, server=None): """Create SSL transport.""" diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -55,7 +55,7 @@ return _SelectorSocketTransport(self, sock, protocol, waiter, extra, server) - def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter, *, + def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, server_side=False, server_hostname=None, extra=None, server=None): return _SelectorSslTransport( @@ -165,7 +165,7 @@ else: if sslcontext: self._make_ssl_transport( - conn, protocol_factory(), sslcontext, None, + conn, protocol_factory(), sslcontext, server_side=True, extra={'peername': addr}, server=server) else: self._make_socket_transport( -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 8 20:10:40 2015 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 08 Jan 2015 19:10:40 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_PEP_482=2C_483=2C_484_--_?= =?utf-8?q?type_hints=2E_The_latter_two_are_stubs=2E?= Message-ID: <20150108191038.8741.28350@psf.io> https://hg.python.org/peps/rev/7cbb166b30fd changeset: 5662:7cbb166b30fd user: Guido van Rossum date: Thu Jan 08 11:10:25 2015 -0800 summary: Add PEP 482, 483, 484 -- type hints. The latter two are stubs. files: pep-0482.txt | 222 +++++++++++++++++++++++++++++++++++++++ pep-0483.txt | 28 ++++ pep-0484.txt | 28 ++++ 3 files changed, 278 insertions(+), 0 deletions(-) diff --git a/pep-0482.txt b/pep-0482.txt new file mode 100644 --- /dev/null +++ b/pep-0482.txt @@ -0,0 +1,222 @@ +PEP: 482 +Title: Literature Overview for Type Hinting +Version: $Revision$ +Last-Modified: $Date$ +Author: ?ukasz Langa +Discussions-To: Python-Ideas +Status: Draft +Type: Informational +Content-Type: text/x-rst +Created: 08-Jan-2015 +Post-History: +Resolution: + +Abstract +======== + +This PEP is one of three related to type hinting. This PEP gives a +literature overview of related work. + + +Existing Approaches in Other Languages +====================================== + +mypy +---- + +(This section is a stub, since mypy [mypy]_ is essentially what we're +proposing.) + +ActionScript +------------ + +ActionScript [actionscript]_ is a class-based, single inheritance, +object-oriented superset of ECMAScript. It supports inferfaces and +strong runtime-checked static typing. Compilation supports a ?strict +dialect? where type mismatches are reported at compile-time. + +Example code with types:: + + package { + import flash.events.Event; + + public class BounceEvent extends Event { + public static const BOUNCE:String = "bounce"; + private var _side:String = "none"; + + public function get side():String { + return _side; + } + + public function BounceEvent(type:String, side:String){ + super(type, true); + _side = side; + } + + public override function clone():Event { + return new BounceEvent(type, _side); + } + } + } + +Dart +---- + +Dart [dart]_ is a class-based, single inheritance, object-oriented +language with C-style syntax. It supports interfaces, abstract classes, +reified generics, and optional typing. + +Types are inferred when possible. 
The runtime differentiates between two +modes of execution: *checked mode* aimed for development (catching type +errors at runtime) and *production mode* recommended for speed execution +(ignoring types and asserts). + +Example code with types:: + + class Point { + final num x, y; + + Point(this.x, this.y); + + num distanceTo(Point other) { + var dx = x - other.x; + var dy = y - other.y; + return math.sqrt(dx * dx + dy * dy); + } + } + +Hack +---- + +Hack [hack]_ is a programming language that interoperates seamlessly +with PHP. It provides opt-in static type checking, type aliasing, +generics, nullable types, and lambdas. + +Example code with types:: + + alpha(); + } + +TypeScript +---------- + +TypeScript [typescript]_ is a typed superset of JavaScript that adds +interfaces, classes, mixins and modules to the language. + +Type checks are duck typed. Multiple valid function signatures are +specified by supplying overloaded function declarations. Functions and +classes can use generics as type parametrization. Interfaces can have +optional fields. Interfaces can specify array and dictionary types. +Classes can have constructors that implicitly add arguments as fields. +Classes can have static fields. Classes can have private fields. +Classes can have getters/setters for fields (like property). Types are +inferred. + +Example code with types:: + + interface Drivable { + start(): void; + drive(distance: number): boolean; + getPosition(): number; + } + + class Car implements Drivable { + private _isRunning: boolean; + private _distanceFromStart: number; + + constructor() { + this._isRunning = false; + this._distanceFromStart = 0; + } + + public start() { + this._isRunning = true; + } + + public drive(distance: number): boolean { + if (this._isRunning) { + this._distanceFromStart += distance; + return true; + } + return false; + } + + public getPosition(): number { + return this._distanceFromStart; + } + } + + +References +========== + +.. [pep-3107] + http://www.python.org/dev/peps/pep-3107/ + +.. [mypy] + http://mypy-lang.org + +.. [obiwan] + http://pypi.python.org/pypi/obiwan + +.. [numba] + http://numba.pydata.org + +.. [pytypedecl] + https://github.com/google/pytypedecl + +.. [argumentclinic] + https://docs.python.org/3/howto/clinic.html + +.. [numpy] + http://www.numpy.org + +.. [typescript] + http://www.typescriptlang.org + +.. [hack] + http://hacklang.org + +.. [dart] + https://www.dartlang.org + +.. [actionscript] + http://livedocs.adobe.com/specs/actionscript/3/ + +.. [pyflakes] + https://github.com/pyflakes/pyflakes/ + +.. [pylint] + http://www.pylint.org + + +Copyright +========= + +This document has been placed in the public domain. + + + +.. + Local Variables: + mode: indented-text + indent-tabs-mode: nil + sentence-end-double-space: t + fill-column: 70 + coding: utf-8 + End: diff --git a/pep-0483.txt b/pep-0483.txt new file mode 100644 --- /dev/null +++ b/pep-0483.txt @@ -0,0 +1,28 @@ +PEP: 483 +Title: The Theory of Type Hinting +Version: $Revision$ +Last-Modified: $Date$ +Author: Guido van Rossum +Discussions-To: Python-Ideas +Status: Draft +Type: Informational +Content-Type: text/x-rst +Created: 08-Jan-2015 +Post-History: +Resolution: + +Abstract +======== + +This PEP is currently a stub. The content should be copied from +https://quip.com/r69HA9GhGa7J and reformatted. + + +.. 
+ Local Variables: + mode: indented-text + indent-tabs-mode: nil + sentence-end-double-space: t + fill-column: 70 + coding: utf-8 + End: diff --git a/pep-0484.txt b/pep-0484.txt new file mode 100644 --- /dev/null +++ b/pep-0484.txt @@ -0,0 +1,28 @@ +PEP: 484 +Title: Type Hints +Version: $Revision$ +Last-Modified: $Date$ +Author: Guido van Rossum , Jukka Lehtosalo , ?ukasz Langa +Discussions-To: Python-Ideas +Status: Draft +Type: Standards Track +Content-Type: text/x-rst +Created: 08-Jan-2015 +Post-History: +Resolution: + +Abstract +======== + +This PEP is currently a stub. The content should be copied from +https://github.com/ambv/typehinting (but omitting the literature overview, which is PEP 482) and reformatted. + + +.. + Local Variables: + mode: indented-text + indent-tabs-mode: nil + sentence-end-double-space: t + fill-column: 70 + coding: utf-8 + End: -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Fri Jan 9 00:11:33 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 08 Jan 2015 23:11:33 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogVHJ1?= =?utf-8?q?ncate_to_80_columns?= Message-ID: <20150108231129.22395.3850@psf.io> https://hg.python.org/cpython/rev/756d09136a2c changeset: 94080:756d09136a2c branch: 3.4 parent: 94078:f12b845a46c0 user: Victor Stinner date: Fri Jan 09 00:09:10 2015 +0100 summary: asyncio: Truncate to 80 columns files: Lib/asyncio/base_events.py | 4 +- Lib/asyncio/coroutines.py | 12 +++- Lib/asyncio/selector_events.py | 7 +- Lib/asyncio/tasks.py | 2 +- Lib/asyncio/unix_events.py | 3 +- Lib/asyncio/windows_events.py | 3 +- Lib/asyncio/windows_utils.py | 7 +- Lib/test/test_asyncio/test_base_events.py | 9 ++- Lib/test/test_asyncio/test_futures.py | 24 +++++++--- Lib/test/test_asyncio/test_streams.py | 6 +- Lib/test/test_asyncio/test_subprocess.py | 4 +- Lib/test/test_asyncio/test_tasks.py | 12 +++- 12 files changed, 60 insertions(+), 33 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -201,8 +201,8 @@ """Create socket transport.""" raise NotImplementedError - def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, - server_side=False, server_hostname=None, + def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, + *, server_side=False, server_hostname=None, extra=None, server=None): """Create SSL transport.""" raise NotImplementedError diff --git a/Lib/asyncio/coroutines.py b/Lib/asyncio/coroutines.py --- a/Lib/asyncio/coroutines.py +++ b/Lib/asyncio/coroutines.py @@ -182,14 +182,18 @@ and not inspect.isgeneratorfunction(coro.func)): filename, lineno = events._get_function_source(coro.func) if coro.gi_frame is None: - coro_repr = '%s() done, defined at %s:%s' % (coro_name, filename, lineno) + coro_repr = ('%s() done, defined at %s:%s' + % (coro_name, filename, lineno)) else: - coro_repr = '%s() running, defined at %s:%s' % (coro_name, filename, lineno) + coro_repr = ('%s() running, defined at %s:%s' + % (coro_name, filename, lineno)) elif coro.gi_frame is not None: lineno = coro.gi_frame.f_lineno - coro_repr = '%s() running at %s:%s' % (coro_name, filename, lineno) + coro_repr = ('%s() running at %s:%s' + % (coro_name, filename, lineno)) else: lineno = coro.gi_code.co_firstlineno - coro_repr = '%s() done, defined at %s:%s' % (coro_name, filename, lineno) + coro_repr = ('%s() done, defined at %s:%s' + % (coro_name, filename, lineno)) return coro_repr 
diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -55,8 +55,8 @@ return _SelectorSocketTransport(self, sock, protocol, waiter, extra, server) - def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, - server_side=False, server_hostname=None, + def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, + *, server_side=False, server_hostname=None, extra=None, server=None): return _SelectorSslTransport( self, rawsock, protocol, sslcontext, waiter, @@ -484,7 +484,8 @@ info.append('read=idle') polling = _test_selector_event(self._loop._selector, - self._sock_fd, selectors.EVENT_WRITE) + self._sock_fd, + selectors.EVENT_WRITE) if polling: state = 'polling' else: diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -68,7 +68,7 @@ return {t for t in cls._all_tasks if t._loop is loop} def __init__(self, coro, *, loop=None): - assert coroutines.iscoroutine(coro), repr(coro) # Not a coroutine function! + assert coroutines.iscoroutine(coro), repr(coro) super().__init__(loop=loop) if self._source_traceback: del self._source_traceback[-1] diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -69,7 +69,8 @@ """ if (coroutines.iscoroutine(callback) or coroutines.iscoroutinefunction(callback)): - raise TypeError("coroutines cannot be used with add_signal_handler()") + raise TypeError("coroutines cannot be used " + "with add_signal_handler()") self._check_signal(sig) self._check_closed() try: diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -424,7 +424,8 @@ else: return windows_utils.PipeHandle(handle) - return self._register(ov, None, finish_connect_pipe, wait_for_post=True) + return self._register(ov, None, finish_connect_pipe, + wait_for_post=True) def wait_for_handle(self, handle, timeout=None): """Wait for a handle. diff --git a/Lib/asyncio/windows_utils.py b/Lib/asyncio/windows_utils.py --- a/Lib/asyncio/windows_utils.py +++ b/Lib/asyncio/windows_utils.py @@ -36,15 +36,16 @@ def socketpair(family=socket.AF_INET, type=socket.SOCK_STREAM, proto=0): """A socket pair usable as a self-pipe, for Windows. - Origin: https://gist.github.com/4325783, by Geert Jansen. Public domain. + Origin: https://gist.github.com/4325783, by Geert Jansen. + Public domain. """ if family == socket.AF_INET: host = '127.0.0.1' elif family == socket.AF_INET6: host = '::1' else: - raise ValueError("Only AF_INET and AF_INET6 socket address families " - "are supported") + raise ValueError("Only AF_INET and AF_INET6 socket address " + "families are supported") if type != socket.SOCK_STREAM: raise ValueError("Only SOCK_STREAM socket type is supported") if proto != 0: diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -285,7 +285,8 @@ @mock.patch('asyncio.base_events.logger') def test__run_once_logging(self, m_logger): def slow_select(timeout): - # Sleep a bit longer than a second to avoid timer resolution issues. + # Sleep a bit longer than a second to avoid timer resolution + # issues. 
time.sleep(1.1) return [] @@ -1217,14 +1218,16 @@ self.loop.run_forever() fmt, *args = m_logger.warning.call_args[0] self.assertRegex(fmt % tuple(args), - "^Executing took .* seconds$") + "^Executing " + "took .* seconds$") # slow task asyncio.async(stop_loop_coro(self.loop), loop=self.loop) self.loop.run_forever() fmt, *args = m_logger.warning.call_args[0] self.assertRegex(fmt % tuple(args), - "^Executing took .* seconds$") + "^Executing " + "took .* seconds$") if __name__ == '__main__': diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -133,7 +133,8 @@ exc = RuntimeError() f_exception = asyncio.Future(loop=self.loop) f_exception.set_exception(exc) - self.assertEqual(repr(f_exception), '') + self.assertEqual(repr(f_exception), + '') self.assertIs(f_exception.exception(), exc) def func_repr(func): @@ -332,16 +333,21 @@ if debug: frame = source_traceback[-1] regex = (r'^Future exception was never retrieved\n' - r'future: \n' - r'source_traceback: Object created at \(most recent call last\):\n' + r'future: \n' + r'source_traceback: Object ' + r'created at \(most recent call last\):\n' r' File' r'.*\n' - r' File "{filename}", line {lineno}, in check_future_exception_never_retrieved\n' + r' File "{filename}", line {lineno}, ' + r'in check_future_exception_never_retrieved\n' r' future = asyncio\.Future\(loop=self\.loop\)$' - ).format(filename=re.escape(frame[0]), lineno=frame[1]) + ).format(filename=re.escape(frame[0]), + lineno=frame[1]) else: regex = (r'^Future exception was never retrieved\n' - r'future: $' + r'future: ' + r'$' ) exc_info = (type(exc), exc, exc.__traceback__) m_log.error.assert_called_once_with(mock.ANY, exc_info=exc_info) @@ -352,12 +358,14 @@ r'Future/Task created at \(most recent call last\):\n' r' File' r'.*\n' - r' File "{filename}", line {lineno}, in check_future_exception_never_retrieved\n' + r' File "{filename}", line {lineno}, ' + r'in check_future_exception_never_retrieved\n' r' future = asyncio\.Future\(loop=self\.loop\)\n' r'Traceback \(most recent call last\):\n' r'.*\n' r'MemoryError$' - ).format(filename=re.escape(frame[0]), lineno=frame[1]) + ).format(filename=re.escape(frame[0]), + lineno=frame[1]) else: regex = (r'^Future/Task exception was never retrieved\n' r'Traceback \(most recent call last\):\n' diff --git a/Lib/test/test_asyncio/test_streams.py b/Lib/test/test_asyncio/test_streams.py --- a/Lib/test/test_asyncio/test_streams.py +++ b/Lib/test/test_asyncio/test_streams.py @@ -613,8 +613,10 @@ watcher.attach_loop(self.loop) try: asyncio.set_child_watcher(watcher) - proc = self.loop.run_until_complete( - asyncio.create_subprocess_exec(*args, pass_fds={wfd}, loop=self.loop)) + create = asyncio.create_subprocess_exec(*args, + pass_fds={wfd}, + loop=self.loop) + proc = self.loop.run_until_complete(create) self.loop.run_until_complete(proc.wait()) finally: asyncio.set_child_watcher(None) diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -115,7 +115,9 @@ def test_send_signal(self): code = 'import time; print("sleeping", flush=True); time.sleep(3600)' args = [sys.executable, '-c', code] - create = asyncio.create_subprocess_exec(*args, loop=self.loop, stdout=subprocess.PIPE) + create = asyncio.create_subprocess_exec(*args, + stdout=subprocess.PIPE, + loop=self.loop) proc = 
self.loop.run_until_complete(create) @asyncio.coroutine diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -208,7 +208,8 @@ self.assertEqual(notmuch.__name__, 'notmuch') if PY35: self.assertEqual(notmuch.__qualname__, - 'TaskTests.test_task_repr_coro_decorator..notmuch') + 'TaskTests.test_task_repr_coro_decorator' + '..notmuch') self.assertEqual(notmuch.__module__, __name__) # test coroutine object @@ -218,7 +219,8 @@ # function, as expected, and have a qualified name (__qualname__ # attribute). coro_name = 'notmuch' - coro_qualname = 'TaskTests.test_task_repr_coro_decorator..notmuch' + coro_qualname = ('TaskTests.test_task_repr_coro_decorator' + '..notmuch') else: # On Python < 3.5, generators inherit the name of the code, not of # the function. See: http://bugs.python.org/issue21205 @@ -239,7 +241,8 @@ else: code = gen.gi_code coro = ('%s() running at %s:%s' - % (coro_qualname, code.co_filename, code.co_firstlineno)) + % (coro_qualname, code.co_filename, + code.co_firstlineno)) self.assertEqual(repr(gen), '' % coro) @@ -1678,7 +1681,8 @@ self.assertTrue(m_log.error.called) message = m_log.error.call_args[0][0] func_filename, func_lineno = test_utils.get_function_source(coro_noop) - regex = (r'^ was never yielded from\n' + regex = (r'^ ' + r'was never yielded from\n' r'Coroutine object created at \(most recent call last\):\n' r'.*\n' r' File "%s", line %s, in test_coroutine_never_yielded\n' -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 00:11:33 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 08 Jan 2015 23:11:33 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150108231129.72579.48016@psf.io> https://hg.python.org/cpython/rev/c644d49130b6 changeset: 94081:c644d49130b6 parent: 94079:a325e2d93c06 parent: 94080:756d09136a2c user: Victor Stinner date: Fri Jan 09 00:09:35 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/base_events.py | 4 +- Lib/asyncio/coroutines.py | 12 +++- Lib/asyncio/selector_events.py | 7 +- Lib/asyncio/tasks.py | 2 +- Lib/asyncio/unix_events.py | 3 +- Lib/asyncio/windows_events.py | 3 +- Lib/asyncio/windows_utils.py | 7 +- Lib/test/test_asyncio/test_base_events.py | 9 ++- Lib/test/test_asyncio/test_futures.py | 24 +++++++--- Lib/test/test_asyncio/test_streams.py | 6 +- Lib/test/test_asyncio/test_subprocess.py | 4 +- Lib/test/test_asyncio/test_tasks.py | 12 +++- 12 files changed, 60 insertions(+), 33 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -201,8 +201,8 @@ """Create socket transport.""" raise NotImplementedError - def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, - server_side=False, server_hostname=None, + def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, + *, server_side=False, server_hostname=None, extra=None, server=None): """Create SSL transport.""" raise NotImplementedError diff --git a/Lib/asyncio/coroutines.py b/Lib/asyncio/coroutines.py --- a/Lib/asyncio/coroutines.py +++ b/Lib/asyncio/coroutines.py @@ -182,14 +182,18 @@ and not inspect.isgeneratorfunction(coro.func)): filename, lineno = events._get_function_source(coro.func) if coro.gi_frame is None: - coro_repr = '%s() done, defined at %s:%s' % (coro_name, 
filename, lineno) + coro_repr = ('%s() done, defined at %s:%s' + % (coro_name, filename, lineno)) else: - coro_repr = '%s() running, defined at %s:%s' % (coro_name, filename, lineno) + coro_repr = ('%s() running, defined at %s:%s' + % (coro_name, filename, lineno)) elif coro.gi_frame is not None: lineno = coro.gi_frame.f_lineno - coro_repr = '%s() running at %s:%s' % (coro_name, filename, lineno) + coro_repr = ('%s() running at %s:%s' + % (coro_name, filename, lineno)) else: lineno = coro.gi_code.co_firstlineno - coro_repr = '%s() done, defined at %s:%s' % (coro_name, filename, lineno) + coro_repr = ('%s() done, defined at %s:%s' + % (coro_name, filename, lineno)) return coro_repr diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -55,8 +55,8 @@ return _SelectorSocketTransport(self, sock, protocol, waiter, extra, server) - def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, - server_side=False, server_hostname=None, + def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, + *, server_side=False, server_hostname=None, extra=None, server=None): return _SelectorSslTransport( self, rawsock, protocol, sslcontext, waiter, @@ -484,7 +484,8 @@ info.append('read=idle') polling = _test_selector_event(self._loop._selector, - self._sock_fd, selectors.EVENT_WRITE) + self._sock_fd, + selectors.EVENT_WRITE) if polling: state = 'polling' else: diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -68,7 +68,7 @@ return {t for t in cls._all_tasks if t._loop is loop} def __init__(self, coro, *, loop=None): - assert coroutines.iscoroutine(coro), repr(coro) # Not a coroutine function! + assert coroutines.iscoroutine(coro), repr(coro) super().__init__(loop=loop) if self._source_traceback: del self._source_traceback[-1] diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -69,7 +69,8 @@ """ if (coroutines.iscoroutine(callback) or coroutines.iscoroutinefunction(callback)): - raise TypeError("coroutines cannot be used with add_signal_handler()") + raise TypeError("coroutines cannot be used " + "with add_signal_handler()") self._check_signal(sig) self._check_closed() try: diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -424,7 +424,8 @@ else: return windows_utils.PipeHandle(handle) - return self._register(ov, None, finish_connect_pipe, wait_for_post=True) + return self._register(ov, None, finish_connect_pipe, + wait_for_post=True) def wait_for_handle(self, handle, timeout=None): """Wait for a handle. diff --git a/Lib/asyncio/windows_utils.py b/Lib/asyncio/windows_utils.py --- a/Lib/asyncio/windows_utils.py +++ b/Lib/asyncio/windows_utils.py @@ -36,15 +36,16 @@ def socketpair(family=socket.AF_INET, type=socket.SOCK_STREAM, proto=0): """A socket pair usable as a self-pipe, for Windows. - Origin: https://gist.github.com/4325783, by Geert Jansen. Public domain. + Origin: https://gist.github.com/4325783, by Geert Jansen. + Public domain. 
""" if family == socket.AF_INET: host = '127.0.0.1' elif family == socket.AF_INET6: host = '::1' else: - raise ValueError("Only AF_INET and AF_INET6 socket address families " - "are supported") + raise ValueError("Only AF_INET and AF_INET6 socket address " + "families are supported") if type != socket.SOCK_STREAM: raise ValueError("Only SOCK_STREAM socket type is supported") if proto != 0: diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -285,7 +285,8 @@ @mock.patch('asyncio.base_events.logger') def test__run_once_logging(self, m_logger): def slow_select(timeout): - # Sleep a bit longer than a second to avoid timer resolution issues. + # Sleep a bit longer than a second to avoid timer resolution + # issues. time.sleep(1.1) return [] @@ -1217,14 +1218,16 @@ self.loop.run_forever() fmt, *args = m_logger.warning.call_args[0] self.assertRegex(fmt % tuple(args), - "^Executing took .* seconds$") + "^Executing " + "took .* seconds$") # slow task asyncio.async(stop_loop_coro(self.loop), loop=self.loop) self.loop.run_forever() fmt, *args = m_logger.warning.call_args[0] self.assertRegex(fmt % tuple(args), - "^Executing took .* seconds$") + "^Executing " + "took .* seconds$") if __name__ == '__main__': diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -133,7 +133,8 @@ exc = RuntimeError() f_exception = asyncio.Future(loop=self.loop) f_exception.set_exception(exc) - self.assertEqual(repr(f_exception), '') + self.assertEqual(repr(f_exception), + '') self.assertIs(f_exception.exception(), exc) def func_repr(func): @@ -332,16 +333,21 @@ if debug: frame = source_traceback[-1] regex = (r'^Future exception was never retrieved\n' - r'future: \n' - r'source_traceback: Object created at \(most recent call last\):\n' + r'future: \n' + r'source_traceback: Object ' + r'created at \(most recent call last\):\n' r' File' r'.*\n' - r' File "{filename}", line {lineno}, in check_future_exception_never_retrieved\n' + r' File "{filename}", line {lineno}, ' + r'in check_future_exception_never_retrieved\n' r' future = asyncio\.Future\(loop=self\.loop\)$' - ).format(filename=re.escape(frame[0]), lineno=frame[1]) + ).format(filename=re.escape(frame[0]), + lineno=frame[1]) else: regex = (r'^Future exception was never retrieved\n' - r'future: $' + r'future: ' + r'$' ) exc_info = (type(exc), exc, exc.__traceback__) m_log.error.assert_called_once_with(mock.ANY, exc_info=exc_info) @@ -352,12 +358,14 @@ r'Future/Task created at \(most recent call last\):\n' r' File' r'.*\n' - r' File "{filename}", line {lineno}, in check_future_exception_never_retrieved\n' + r' File "{filename}", line {lineno}, ' + r'in check_future_exception_never_retrieved\n' r' future = asyncio\.Future\(loop=self\.loop\)\n' r'Traceback \(most recent call last\):\n' r'.*\n' r'MemoryError$' - ).format(filename=re.escape(frame[0]), lineno=frame[1]) + ).format(filename=re.escape(frame[0]), + lineno=frame[1]) else: regex = (r'^Future/Task exception was never retrieved\n' r'Traceback \(most recent call last\):\n' diff --git a/Lib/test/test_asyncio/test_streams.py b/Lib/test/test_asyncio/test_streams.py --- a/Lib/test/test_asyncio/test_streams.py +++ b/Lib/test/test_asyncio/test_streams.py @@ -613,8 +613,10 @@ watcher.attach_loop(self.loop) try: asyncio.set_child_watcher(watcher) - proc = 
self.loop.run_until_complete( - asyncio.create_subprocess_exec(*args, pass_fds={wfd}, loop=self.loop)) + create = asyncio.create_subprocess_exec(*args, + pass_fds={wfd}, + loop=self.loop) + proc = self.loop.run_until_complete(create) self.loop.run_until_complete(proc.wait()) finally: asyncio.set_child_watcher(None) diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -115,7 +115,9 @@ def test_send_signal(self): code = 'import time; print("sleeping", flush=True); time.sleep(3600)' args = [sys.executable, '-c', code] - create = asyncio.create_subprocess_exec(*args, loop=self.loop, stdout=subprocess.PIPE) + create = asyncio.create_subprocess_exec(*args, + stdout=subprocess.PIPE, + loop=self.loop) proc = self.loop.run_until_complete(create) @asyncio.coroutine diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -208,7 +208,8 @@ self.assertEqual(notmuch.__name__, 'notmuch') if PY35: self.assertEqual(notmuch.__qualname__, - 'TaskTests.test_task_repr_coro_decorator..notmuch') + 'TaskTests.test_task_repr_coro_decorator' + '..notmuch') self.assertEqual(notmuch.__module__, __name__) # test coroutine object @@ -218,7 +219,8 @@ # function, as expected, and have a qualified name (__qualname__ # attribute). coro_name = 'notmuch' - coro_qualname = 'TaskTests.test_task_repr_coro_decorator..notmuch' + coro_qualname = ('TaskTests.test_task_repr_coro_decorator' + '..notmuch') else: # On Python < 3.5, generators inherit the name of the code, not of # the function. See: http://bugs.python.org/issue21205 @@ -239,7 +241,8 @@ else: code = gen.gi_code coro = ('%s() running at %s:%s' - % (coro_qualname, code.co_filename, code.co_firstlineno)) + % (coro_qualname, code.co_filename, + code.co_firstlineno)) self.assertEqual(repr(gen), '' % coro) @@ -1678,7 +1681,8 @@ self.assertTrue(m_log.error.called) message = m_log.error.call_args[0][0] func_filename, func_lineno = test_utils.get_function_source(coro_noop) - regex = (r'^ was never yielded from\n' + regex = (r'^ ' + r'was never yielded from\n' r'Coroutine object created at \(most recent call last\):\n' r'.*\n' r' File "%s", line %s, in test_coroutine_never_yielded\n' -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 00:14:04 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 08 Jan 2015 23:14:04 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_selectors=3A_truncate_to_8?= =?utf-8?q?0_characters?= Message-ID: <20150108231350.125890.26512@psf.io> https://hg.python.org/cpython/rev/e86d0ef45e21 changeset: 94082:e86d0ef45e21 user: Victor Stinner date: Fri Jan 09 00:13:39 2015 +0100 summary: selectors: truncate to 80 characters files: Lib/selectors.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/selectors.py b/Lib/selectors.py --- a/Lib/selectors.py +++ b/Lib/selectors.py @@ -576,7 +576,8 @@ super().close() -# Choose the best implementation: roughly, epoll|kqueue|devpoll > poll > select. +# Choose the best implementation, roughly: +# epoll|kqueue|devpoll > poll > select. 
# select() also can't accept a FD > FD_SETSIZE (usually around 1024) if 'KqueueSelector' in globals(): DefaultSelector = KqueueSelector -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 01:32:40 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 00:32:40 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio_doc=29?= Message-ID: <20150109003233.72579.89276@psf.io> https://hg.python.org/cpython/rev/94ee2fdf2df3 changeset: 94084:94ee2fdf2df3 parent: 94082:e86d0ef45e21 parent: 94083:a04d87fbff4e user: Victor Stinner date: Fri Jan 09 01:32:25 2015 +0100 summary: Merge 3.4 (asyncio doc) files: Doc/library/asyncio-dev.rst | 5 ++++ Doc/library/asyncio-subprocess.rst | 19 ++++++++++++++++++ 2 files changed, 24 insertions(+), 0 deletions(-) diff --git a/Doc/library/asyncio-dev.rst b/Doc/library/asyncio-dev.rst --- a/Doc/library/asyncio-dev.rst +++ b/Doc/library/asyncio-dev.rst @@ -74,6 +74,11 @@ The :ref:`Synchronization primitives ` section describes ways to synchronize tasks. + The :ref:`Subprocess and threads ` section lists + asyncio limitations to run subprocesses from different threads. + + + .. _asyncio-handle-blocking: diff --git a/Doc/library/asyncio-subprocess.rst b/Doc/library/asyncio-subprocess.rst --- a/Doc/library/asyncio-subprocess.rst +++ b/Doc/library/asyncio-subprocess.rst @@ -297,6 +297,25 @@ ``N`` (Unix only). +.. _asyncio-subprocess-threads: + +Subprocess and threads +====================== + +asyncio supports running subprocesses from different threads, but there +are limits: + +* An event loop must run in the main thread +* The child watcher must be instantiated in the main thread, before executing + subprocesses from other threads. Call the :func:`get_child_watcher` + function in the main thread to instantiate the child watcher. + +.. seealso:: + + The :ref:`Concurrency and multithreading in asyncio + ` section. + + Subprocess examples =================== -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 01:32:40 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 00:32:40 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbyBkb2M6?= =?utf-8?q?_list_limitations_to_run_subprocesses_from_different_threads?= Message-ID: <20150109003233.72575.26860@psf.io> https://hg.python.org/cpython/rev/a04d87fbff4e changeset: 94083:a04d87fbff4e branch: 3.4 parent: 94080:756d09136a2c user: Victor Stinner date: Fri Jan 09 01:32:02 2015 +0100 summary: asyncio doc: list limitations to run subprocesses from different threads files: Doc/library/asyncio-dev.rst | 5 ++++ Doc/library/asyncio-subprocess.rst | 19 ++++++++++++++++++ 2 files changed, 24 insertions(+), 0 deletions(-) diff --git a/Doc/library/asyncio-dev.rst b/Doc/library/asyncio-dev.rst --- a/Doc/library/asyncio-dev.rst +++ b/Doc/library/asyncio-dev.rst @@ -74,6 +74,11 @@ The :ref:`Synchronization primitives ` section describes ways to synchronize tasks. + The :ref:`Subprocess and threads ` section lists + asyncio limitations to run subprocesses from different threads. + + + .. _asyncio-handle-blocking: diff --git a/Doc/library/asyncio-subprocess.rst b/Doc/library/asyncio-subprocess.rst --- a/Doc/library/asyncio-subprocess.rst +++ b/Doc/library/asyncio-subprocess.rst @@ -297,6 +297,25 @@ ``N`` (Unix only). +.. 
_asyncio-subprocess-threads: + +Subprocess and threads +====================== + +asyncio supports running subprocesses from different threads, but there +are limits: + +* An event loop must run in the main thread +* The child watcher must be instantiated in the main thread, before executing + subprocesses from other threads. Call the :func:`get_child_watcher` + function in the main thread to instantiate the child watcher. + +.. seealso:: + + The :ref:`Concurrency and multithreading in asyncio + ` section. + + Subprocess examples =================== -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 01:44:04 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 00:44:04 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogc3lu?= =?utf-8?q?c_with_Tulip?= Message-ID: <20150109004400.22395.3543@psf.io> https://hg.python.org/cpython/rev/d07386812daa changeset: 94085:d07386812daa branch: 3.4 parent: 94083:a04d87fbff4e user: Victor Stinner date: Fri Jan 09 01:42:52 2015 +0100 summary: asyncio: sync with Tulip * Document why set_result() calls are safe * Cleanup gather(). Use public methods instead of hacks to consume the exception of a future. * sock_connect(): pass directly the fd to _sock_connect_done instead of the socket. files: Lib/asyncio/queues.py | 6 ++++++ Lib/asyncio/selector_events.py | 6 +++--- Lib/asyncio/tasks.py | 11 +++++++---- 3 files changed, 16 insertions(+), 7 deletions(-) diff --git a/Lib/asyncio/queues.py b/Lib/asyncio/queues.py --- a/Lib/asyncio/queues.py +++ b/Lib/asyncio/queues.py @@ -126,6 +126,8 @@ # Use _put and _get instead of passing item straight to getter, in # case a subclass has logic that must run (e.g. JoinableQueue). self._put(item) + + # getter cannot be cancelled, we just removed done getters getter.set_result(self._get()) elif self._maxsize > 0 and self._maxsize <= self.qsize(): @@ -152,6 +154,8 @@ # Use _put and _get instead of passing item straight to getter, in # case a subclass has logic that must run (e.g. JoinableQueue). self._put(item) + + # getter cannot be cancelled, we just removed done getters getter.set_result(self._get()) elif self._maxsize > 0 and self._maxsize <= self.qsize(): @@ -200,6 +204,8 @@ item, putter = self._putters.popleft() self._put(item) # Wake putter on next tick. + + # getter cannot be cancelled, we just removed done putters putter.set_result(None) return self._get() diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -363,15 +363,15 @@ break except BlockingIOError: fut.add_done_callback(functools.partial(self._sock_connect_done, - sock)) + fd)) self.add_writer(fd, self._sock_connect_cb, fut, sock, address) except Exception as exc: fut.set_exception(exc) else: fut.set_result(None) - def _sock_connect_done(self, sock, fut): - self.remove_writer(sock.fileno()) + def _sock_connect_done(self, fd, fut): + self.remove_writer(fd) def _sock_connect_cb(self, fut, sock, address): if fut.cancelled(): diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -582,11 +582,12 @@ def _done_callback(i, fut): nonlocal nfinished - if outer._state != futures._PENDING: - if fut._exception is not None: + if outer.done(): + if not fut.cancelled(): # Mark exception retrieved. 
fut.exception() return + if fut._state == futures._CANCELLED: res = futures.CancelledError() if not return_exceptions: @@ -644,9 +645,11 @@ def _done_callback(inner): if outer.cancelled(): - # Mark inner's result as retrieved. - inner.cancelled() or inner.exception() + if not inner.cancelled(): + # Mark inner's result as retrieved. + inner.exception() return + if inner.cancelled(): outer.cancel() else: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 01:44:04 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 00:44:04 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150109004400.72565.50204@psf.io> https://hg.python.org/cpython/rev/f7aef9f3daef changeset: 94086:f7aef9f3daef parent: 94084:94ee2fdf2df3 parent: 94085:d07386812daa user: Victor Stinner date: Fri Jan 09 01:43:04 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/queues.py | 6 ++++++ Lib/asyncio/selector_events.py | 6 +++--- Lib/asyncio/tasks.py | 11 +++++++---- 3 files changed, 16 insertions(+), 7 deletions(-) diff --git a/Lib/asyncio/queues.py b/Lib/asyncio/queues.py --- a/Lib/asyncio/queues.py +++ b/Lib/asyncio/queues.py @@ -126,6 +126,8 @@ # Use _put and _get instead of passing item straight to getter, in # case a subclass has logic that must run (e.g. JoinableQueue). self._put(item) + + # getter cannot be cancelled, we just removed done getters getter.set_result(self._get()) elif self._maxsize > 0 and self._maxsize <= self.qsize(): @@ -152,6 +154,8 @@ # Use _put and _get instead of passing item straight to getter, in # case a subclass has logic that must run (e.g. JoinableQueue). self._put(item) + + # getter cannot be cancelled, we just removed done getters getter.set_result(self._get()) elif self._maxsize > 0 and self._maxsize <= self.qsize(): @@ -200,6 +204,8 @@ item, putter = self._putters.popleft() self._put(item) # Wake putter on next tick. + + # getter cannot be cancelled, we just removed done putters putter.set_result(None) return self._get() diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -363,15 +363,15 @@ break except BlockingIOError: fut.add_done_callback(functools.partial(self._sock_connect_done, - sock)) + fd)) self.add_writer(fd, self._sock_connect_cb, fut, sock, address) except Exception as exc: fut.set_exception(exc) else: fut.set_result(None) - def _sock_connect_done(self, sock, fut): - self.remove_writer(sock.fileno()) + def _sock_connect_done(self, fd, fut): + self.remove_writer(fd) def _sock_connect_cb(self, fut, sock, address): if fut.cancelled(): diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -582,11 +582,12 @@ def _done_callback(i, fut): nonlocal nfinished - if outer._state != futures._PENDING: - if fut._exception is not None: + if outer.done(): + if not fut.cancelled(): # Mark exception retrieved. fut.exception() return + if fut._state == futures._CANCELLED: res = futures.CancelledError() if not return_exceptions: @@ -644,9 +645,11 @@ def _done_callback(inner): if outer.cancelled(): - # Mark inner's result as retrieved. - inner.cancelled() or inner.exception() + if not inner.cancelled(): + # Mark inner's result as retrieved. 
+ inner.exception() return + if inner.cancelled(): outer.cancel() else: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 02:14:19 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 01:14:19 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2322038=3A_pyatomic?= =?utf-8?q?=2Eh_now_uses_stdatomic=2Eh_or_GCC_built-in_functions_for?= Message-ID: <20150109011411.8765.83201@psf.io> https://hg.python.org/cpython/rev/fbe87fb071a6 changeset: 94087:fbe87fb071a6 user: Victor Stinner date: Fri Jan 09 02:13:19 2015 +0100 summary: Issue #22038: pyatomic.h now uses stdatomic.h or GCC built-in functions for atomic memory access if available. Patch written by Vitor de Lima and Gustavo Temple. files: Include/pyatomic.h | 80 ++++++++++++++++++++++++++++++++- Misc/ACKS | 2 + Misc/NEWS | 4 + configure | 65 +++++++++++++++++++++++++++ configure.ac | 39 ++++++++++++++++ pyconfig.h.in | 6 ++ 6 files changed, 193 insertions(+), 3 deletions(-) diff --git a/Include/pyatomic.h b/Include/pyatomic.h --- a/Include/pyatomic.h +++ b/Include/pyatomic.h @@ -1,12 +1,15 @@ #ifndef Py_LIMITED_API #ifndef Py_ATOMIC_H #define Py_ATOMIC_H -/* XXX: When compilers start offering a stdatomic.h with lock-free - atomic_int and atomic_address types, include that here and rewrite - the atomic operations in terms of it. */ #include "dynamic_annotations.h" +#include "pyconfig.h" + +#if defined(HAVE_STD_ATOMIC) +#include +#endif + #ifdef __cplusplus extern "C" { #endif @@ -20,6 +23,76 @@ * Beware, the implementations here are deep magic. */ +#if defined(HAVE_STD_ATOMIC) + +typedef enum _Py_memory_order { + _Py_memory_order_relaxed = memory_order_relaxed, + _Py_memory_order_acquire = memory_order_acquire, + _Py_memory_order_release = memory_order_release, + _Py_memory_order_acq_rel = memory_order_acq_rel, + _Py_memory_order_seq_cst = memory_order_seq_cst +} _Py_memory_order; + +typedef struct _Py_atomic_address { + _Atomic void *_value; +} _Py_atomic_address; + +typedef struct _Py_atomic_int { + atomic_int _value; +} _Py_atomic_int; + +#define _Py_atomic_signal_fence(/*memory_order*/ ORDER) \ + atomic_signal_fence(ORDER) + +#define _Py_atomic_thread_fence(/*memory_order*/ ORDER) \ + atomic_thread_fence(ORDER) + +#define _Py_atomic_store_explicit(ATOMIC_VAL, NEW_VAL, ORDER) \ + atomic_store_explicit(&(ATOMIC_VAL)->_value, NEW_VAL, ORDER) + +#define _Py_atomic_load_explicit(ATOMIC_VAL, ORDER) \ + atomic_load_explicit(&(ATOMIC_VAL)->_value, ORDER) + +/* Use builtin atomic operations in GCC >= 4.7 */ +#elif defined(HAVE_BUILTIN_ATOMIC) + +typedef enum _Py_memory_order { + _Py_memory_order_relaxed = __ATOMIC_RELAXED, + _Py_memory_order_acquire = __ATOMIC_ACQUIRE, + _Py_memory_order_release = __ATOMIC_RELEASE, + _Py_memory_order_acq_rel = __ATOMIC_ACQ_REL, + _Py_memory_order_seq_cst = __ATOMIC_SEQ_CST +} _Py_memory_order; + +typedef struct _Py_atomic_address { + void *_value; +} _Py_atomic_address; + +typedef struct _Py_atomic_int { + int _value; +} _Py_atomic_int; + +#define _Py_atomic_signal_fence(/*memory_order*/ ORDER) \ + __atomic_signal_fence(ORDER) + +#define _Py_atomic_thread_fence(/*memory_order*/ ORDER) \ + __atomic_thread_fence(ORDER) + +#define _Py_atomic_store_explicit(ATOMIC_VAL, NEW_VAL, ORDER) \ + (assert((ORDER) == __ATOMIC_RELAXED \ + || (ORDER) == __ATOMIC_SEQ_CST \ + || (ORDER) == __ATOMIC_RELEASE), \ + __atomic_store_n(&(ATOMIC_VAL)->_value, NEW_VAL, ORDER)) + +#define _Py_atomic_load_explicit(ATOMIC_VAL, ORDER) \ + (assert((ORDER) == 
__ATOMIC_RELAXED \ + || (ORDER) == __ATOMIC_SEQ_CST \ + || (ORDER) == __ATOMIC_ACQUIRE \ + || (ORDER) == __ATOMIC_CONSUME), \ + __atomic_load_n(&(ATOMIC_VAL)->_value, ORDER)) + +#else + typedef enum _Py_memory_order { _Py_memory_order_relaxed, _Py_memory_order_acquire, @@ -162,6 +235,7 @@ ((ATOMIC_VAL)->_value) #endif /* !gcc x86 */ +#endif /* Standardized shortcuts. */ #define _Py_atomic_store(ATOMIC_VAL, NEW_VAL) \ diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -821,6 +821,7 @@ Shawn Ligocki Martin Ligr Gediminas Liktaras +Vitor de Lima Grant Limberg Christopher Lindblad Ulf A. Lindgren @@ -1355,6 +1356,7 @@ Amy Taylor Monty Taylor Anatoly Techtonik +Gustavo Temple Mikhail Terekhov Victor Terr?n Richard M. Tew diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,10 @@ Core and Builtins ----------------- +- Issue #22038: pyatomic.h now uses stdatomic.h or GCC built-in functions for + atomic memory access if available. Patch written by Vitor de Lima and Gustavo + Temple. + - Issue #23048: Fix jumping out of an infinite while loop in the pdb. - Issue #20335: bytes constructor now raises TypeError when encoding or errors diff --git a/configure b/configure --- a/configure +++ b/configure @@ -15703,6 +15703,71 @@ esac fi +# Check for stdatomic.h +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for stdatomic.h" >&5 +$as_echo_n "checking for stdatomic.h... " >&6; } +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + + + #include + _Atomic int value = ATOMIC_VAR_INIT(1); + int main() { + int loaded_value = atomic_load(&value); + return 0; + } + + +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + have_stdatomic_h=yes +else + have_stdatomic_h=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $have_stdatomic_h" >&5 +$as_echo "$have_stdatomic_h" >&6; } + +if test "$have_stdatomic_h" = yes; then + +$as_echo "#define HAVE_STD_ATOMIC 1" >>confdefs.h + +fi + +# Check for GCC >= 4.7 __atomic builtins +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for GCC >= 4.7 __atomic builtins" >&5 +$as_echo_n "checking for GCC >= 4.7 __atomic builtins... " >&6; } +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + + + volatile int val = 1; + int main() { + __atomic_load_n(&val, __ATOMIC_SEQ_CST); + return 0; + } + + +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + have_builtin_atomic=yes +else + have_builtin_atomic=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext + +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $have_builtin_atomic" >&5 +$as_echo "$have_builtin_atomic" >&6; } + +if test "$have_builtin_atomic" = yes; then + +$as_echo "#define HAVE_BUILTIN_ATOMIC 1" >>confdefs.h + +fi + # ensurepip option { $as_echo "$as_me:${as_lineno-$LINENO}: checking for ensurepip" >&5 $as_echo_n "checking for ensurepip... 
" >&6; } diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -4884,6 +4884,45 @@ esac fi +# Check for stdatomic.h +AC_MSG_CHECKING(for stdatomic.h) +AC_LINK_IFELSE( +[ + AC_LANG_SOURCE([[ + #include + _Atomic int value = ATOMIC_VAR_INIT(1); + int main() { + int loaded_value = atomic_load(&value); + return 0; + } + ]]) +],[have_stdatomic_h=yes],[have_stdatomic_h=no]) + +AC_MSG_RESULT($have_stdatomic_h) + +if test "$have_stdatomic_h" = yes; then + AC_DEFINE(HAVE_STD_ATOMIC, 1, [Has stdatomic.h]) +fi + +# Check for GCC >= 4.7 __atomic builtins +AC_MSG_CHECKING(for GCC >= 4.7 __atomic builtins) +AC_LINK_IFELSE( +[ + AC_LANG_SOURCE([[ + volatile int val = 1; + int main() { + __atomic_load_n(&val, __ATOMIC_SEQ_CST); + return 0; + } + ]]) +],[have_builtin_atomic=yes],[have_builtin_atomic=no]) + +AC_MSG_RESULT($have_builtin_atomic) + +if test "$have_builtin_atomic" = yes; then + AC_DEFINE(HAVE_BUILTIN_ATOMIC, 1, [Has builtin atomics]) +fi + # ensurepip option AC_MSG_CHECKING(for ensurepip) AC_ARG_WITH(ensurepip, diff --git a/pyconfig.h.in b/pyconfig.h.in --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -101,6 +101,9 @@ /* Define if `unsetenv` does not return an int. */ #undef HAVE_BROKEN_UNSETENV +/* Has builtin atomics */ +#undef HAVE_BUILTIN_ATOMIC + /* Define this if you have the type _Bool. */ #undef HAVE_C99_BOOL @@ -877,6 +880,9 @@ /* Define to 1 if you have the header file. */ #undef HAVE_STDLIB_H +/* Has stdatomic.h */ +#undef HAVE_STD_ATOMIC + /* Define to 1 if you have the `strdup' function. */ #undef HAVE_STRDUP -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Fri Jan 9 08:53:56 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 09 Jan 2015 08:53:56 +0100 Subject: [Python-checkins] Daily reference leaks (fbe87fb071a6): sum=6 Message-ID: results for fbe87fb071a6 on branch "default" -------------------------------------------- test_asyncio leaked [0, 3, 0] memory blocks, sum=3 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogCP4DXp', '-x'] From python-checkins at python.org Fri Jan 9 16:00:50 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 15:00:50 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio_doc=29?= Message-ID: <20150109150037.22405.78604@psf.io> https://hg.python.org/cpython/rev/d428b4ee8b39 changeset: 94090:d428b4ee8b39 parent: 94087:fbe87fb071a6 parent: 94089:75d7f29487ce user: Victor Stinner date: Fri Jan 09 16:00:30 2015 +0100 summary: Merge 3.4 (asyncio doc) files: Doc/library/asyncio-eventloop.rst | 10 +++++----- Doc/library/asyncio-eventloops.rst | 3 ++- 2 files changed, 7 insertions(+), 6 deletions(-) diff --git a/Doc/library/asyncio-eventloop.rst b/Doc/library/asyncio-eventloop.rst --- a/Doc/library/asyncio-eventloop.rst +++ b/Doc/library/asyncio-eventloop.rst @@ -681,12 +681,12 @@ Event loop examples -=================== +------------------- .. _asyncio-hello-world-callback: Hello World with call_soon() ----------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Example using the :meth:`BaseEventLoop.call_soon` method to schedule a callback. The callback displays ``"Hello World"`` and then stops the event @@ -716,7 +716,7 @@ .. 
_asyncio-date-callback: Display the current date with call_later() ------------------------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Example of callback displaying the current date every second. The callback uses the :meth:`BaseEventLoop.call_later` method to reschedule itself during 5 @@ -752,7 +752,7 @@ .. _asyncio-watch-read-event: Watch a file descriptor for read events ---------------------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Wait until a file descriptor received some data using the :meth:`BaseEventLoop.add_reader` method and then close the event loop:: @@ -801,7 +801,7 @@ Set signal handlers for SIGINT and SIGTERM ------------------------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Register handlers for signals :py:data:`SIGINT` and :py:data:`SIGTERM` using the :meth:`BaseEventLoop.add_signal_handler` method:: diff --git a/Doc/library/asyncio-eventloops.rst b/Doc/library/asyncio-eventloops.rst --- a/Doc/library/asyncio-eventloops.rst +++ b/Doc/library/asyncio-eventloops.rst @@ -87,7 +87,8 @@ :class:`SelectorEventLoop` specific limits: -- :class:`~selectors.SelectSelector` is used but it only supports sockets +- :class:`~selectors.SelectSelector` is used which only supports sockets + and is limited to 512 sockets. - :meth:`~BaseEventLoop.add_reader` and :meth:`~BaseEventLoop.add_writer` only accept file descriptors of sockets - Pipes are not supported -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 16:00:50 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 15:00:50 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogU2Vs?= =?utf-8?q?ectSelector_is_limited_to_512_sockets_on_Windows?= Message-ID: <20150109150037.22423.65117@psf.io> https://hg.python.org/cpython/rev/75d7f29487ce changeset: 94089:75d7f29487ce branch: 3.4 user: Victor Stinner date: Fri Jan 09 15:59:44 2015 +0100 summary: asyncio: SelectSelector is limited to 512 sockets on Windows files: Doc/library/asyncio-eventloops.rst | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Doc/library/asyncio-eventloops.rst b/Doc/library/asyncio-eventloops.rst --- a/Doc/library/asyncio-eventloops.rst +++ b/Doc/library/asyncio-eventloops.rst @@ -87,7 +87,8 @@ :class:`SelectorEventLoop` specific limits: -- :class:`~selectors.SelectSelector` is used but it only supports sockets +- :class:`~selectors.SelectSelector` is used which only supports sockets + and is limited to 512 sockets. 
- :meth:`~BaseEventLoop.add_reader` and :meth:`~BaseEventLoop.add_writer` only accept file descriptors of sockets - Pipes are not supported -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 16:00:50 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 15:00:50 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbyBkb2M6?= =?utf-8?q?_fix_section_of_event_loop_examples?= Message-ID: <20150109150036.125890.96457@psf.io> https://hg.python.org/cpython/rev/f2b9fac7d040 changeset: 94088:f2b9fac7d040 branch: 3.4 parent: 94085:d07386812daa user: Victor Stinner date: Fri Jan 09 15:58:41 2015 +0100 summary: asyncio doc: fix section of event loop examples files: Doc/library/asyncio-eventloop.rst | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Doc/library/asyncio-eventloop.rst b/Doc/library/asyncio-eventloop.rst --- a/Doc/library/asyncio-eventloop.rst +++ b/Doc/library/asyncio-eventloop.rst @@ -681,12 +681,12 @@ Event loop examples -=================== +------------------- .. _asyncio-hello-world-callback: Hello World with call_soon() ----------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Example using the :meth:`BaseEventLoop.call_soon` method to schedule a callback. The callback displays ``"Hello World"`` and then stops the event @@ -716,7 +716,7 @@ .. _asyncio-date-callback: Display the current date with call_later() ------------------------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Example of callback displaying the current date every second. The callback uses the :meth:`BaseEventLoop.call_later` method to reschedule itself during 5 @@ -752,7 +752,7 @@ .. _asyncio-watch-read-event: Watch a file descriptor for read events ---------------------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Wait until a file descriptor received some data using the :meth:`BaseEventLoop.add_reader` method and then close the event loop:: @@ -801,7 +801,7 @@ Set signal handlers for SIGINT and SIGTERM ------------------------------------------- +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Register handlers for signals :py:data:`SIGINT` and :py:data:`SIGTERM` using the :meth:`BaseEventLoop.add_signal_handler` method:: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 17:39:41 2015 From: python-checkins at python.org (brett.cannon) Date: Fri, 09 Jan 2015 16:39:41 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2323014=3A_Make_imp?= =?utf-8?q?ortlib=2Eabc=2ELoader=2Ecreate=5Fmodule=28=29_required_when?= Message-ID: <20150109163931.125896.6214@psf.io> https://hg.python.org/cpython/rev/ab72f30bcd9f changeset: 94091:ab72f30bcd9f user: Brett Cannon date: Fri Jan 09 11:39:21 2015 -0500 summary: Issue #23014: Make importlib.abc.Loader.create_module() required when importlib.abc.Loader.exec_module() is also defined. Before this change, create_module() was optional **and** could return None to trigger default semantics. This change now reduces the options for choosing default semantics to one and in the most backporting-friendly way (define create_module() to return None). 
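To make the new loader contract concrete, here is a minimal sketch written against this change; the class name NoisyLoader, the module name "noisy_demo", and the attribute it sets are invented for illustration and are not part of the patch. It follows the backporting-friendly spelling the commit message recommends: define create_module() and return None to keep the default module-creation semantics, so no DeprecationWarning is raised even though exec_module() is defined.

    import importlib.util
    import sys

    class NoisyLoader:
        # Hypothetical loader (not from the patch) that does not inherit
        # from importlib.abc.Loader, so it must spell create_module() itself.
        def create_module(self, spec):
            # Returning None requests the default module creation semantics.
            return None

        def exec_module(self, module):
            # The import machinery has already created the module; populate it.
            module.answer = 42

    spec = importlib.util.spec_from_loader("noisy_demo", NoisyLoader())
    module = importlib.util.module_from_spec(spec)   # no DeprecationWarning
    spec.loader.exec_module(module)
    sys.modules["noisy_demo"] = module
    assert module.answer == 42
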
files: Doc/library/importlib.rst | 24 +- Doc/reference/import.rst | 8 +- Doc/whatsnew/3.5.rst | 8 + Lib/importlib/_bootstrap.py | 14 + Lib/importlib/abc.py | 3 - Lib/test/test_importlib/import_/test___loader__.py | 3 + Lib/test/test_importlib/import_/test_api.py | 4 + Lib/test/test_importlib/test_spec.py | 3 + Lib/test/test_importlib/test_util.py | 12 +- Lib/test/test_pkgutil.py | 3 + Python/importlib.h | 4719 +++++---- 11 files changed, 2447 insertions(+), 2354 deletions(-) diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -347,13 +347,16 @@ .. method:: create_module(spec) - An optional method that returns the module object to use when - importing a module. create_module() may also return ``None``, - indicating that the default module creation should take place - instead. + A method that returns the module object to use when + importing a module. This method may return ``None``, + indicating that default module creation semantics should take place. .. versionadded:: 3.4 + .. versionchanged:: 3.5 + Starting in Python 3.6, this method will not be optional when + :meth:`exec_module` is defined. + .. method:: exec_module(module) An abstract method that executes the module in its own namespace @@ -417,7 +420,7 @@ .. deprecated:: 3.4 The recommended API for loading a module is :meth:`exec_module` - (and optionally :meth:`create_module`). Loaders should implement + (and :meth:`create_module`). Loaders should implement it instead of load_module(). The import machinery takes care of all the other responsibilities of load_module() when exec_module() is implemented. @@ -1136,9 +1139,9 @@ .. function:: module_from_spec(spec) - Create a new module based on **spec**. + Create a new module based on **spec** and ``spec.loader.create_module()``. - If the module object is from ``spec.loader.create_module()``, then any + If ``spec.loader.create_module()`` does not return ``None``, then any pre-existing attributes will not be reset. Also, no :exc:`AttributeError` will be raised if triggered while accessing **spec** or setting an attribute on the module. @@ -1234,9 +1237,10 @@ module has an attribute accessed. This class **only** works with loaders that define - :meth:`importlib.abc.Loader.exec_module` as control over what module type - is used for the module is required. For the same reasons, the loader - **cannot** define :meth:`importlib.abc.Loader.create_module`. Finally, + :meth:`~importlib.abc.Loader.exec_module` as control over what module type + is used for the module is required. For those same reasons, the loader's + :meth:`~importlib.abc.Loader.create_module` method will be ignored (i.e., the + loader's method should only return ``None``). Finally, modules which substitute the object placed into :attr:`sys.modules` will not work as there is no way to properly replace the module references throughout the interpreter safely; :exc:`ValueError` is raised if such a diff --git a/Doc/reference/import.rst b/Doc/reference/import.rst --- a/Doc/reference/import.rst +++ b/Doc/reference/import.rst @@ -339,6 +339,7 @@ module = None if spec.loader is not None and hasattr(spec.loader, 'create_module'): + # It is assumed 'exec_module' will also be defined on the loader. module = spec.loader.create_module(spec) if module is None: module = ModuleType(spec.name) @@ -427,7 +428,7 @@ by implementing a :meth:`~importlib.abc.Loader.create_module` method. 
It takes one argument, the module spec, and returns the new module object to use during loading. ``create_module()`` does not need to set any attributes -on the module object. If the loader does not define ``create_module()``, the +on the module object. If the method returns ``None``, the import machinery will create the new module itself. .. versionadded:: 3.4 @@ -462,6 +463,11 @@ module(s), and only if the loader itself has loaded the module(s) explicitly. +.. versionchanged:: 3.5 + A :exc:`DeprecationWarning` is raised when ``exec_module()`` is defined but + ``create_module()`` is not. Starting in Python 3.6 it will be an error to not + define ``create_module()`` on a loader attached to a ModuleSpec. + Module spec ----------- diff --git a/Doc/whatsnew/3.5.rst b/Doc/whatsnew/3.5.rst --- a/Doc/whatsnew/3.5.rst +++ b/Doc/whatsnew/3.5.rst @@ -456,6 +456,14 @@ `http.client` and `http.server` remain available for backwards compatibility. (Contributed by Demian Brecht in :issue:`21793`.) +* When an import loader defines :meth:`~importlib.machinery.Loader.exec_module` + it is now expected to also define + :meth:`~importlib.machinery.Loader.create_module` (raises a + :exc:`DeprecationWarning` now, will be an error in Python 3.6). If the loader + inherits from :class:`importlib.abc.Loader` then there is nothing to do, else + simply define :meth:`~importlib.machinery.Loader.create_module` to return + ``None`` (:issue:`23014`). + Changes in the C API -------------------- diff --git a/Lib/importlib/_bootstrap.py b/Lib/importlib/_bootstrap.py --- a/Lib/importlib/_bootstrap.py +++ b/Lib/importlib/_bootstrap.py @@ -1055,6 +1055,10 @@ # If create_module() returns `None` then it means default # module creation should be used. module = spec.loader.create_module(spec) + elif hasattr(spec.loader, 'exec_module'): + _warnings.warn('starting in Python 3.6, loaders defining exec_module() ' + 'must also define create_module()', + DeprecationWarning, stacklevel=2) if module is None: module = _new_module(spec.name) _init_module_attrs(spec, module) @@ -1298,6 +1302,10 @@ """ return cls if _imp.is_frozen(fullname) else None + @classmethod + def create_module(cls, spec): + """Use default semantics for module creation.""" + @staticmethod def exec_module(module): name = module.__spec__.name @@ -1411,6 +1419,9 @@ tail_name = fullname.rpartition('.')[2] return filename_base == '__init__' and tail_name != '__init__' + def create_module(self, spec): + """Use default semantics for module creation.""" + def exec_module(self, module): """Execute the module.""" code = self.get_code(module.__name__) @@ -1771,6 +1782,9 @@ def get_code(self, fullname): return compile('', '', 'exec', dont_inherit=True) + def create_module(self, spec): + """Use default semantics for module creation.""" + def exec_module(self, module): pass diff --git a/Lib/importlib/abc.py b/Lib/importlib/abc.py --- a/Lib/importlib/abc.py +++ b/Lib/importlib/abc.py @@ -122,9 +122,6 @@ This method should raise ImportError if anything prevents it from creating a new module. It may return None to indicate that the spec should create the new module. - - create_module() is optional. - """ # By default, defer to default semantics for the new module. 
return None diff --git a/Lib/test/test_importlib/import_/test___loader__.py b/Lib/test/test_importlib/import_/test___loader__.py --- a/Lib/test/test_importlib/import_/test___loader__.py +++ b/Lib/test/test_importlib/import_/test___loader__.py @@ -11,6 +11,9 @@ def find_spec(self, fullname, path=None, target=None): return machinery.ModuleSpec(fullname, self) + def create_module(self, spec): + return None + def exec_module(self, module): pass diff --git a/Lib/test/test_importlib/import_/test_api.py b/Lib/test/test_importlib/import_/test_api.py --- a/Lib/test/test_importlib/import_/test_api.py +++ b/Lib/test/test_importlib/import_/test_api.py @@ -17,6 +17,10 @@ return spec @staticmethod + def create_module(spec): + return None + + @staticmethod def exec_module(module): if module.__name__ == SUBMOD_NAME: raise ImportError('I cannot be loaded!') diff --git a/Lib/test/test_importlib/test_spec.py b/Lib/test/test_importlib/test_spec.py --- a/Lib/test/test_importlib/test_spec.py +++ b/Lib/test/test_importlib/test_spec.py @@ -34,6 +34,9 @@ def _is_package(self, name): return self.package + def create_module(self, spec): + return None + class NewLoader(TestLoader): diff --git a/Lib/test/test_importlib/test_util.py b/Lib/test/test_importlib/test_util.py --- a/Lib/test/test_importlib/test_util.py +++ b/Lib/test/test_importlib/test_util.py @@ -41,10 +41,16 @@ class ModuleFromSpecTests: def test_no_create_module(self): - class Loader(self.abc.Loader): - pass + class Loader: + def exec_module(self, module): + pass spec = self.machinery.ModuleSpec('test', Loader()) - module = self.util.module_from_spec(spec) + with warnings.catch_warnings(record=True) as w: + warnings.simplefilter('always') + module = self.util.module_from_spec(spec) + self.assertEqual(1, len(w)) + self.assertTrue(issubclass(w[0].category, DeprecationWarning)) + self.assertIn('create_module', str(w[0].message)) self.assertIsInstance(module, types.ModuleType) self.assertEqual(module.__name__, spec.name) diff --git a/Lib/test/test_pkgutil.py b/Lib/test/test_pkgutil.py --- a/Lib/test/test_pkgutil.py +++ b/Lib/test/test_pkgutil.py @@ -104,6 +104,9 @@ class PkgutilPEP302Tests(unittest.TestCase): class MyTestLoader(object): + def create_module(self, spec): + return None + def exec_module(self, mod): # Count how many times the module is reloaded mod.__dict__['loads'] = mod.__dict__.get('loads', 0) + 1 diff --git a/Python/importlib.h b/Python/importlib.h --- a/Python/importlib.h +++ b/Python/importlib.h [stripped] -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 19:19:40 2015 From: python-checkins at python.org (chris.angelico) Date: Fri, 09 Jan 2015 18:19:40 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Import_Guido=27s_text_from_ht?= =?utf-8?q?tps=3A//quip=2Ecom/r69HA9GhGa7J_into_PEP_483?= Message-ID: <20150109181934.125900.72027@psf.io> https://hg.python.org/peps/rev/ed50a618f8ef changeset: 5663:ed50a618f8ef user: Chris Angelico date: Sat Jan 10 05:19:10 2015 +1100 summary: Import Guido's text from https://quip.com/r69HA9GhGa7J into PEP 483 files: pep-0483.txt | 315 ++++++++++++++++++++++++++++++++++++++- 1 files changed, 311 insertions(+), 4 deletions(-) diff --git a/pep-0483.txt b/pep-0483.txt --- a/pep-0483.txt +++ b/pep-0483.txt @@ -11,11 +11,318 @@ Post-History: Resolution: -Abstract -======== +The Theory of Type Hinting +========================== -This PEP is currently a stub. The content should be copied from -https://quip.com/r69HA9GhGa7J and reformatted. 
+Guido van Rossum, Dec 19, 2014 (with a few more recent updates) + +This work is licensed under a `Creative Commons +Attribution-NonCommercial-ShareAlike 4.0 International +License `_. + + +Introduction +------------ + +This document lays out the theory of the new type hinting proposal for +Python 3.5. It's not quite a full proposal or specification because +there are many details that need to be worked out, but it lays out the +theory without which it is hard to discuss more detailed specifications. +We start by explaining gradual typing; then we state some conventions +and general rules; then we define the new special types (such as Union) +that can be used in annotations; and finally we define the approach to +generic types. (The latter section needs more fleshing out; sorry!) + + +Summary of gradual typing +------------------------- + +We define a new relationship, is-consistent-with, which is similar to +is-subclass-of, except it is not transitive when the new type **Any** is +involved. (Neither relationship is symmetric.) Assigning x to y is OK if +the type of x is consistent with the type of y. (Compare this to??... if +the type of x is a subclass of the type of y,? which states one of the +fundamentals of OO programming.) The is-consistent-with relationship is +defined by three rules: + +- A type t1 is consistent with a type t2 if t1 is a subclass of t2. + (But not the other way around.) +- **Any** is consistent with every type. (But **Any** is not a subclass + of every type.) +- Every type is a subclass of **Any**. (Which also makes every type + consistent with **Any**, via rule 1.) + +That's all! See Jeremy Siek's blog post `What is Gradual +Typing `_ +for a longer explanation and motivation. Note that rule 3 places **Any** +at the root of the class graph. This makes it very similar to +**object**. The difference is that **object** is not consistent with +most types (e.g. you can't use an object() instance where an int is +expected). IOW both **Any** and **object** mean??any type is allowed? +when used to annotate an argument, but only **Any** can be passed no +matter what type is expected (in essence, **Any** shuts up complaints +from the static checker). + +?Here's an example showing how these rules work out in practice: + +Say we have an Employee class, and a subclass Manager: + +- class Employee: ... +- class Manager(Employee): ... + +Let's say variable e is declared with type Employee: + +- e = Employee()? # type: Employee + +Now it's okay to assign a Manager instance to e (rule 1): + +- e = Manager() + +It's not okay to assign an Employee instance to a variable declared with +type Manager: + +- m = Manager()? # type: Manager +- m = Employee()? # Fails static check + +However, suppose we have a variable whose type is **Any**: + +- a = some\_func()? # type: Any + +Now it's okay to assign a to e (rule 2): + +- e = a? # OK + +Of course it's also okay to assign e to a (rule 3), but we didn't need +the concept of consistency for that: + +- a = e? # OK + + +Notational conventions +---------------------- + +- t1, t2 etc.? and u1, u2 etc. are types or classes. Sometimes we write + ti or tj to refer to??any of t1, t2, etc.? +- X, Y etc. are type variables (defined with Var(), see below). +- C, D etc. are classes defined with a class statement. +- x, y etc. are objects or instances. +- We use the terms?type and?class interchangeably, and we assume + type(x) is x.\_\_class\_\_. + + +General rules +------------- + +- Instance-ness is? derived from class-ness, e.g. 
x is an instance of + t1 if? type(x) is a subclass of t1. +- No types defined below (i.e. Any, Union etc.) can be instantiated. + (But non-abstract subclasses of Generic can be.) +- No types defined below can be subclassed, except for Generic and + classes derived from it. +- Where a type is expected, None can be substituted for type(None); + e.g. Union[t1, None] == Union[t1, type(None)]. + + +Types +----- + +- **Any**. Every class is a subclass of Any; however, to the static + type checker it is also consistent with every class (see above). +- **Union[t1, t2, ...]**. Classes that are subclass of at least one of + t1 etc. are subclasses of this. So are unions whose components are + all subclasses of t1 etc. (Example: Union[int, str] is a subclass of + Union[int, float, str].) The order of the arguments doesn't matter. + (Example: Union[int, str] == Union[str, int].) If ti is itself a + Union the result is flattened. (Example: Union[int, Union[float, + str]] == Union[int, float, str].) If ti and tj have a subclass + relationship, the less specific type survives. (Example: + Union[Employee, Manager] == Union[Employee].)?Union[t1] returns just + t1. Union[] is illegal, so is Union[()]. Corollary: Union[..., Any, + ...] returns Any; Union[..., object, ...] returns object; to cut a + tie, Union[Any, object] == Union[object, Any] == Any. +- **Optional[t1]**. Alias for Union[t1, None], i.e. Union[t1, + type(None)]. +- **Tuple[t1, t2, ..., tn]**. A tuple whose items are instances of t1 + etc.. Example: Tuple[int, float] means a tuple of two items, the + first is an int, the second a float; e.g., (42, 3.14). Tuple[u1, u2, + ..., um] is a subclass of Tuple[t1, t2, ..., tn] if they have the + same length (n==m) and each ui is a subclass of ti. To spell the type + of the empty tuple, use Tuple[()]. There is no way to define a + variadic tuple type. (TODO: Maybe Tuple[t1, ...] with literal + ellipsis?) +- **Callable[[t1, t2, ..., tn], tr]**. A function with positional + argument types t1 etc., and return type tr. The argument list may be + empty (n==0). There is no way to indicate optional or keyword + arguments, nor varargs (we don't need to spell those often enough to + complicate the syntax ? however, Reticulated Python has a useful idea + here). This is covariant in the return type, but contravariant in the + arguments. ?Covariant? here means that for two callable types that + differ only in the return type, the subclass relationship for the + callable types follows that of the return types. (Example: + Callable[[], Manager] is a subclass of Callable[[], Employee].) + ?Contravariant? here means that for two callable types that differ + only in the type of one argument, the subclass relationship for the + callable types goes in the opposite direction as for the argument + types. (Example: Callable[[Employee], None] is a subclass of + Callable[[Mananger], None]. Yes, you read that right.) + +We might add: + +- **Intersection[t1, t2, ...]**. Classes that are subclass of *each* of + t1, etc are subclasses of this. (Compare to Union, which has *at + least one* instead of *each* in its definition.) The order of the + arguments doesn't matter. Nested intersections are flattened, e.g. + Intersection[int, Intersection[float, str]] == Intersection[int, + float, str]. An intersection of fewer types is a subclass of an + intersection of more types, e.g. Intersection[int, str] is a subclass + of Intersection[int, float, str]. An intersection of one argument is + just that argument, e.g. 
Intersection[int] is int. When argument have + a subclass relationship, the more specific class survives, e.g. + Intersection[str, Employee, Manager] is Intersection[str, Manager]. + Intersection[] is illegal, so is Intersection[()]. Corollary: Any + disappears from the argument list, e.g. Intersection[int, str, Any] + == Intersection[int, str]. Intersection[Any, object] is object. The + interaction between Intersection and Union is complex but should be + no surprise if you understand the interaction between intersections + and unions in set theory (note that sets of types can be infinite in + size, since there is no limit on the number of new subclasses). + + +Pragmatics +---------- + +Some things are irrelevant to the theory but make practical use more +convenient. (This is not a full list; I probably missed a few and some +are still controversial or not fully specified.) + +- Type aliases, e.g. + + * point = Tuple[float, float] + * def distance(p: point) -> float: ...? + +- Forward references via strings, e.g. + + * class C: + + + def compare(self, other: ?C?) -> int: ... + +- If a default of None is specified, the type is implicitly optional, e.g. + + * def get(key: KT, default: VT = None) -> VT: ... + +- Don't use dynamic type expressions; use builtins and imported types + only. No 'if'. + + * def display(message: str if WINDOWS else bytes):? # NOT OK + +- Type declaration in comments, e.g. + + * x = []? # type: Sequence[int] + +- Type declarations using Undefined, e.g. + + * x = Undefined(str) + +- Other things, e.g. casts, overloading and stub modules; best left to an + actual PEP. + + +Generic types +------------- + +(TODO: Explain more. See also the `mypy docs on +generics `_.) + +- **X = Var('X')**. Declares a unique type variable. The name must match + the variable name. + +- **Y = Var('Y', t1, t2, ...).** Ditto, constrained to t1 etc. Behaves + ?like Union[t1, t2, ...] for most purposes, but when used as a type + variable, subclasses of t1 etc. are replaced by the most-derived base + class among t1 etc. + +- Example of constrained type variables: + + * AnyStr = Var('AnyStr', str, bytes) + + * def longest(a: AnyStr, b: AnyStr) -> AnyStr: + + - return a if len(a) >= len(b) else b + + * x = longest('a', 'abc')? # The inferred type for x is str + + * y = longest('a', b'abc')? # Fails static type check + + * In this example, both arguments to longest() must have the same type + (str or bytes), and moreover, even if the arguments are instances of a + common str subclass, the return type is still str, not that subclass + (see next example). + +- For comparison, if the type variable was unconstrained, the common + subclass would be chosen as the return type, e.g.: + + * S = Var('S') + + * def longest(a: S, b: S) -> S: + + - return a if len(a) >= b else b + + * class MyStr(str): ... + + * x = longest(MyStr('a'), MyStr('abc')) + + * The inferred type of x is MyStr (whereas in the AnyStr example it would + be str). + +- Also for comparison, if a Union is used, the return type also has to be + a Union: + + * U = Union[str, bytes] + + * def longest(a: U, b: U) -> U: + + - return a if len(a) >= b else b + + * x = longest('a', 'abc') + + * The inferred type of x is still Union[str, bytes], even though both + arguments are str. + +- **class C(Generic[X, Y, ...]):** ... Define a generic class C over type + variables X etc. C itself becomes parameterizable, e.g. C[int, str, ...] + is a specific class with substitutions X?int etc. + +- TODO: Explain use of generic types in function signatures. 
E.g. + Sequence[X], Sequence[int], Sequence[Tuple[X, Y, Z]], and mixtures. + Think about co\*variance. No gimmicks like deriving from + Sequence[Union[int, str]] or Sequence[Union[int, X]]. + +- **Protocol**. Similar to Generic but uses structural equivalence. (TODO: + Explain, and think about co\*variance.) + + +Predefined generic types and Protocols in typing.py +--------------------------------------------------- + +(See also the `mypy typing.py +module `_.) + +- Everything from collections.abc (but Set renamed to AbstractSet). +- Dict, List, Set, a few more. (FrozenSet?) +- Pattern, Match. (Why?) +- IO, TextIO, BinaryIO. (Why?) + + +Another reference +----------------- + +Lest mypy gets all the attention, I should mention?\ `Reticulated +Python `_ by Michael Vitousek +as an example of a slightly different approach to gradual typing for +Python. It is described in an actual `academic +paper `_ +written by Vitousek with Jeremy Siek and Jim Baker (the latter of Jython +fame). .. -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Fri Jan 9 19:29:54 2015 From: python-checkins at python.org (chris.angelico) Date: Fri, 09 Jan 2015 18:29:54 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_ASCIIfy_PEP_483?= Message-ID: <20150109182942.11571.43952@psf.io> https://hg.python.org/peps/rev/003429214c06 changeset: 5664:003429214c06 user: Chris Angelico date: Sat Jan 10 05:28:21 2015 +1100 summary: ASCIIfy PEP 483 files: pep-0483.txt | 56 ++++++++++++++++++++-------------------- 1 files changed, 28 insertions(+), 28 deletions(-) diff --git a/pep-0483.txt b/pep-0483.txt --- a/pep-0483.txt +++ b/pep-0483.txt @@ -40,8 +40,8 @@ We define a new relationship, is-consistent-with, which is similar to is-subclass-of, except it is not transitive when the new type **Any** is involved. (Neither relationship is symmetric.) Assigning x to y is OK if -the type of x is consistent with the type of y. (Compare this to??... if -the type of x is a subclass of the type of y,? which states one of the +the type of x is consistent with the type of y. (Compare this to "... if +the type of x is a subclass of the type of y," which states one of the fundamentals of OO programming.) The is-consistent-with relationship is defined by three rules: @@ -58,12 +58,12 @@ at the root of the class graph. This makes it very similar to **object**. The difference is that **object** is not consistent with most types (e.g. you can't use an object() instance where an int is -expected). IOW both **Any** and **object** mean??any type is allowed? +expected). IOW both **Any** and **object** mean "any type is allowed" when used to annotate an argument, but only **Any** can be passed no matter what type is expected (in essence, **Any** shuts up complaints from the static checker). -?Here's an example showing how these rules work out in practice: +Here's an example showing how these rules work out in practice: Say we have an Employee class, and a subclass Manager: @@ -72,7 +72,7 @@ Let's say variable e is declared with type Employee: -- e = Employee()? # type: Employee +- e = Employee() # type: Employee Now it's okay to assign a Manager instance to e (rule 1): @@ -81,40 +81,40 @@ It's not okay to assign an Employee instance to a variable declared with type Manager: -- m = Manager()? # type: Manager -- m = Employee()? # Fails static check +- m = Manager() # type: Manager +- m = Employee() # Fails static check However, suppose we have a variable whose type is **Any**: -- a = some\_func()? 
# type: Any +- a = some\_func() # type: Any Now it's okay to assign a to e (rule 2): -- e = a? # OK +- e = a # OK Of course it's also okay to assign e to a (rule 3), but we didn't need the concept of consistency for that: -- a = e? # OK +- a = e # OK Notational conventions ---------------------- -- t1, t2 etc.? and u1, u2 etc. are types or classes. Sometimes we write - ti or tj to refer to??any of t1, t2, etc.? +- t1, t2 etc. and u1, u2 etc. are types or classes. Sometimes we write + ti or tj to refer to "any of t1, t2, etc." - X, Y etc. are type variables (defined with Var(), see below). - C, D etc. are classes defined with a class statement. - x, y etc. are objects or instances. -- We use the terms?type and?class interchangeably, and we assume +- We use the terms type and class interchangeably, and we assume type(x) is x.\_\_class\_\_. General rules ------------- -- Instance-ness is? derived from class-ness, e.g. x is an instance of - t1 if? type(x) is a subclass of t1. +- Instance-ness is derived from class-ness, e.g. x is an instance of + t1 if type(x) is a subclass of t1. - No types defined below (i.e. Any, Union etc.) can be instantiated. (But non-abstract subclasses of Generic can be.) - No types defined below can be subclassed, except for Generic and @@ -136,7 +136,7 @@ Union the result is flattened. (Example: Union[int, Union[float, str]] == Union[int, float, str].) If ti and tj have a subclass relationship, the less specific type survives. (Example: - Union[Employee, Manager] == Union[Employee].)?Union[t1] returns just + Union[Employee, Manager] == Union[Employee].) Union[t1] returns just t1. Union[] is illegal, so is Union[()]. Corollary: Union[..., Any, ...] returns Any; Union[..., object, ...] returns object; to cut a tie, Union[Any, object] == Union[object, Any] == Any. @@ -154,13 +154,13 @@ argument types t1 etc., and return type tr. The argument list may be empty (n==0). There is no way to indicate optional or keyword arguments, nor varargs (we don't need to spell those often enough to - complicate the syntax ? however, Reticulated Python has a useful idea + complicate the syntax - however, Reticulated Python has a useful idea here). This is covariant in the return type, but contravariant in the - arguments. ?Covariant? here means that for two callable types that + arguments. "Covariant" here means that for two callable types that differ only in the return type, the subclass relationship for the callable types follows that of the return types. (Example: Callable[[], Manager] is a subclass of Callable[[], Employee].) - ?Contravariant? here means that for two callable types that differ + "Contravariant" here means that for two callable types that differ only in the type of one argument, the subclass relationship for the callable types goes in the opposite direction as for the argument types. (Example: Callable[[Employee], None] is a subclass of @@ -198,13 +198,13 @@ - Type aliases, e.g. * point = Tuple[float, float] - * def distance(p: point) -> float: ...? + * def distance(p: point) -> float: ... - Forward references via strings, e.g. * class C: - + def compare(self, other: ?C?) -> int: ... + + def compare(self, other: "C") -> int: ... - If a default of None is specified, the type is implicitly optional, e.g. @@ -213,11 +213,11 @@ - Don't use dynamic type expressions; use builtins and imported types only. No 'if'. - * def display(message: str if WINDOWS else bytes):? # NOT OK + * def display(message: str if WINDOWS else bytes): # NOT OK - Type declaration in comments, e.g. 
- * x = []? # type: Sequence[int] + * x = [] # type: Sequence[int] - Type declarations using Undefined, e.g. @@ -237,7 +237,7 @@ the variable name. - **Y = Var('Y', t1, t2, ...).** Ditto, constrained to t1 etc. Behaves - ?like Union[t1, t2, ...] for most purposes, but when used as a type + like Union[t1, t2, ...] for most purposes, but when used as a type variable, subclasses of t1 etc. are replaced by the most-derived base class among t1 etc. @@ -249,9 +249,9 @@ - return a if len(a) >= len(b) else b - * x = longest('a', 'abc')? # The inferred type for x is str + * x = longest('a', 'abc') # The inferred type for x is str - * y = longest('a', b'abc')? # Fails static type check + * y = longest('a', b'abc') # Fails static type check * In this example, both arguments to longest() must have the same type (str or bytes), and moreover, even if the arguments are instances of a @@ -290,7 +290,7 @@ - **class C(Generic[X, Y, ...]):** ... Define a generic class C over type variables X etc. C itself becomes parameterizable, e.g. C[int, str, ...] - is a specific class with substitutions X?int etc. + is a specific class with substitutions X->int etc. - TODO: Explain use of generic types in function signatures. E.g. Sequence[X], Sequence[int], Sequence[Tuple[X, Y, Z]], and mixtures. @@ -316,7 +316,7 @@ Another reference ----------------- -Lest mypy gets all the attention, I should mention?\ `Reticulated +Lest mypy gets all the attention, I should mention \ `Reticulated Python `_ by Michael Vitousek as an example of a slightly different approach to gradual typing for Python. It is described in an actual `academic -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Fri Jan 9 19:40:04 2015 From: python-checkins at python.org (chris.angelico) Date: Fri, 09 Jan 2015 18:40:04 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Record_Guido=27s_relicensing_?= =?utf-8?q?of_PEP_483_under_Open_Publication_License?= Message-ID: <20150109184000.125886.5154@psf.io> https://hg.python.org/peps/rev/1cdccde25189 changeset: 5665:1cdccde25189 user: Chris Angelico date: Sat Jan 10 05:39:44 2015 +1100 summary: Record Guido's relicensing of PEP 483 under Open Publication License files: pep-0483.txt | 20 +++++++++++++------- 1 files changed, 13 insertions(+), 7 deletions(-) diff --git a/pep-0483.txt b/pep-0483.txt --- a/pep-0483.txt +++ b/pep-0483.txt @@ -14,13 +14,6 @@ The Theory of Type Hinting ========================== -Guido van Rossum, Dec 19, 2014 (with a few more recent updates) - -This work is licensed under a `Creative Commons -Attribution-NonCommercial-ShareAlike 4.0 International -License `_. - - Introduction ------------ @@ -324,6 +317,19 @@ written by Vitousek with Jeremy Siek and Jim Baker (the latter of Jython fame). + +Copyright +========= + +This document is licensed under the `Open Publication License`_. + + +References and Footnotes +======================== + +.. _Open Publication License: http://www.opencontent.org/openpub/ + + .. 
Local Variables: -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Fri Jan 9 19:55:35 2015 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 09 Jan 2015 18:55:35 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_brief_section_on_Reticula?= =?utf-8?q?ted_Python=2C_plus_stubs_for_other_Pythonic_projects=2E?= Message-ID: <20150109185515.11591.5159@psf.io> https://hg.python.org/peps/rev/39fedd54da48 changeset: 5666:39fedd54da48 user: Guido van Rossum date: Fri Jan 09 10:52:06 2015 -0800 summary: Add brief section on Reticulated Python, plus stubs for other Pythonic projects. files: pep-0482.txt | 35 ++++++++++++++++++++++++++++++----- 1 files changed, 30 insertions(+), 5 deletions(-) diff --git a/pep-0482.txt b/pep-0482.txt --- a/pep-0482.txt +++ b/pep-0482.txt @@ -18,8 +18,9 @@ literature overview of related work. -Existing Approaches in Other Languages -====================================== +Existing Approaches for Python +============================== + mypy ---- @@ -27,6 +28,27 @@ (This section is a stub, since mypy [mypy]_ is essentially what we're proposing.) + +Reticulated Python +------------------ + +Reticulated Python [reticulated]_ by Michael Vitousek is an example of +a slightly different approach to gradual typing for Python. It is +described in an actual academic paper [reticulated-paper]_ written by +Vitousek with Jeremy Siek and Jim Baker (the latter of Jython fame). + + +Others +------ + +TBD: Add sections on pyflakes [pyflakes]_, pylint [pylint]_, numpy +[numpy]_, Argument Clinic [argumentclinic]_, pytypedecl [pytypedecl]_, +numba [numba]_, obiwan [obiwan]_. + + +Existing Approaches in Other Languages +====================================== + ActionScript ------------ @@ -165,12 +187,15 @@ References ========== -.. [pep-3107] - http://www.python.org/dev/peps/pep-3107/ - .. [mypy] http://mypy-lang.org +.. [reticulated] + https://github.com/mvitousek/reticulated + +.. [reticulated-paper] + http://wphomes.soic.indiana.edu/jsiek/files/2014/03/retic-python.pdf + .. [obiwan] http://pypi.python.org/pypi/obiwan -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Fri Jan 9 19:55:35 2015 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 09 Jan 2015 18:55:35 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Change_the_level_of_a_few_hea?= =?utf-8?q?dings=2E_Remove_duplicate_title=2E?= Message-ID: <20150109185515.8741.71799@psf.io> https://hg.python.org/peps/rev/ac3c4fefd85f changeset: 5668:ac3c4fefd85f user: Guido van Rossum date: Fri Jan 09 10:55:11 2015 -0800 summary: Change the level of a few headings. Remove duplicate title. files: pep-0483.txt | 11 ++++++----- 1 files changed, 6 insertions(+), 5 deletions(-) diff --git a/pep-0483.txt b/pep-0483.txt --- a/pep-0483.txt +++ b/pep-0483.txt @@ -11,17 +11,14 @@ Post-History: Resolution: -The Theory of Type Hinting -========================== - Abstract --------- +======== This PEP lays out the theory to be referenced by PEP 484. Introduction ------------- +============ This document lays out the theory of the new type hinting proposal for Python 3.5. It's not quite a full proposal or specification because @@ -33,6 +30,10 @@ generic types. (The latter section needs more fleshing out; sorry!) 
+Specification +============= + + Summary of gradual typing ------------------------- -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Fri Jan 9 19:55:35 2015 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 09 Jan 2015 18:55:35 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Move_text_on_Reticulated_Pyth?= =?utf-8?q?on_to_PEP_483=2E__Add_an_abstract=2E__Set_creation?= Message-ID: <20150109185515.125900.78716@psf.io> https://hg.python.org/peps/rev/67ccdc3a0390 changeset: 5667:67ccdc3a0390 user: Guido van Rossum date: Fri Jan 09 10:53:06 2015 -0800 summary: Move text on Reticulated Python to PEP 483. Add an abstract. Set creation date to 19-Dec-2014 (cretion date of the original Quip doc). files: pep-0483.txt | 20 +++++++------------- 1 files changed, 7 insertions(+), 13 deletions(-) diff --git a/pep-0483.txt b/pep-0483.txt --- a/pep-0483.txt +++ b/pep-0483.txt @@ -7,13 +7,19 @@ Status: Draft Type: Informational Content-Type: text/x-rst -Created: 08-Jan-2015 +Created: 19-Dec-2014 Post-History: Resolution: The Theory of Type Hinting ========================== +Abstract +-------- + +This PEP lays out the theory to be referenced by PEP 484. + + Introduction ------------ @@ -306,18 +312,6 @@ - IO, TextIO, BinaryIO. (Why?) -Another reference ------------------ - -Lest mypy gets all the attention, I should mention \ `Reticulated -Python `_ by Michael Vitousek -as an example of a slightly different approach to gradual typing for -Python. It is described in an actual `academic -paper `_ -written by Vitousek with Jeremy Siek and Jim Baker (the latter of Jython -fame). - - Copyright ========= -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Fri Jan 9 21:36:37 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 20:36:37 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMjA5?= =?utf-8?q?=3A_Break_some_reference_cycles_in_asyncio=2E_Patch_written_by_?= =?utf-8?q?Martin?= Message-ID: <20150109203626.8745.28461@psf.io> https://hg.python.org/cpython/rev/376c5398f28d changeset: 94094:376c5398f28d branch: 3.4 parent: 94092:b6a636823c8c user: Victor Stinner date: Fri Jan 09 21:34:27 2015 +0100 summary: Issue #23209: Break some reference cycles in asyncio. Patch written by Martin Richard. 
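As a rough illustration of the cycle-breaking pattern this patch applies, the hypothetical protocol below (invented for the example, not part of the patch) holds a back-reference to the object that owns it and simply drops that reference once the connection is gone, mirroring the `self.proc = None` line added to base_subprocess.py in the diff that follows.

    import asyncio

    class DemoPipeProtocol(asyncio.Protocol):
        # Hypothetical protocol: the owner also references this protocol,
        # which forms a reference cycle while the pipe is open.
        def __init__(self, owner, fd):
            self.owner = owner
            self.fd = fd

        def connection_lost(self, exc):
            # Notify the owner, then drop the back-reference so the pair
            # no longer depends on the cycle collector to be freed.
            self.owner._pipe_connection_lost(self.fd, exc)
            self.owner = None
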
files: Lib/asyncio/base_subprocess.py | 1 + Lib/asyncio/futures.py | 2 +- Lib/selectors.py | 1 + 3 files changed, 3 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/base_subprocess.py b/Lib/asyncio/base_subprocess.py --- a/Lib/asyncio/base_subprocess.py +++ b/Lib/asyncio/base_subprocess.py @@ -182,6 +182,7 @@ def connection_lost(self, exc): self.disconnected = True self.proc._pipe_connection_lost(self.fd, exc) + self.proc = None def pause_writing(self): self.proc._protocol.pause_writing() diff --git a/Lib/asyncio/futures.py b/Lib/asyncio/futures.py --- a/Lib/asyncio/futures.py +++ b/Lib/asyncio/futures.py @@ -405,5 +405,5 @@ new_future.add_done_callback(_check_cancel_other) fut.add_done_callback( lambda future: loop.call_soon_threadsafe( - new_future._copy_state, fut)) + new_future._copy_state, future)) return new_future diff --git a/Lib/selectors.py b/Lib/selectors.py --- a/Lib/selectors.py +++ b/Lib/selectors.py @@ -256,6 +256,7 @@ def close(self): self._fd_to_key.clear() + self._map = None def get_map(self): return self._map -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 21:36:37 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 20:36:37 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150109203626.22425.7195@psf.io> https://hg.python.org/cpython/rev/242ef936add7 changeset: 94093:242ef936add7 parent: 94091:ab72f30bcd9f parent: 94092:b6a636823c8c user: Victor Stinner date: Fri Jan 09 21:32:24 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/futures.py | 2 - Lib/asyncio/proactor_events.py | 3 +- Lib/asyncio/selector_events.py | 1 - Lib/asyncio/streams.py | 10 +++++-- Lib/asyncio/unix_events.py | 5 +--- Lib/test/test_asyncio/test_streams.py | 19 +++++++++++++++ 6 files changed, 29 insertions(+), 11 deletions(-) diff --git a/Lib/asyncio/futures.py b/Lib/asyncio/futures.py --- a/Lib/asyncio/futures.py +++ b/Lib/asyncio/futures.py @@ -20,7 +20,6 @@ _PY34 = sys.version_info >= (3, 4) -# TODO: Do we really want to depend on concurrent.futures internals? Error = concurrent.futures._base.Error CancelledError = concurrent.futures.CancelledError TimeoutError = concurrent.futures.TimeoutError @@ -30,7 +29,6 @@ class InvalidStateError(Error): """The operation is not allowed in this state.""" - # TODO: Show the future, its state, the method, and the required state. class _TracebackLogger: diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -487,7 +487,8 @@ self.call_soon(loop) def _process_events(self, event_list): - pass # XXX hard work currently done in poll + # Events are processed in the IocpProactor._poll() method + pass def _stop_accept_futures(self): for future in self._accept_futures.values(): diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -145,7 +145,6 @@ pass # False alarm. except OSError as exc: # There's nowhere to send the error, so just log it. - # TODO: Someone will want an error handler for this. if exc.errno in (errno.EMFILE, errno.ENFILE, errno.ENOBUFS, errno.ENOMEM): # Some platforms (e.g. 
Linux keep reporting the FD as diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -145,7 +145,10 @@ """ def __init__(self, loop=None): - self._loop = loop # May be None; we may never need it. + if loop is None: + self._loop = events.get_event_loop() + else: + self._loop = loop self._paused = False self._drain_waiter = None self._connection_lost = False @@ -306,8 +309,9 @@ # it also doubles as half the buffer limit. self._limit = limit if loop is None: - loop = events.get_event_loop() - self._loop = loop + self._loop = events.get_event_loop() + else: + self._loop = loop self._buffer = bytearray() self._eof = False # Whether we're done. self._waiter = None # A future. diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -496,9 +496,6 @@ def can_write_eof(self): return True - # TODO: Make the relationships between write_eof(), close(), - # abort(), _fatal_error() and _close() more straightforward. - def write_eof(self): if self._closing: return @@ -897,7 +894,7 @@ class _UnixDefaultEventLoopPolicy(events.BaseDefaultEventLoopPolicy): - """XXX""" + """UNIX event loop policy with a watcher for child processes.""" _loop_factory = _UnixSelectorEventLoop def __init__(self): diff --git a/Lib/test/test_asyncio/test_streams.py b/Lib/test/test_asyncio/test_streams.py --- a/Lib/test/test_asyncio/test_streams.py +++ b/Lib/test/test_asyncio/test_streams.py @@ -625,6 +625,25 @@ data = self.loop.run_until_complete(reader.read(-1)) self.assertEqual(data, b'data') + def test_streamreader_constructor(self): + self.addCleanup(asyncio.set_event_loop, None) + asyncio.set_event_loop(self.loop) + + # Tulip issue #184: Ensure that StreamReaderProtocol constructor + # retrieves the current loop if the loop parameter is not set + reader = asyncio.StreamReader() + self.assertIs(reader._loop, self.loop) + + def test_streamreaderprotocol_constructor(self): + self.addCleanup(asyncio.set_event_loop, None) + asyncio.set_event_loop(self.loop) + + # Tulip issue #184: Ensure that StreamReaderProtocol constructor + # retrieves the current loop if the loop parameter is not set + reader = mock.Mock() + protocol = asyncio.StreamReaderProtocol(reader) + self.assertIs(protocol._loop, self.loop) + if __name__ == '__main__': unittest.main() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 21:36:38 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 20:36:38 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?b?KTogTWVyZ2UgMy40IChhc3luY2lvLCBzZWxlY3RvcnMp?= Message-ID: <20150109203626.11569.71704@psf.io> https://hg.python.org/cpython/rev/880a1febd6f5 changeset: 94095:880a1febd6f5 parent: 94093:242ef936add7 parent: 94094:376c5398f28d user: Victor Stinner date: Fri Jan 09 21:35:03 2015 +0100 summary: Merge 3.4 (asyncio, selectors) files: Lib/asyncio/base_subprocess.py | 1 + Lib/asyncio/futures.py | 2 +- Lib/selectors.py | 1 + 3 files changed, 3 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/base_subprocess.py b/Lib/asyncio/base_subprocess.py --- a/Lib/asyncio/base_subprocess.py +++ b/Lib/asyncio/base_subprocess.py @@ -182,6 +182,7 @@ def connection_lost(self, exc): self.disconnected = True self.proc._pipe_connection_lost(self.fd, exc) + self.proc = None def pause_writing(self): self.proc._protocol.pause_writing() diff --git a/Lib/asyncio/futures.py 
b/Lib/asyncio/futures.py --- a/Lib/asyncio/futures.py +++ b/Lib/asyncio/futures.py @@ -405,5 +405,5 @@ new_future.add_done_callback(_check_cancel_other) fut.add_done_callback( lambda future: loop.call_soon_threadsafe( - new_future._copy_state, fut)) + new_future._copy_state, future)) return new_future diff --git a/Lib/selectors.py b/Lib/selectors.py --- a/Lib/selectors.py +++ b/Lib/selectors.py @@ -256,6 +256,7 @@ def close(self): self._fd_to_key.clear() + self._map = None def get_map(self): return self._map -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 21:36:39 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 20:36:39 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogc3lu?= =?utf-8?q?c_with_Tulip?= Message-ID: <20150109203625.8757.82802@psf.io> https://hg.python.org/cpython/rev/b6a636823c8c changeset: 94092:b6a636823c8c branch: 3.4 parent: 94089:75d7f29487ce user: Victor Stinner date: Fri Jan 09 21:32:05 2015 +0100 summary: asyncio: sync with Tulip * Tulip issue 184: FlowControlMixin constructor now get the event loop if the loop parameter is not set. Add unit tests to ensure that constructor of StreamReader and StreamReaderProtocol classes get the event loop. * Remove outdated TODO/XXX files: Lib/asyncio/futures.py | 2 - Lib/asyncio/proactor_events.py | 3 +- Lib/asyncio/selector_events.py | 1 - Lib/asyncio/streams.py | 10 +++++-- Lib/asyncio/unix_events.py | 5 +--- Lib/test/test_asyncio/test_streams.py | 19 +++++++++++++++ 6 files changed, 29 insertions(+), 11 deletions(-) diff --git a/Lib/asyncio/futures.py b/Lib/asyncio/futures.py --- a/Lib/asyncio/futures.py +++ b/Lib/asyncio/futures.py @@ -20,7 +20,6 @@ _PY34 = sys.version_info >= (3, 4) -# TODO: Do we really want to depend on concurrent.futures internals? Error = concurrent.futures._base.Error CancelledError = concurrent.futures.CancelledError TimeoutError = concurrent.futures.TimeoutError @@ -30,7 +29,6 @@ class InvalidStateError(Error): """The operation is not allowed in this state.""" - # TODO: Show the future, its state, the method, and the required state. class _TracebackLogger: diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -487,7 +487,8 @@ self.call_soon(loop) def _process_events(self, event_list): - pass # XXX hard work currently done in poll + # Events are processed in the IocpProactor._poll() method + pass def _stop_accept_futures(self): for future in self._accept_futures.values(): diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -145,7 +145,6 @@ pass # False alarm. except OSError as exc: # There's nowhere to send the error, so just log it. - # TODO: Someone will want an error handler for this. if exc.errno in (errno.EMFILE, errno.ENFILE, errno.ENOBUFS, errno.ENOMEM): # Some platforms (e.g. Linux keep reporting the FD as diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -145,7 +145,10 @@ """ def __init__(self, loop=None): - self._loop = loop # May be None; we may never need it. + if loop is None: + self._loop = events.get_event_loop() + else: + self._loop = loop self._paused = False self._drain_waiter = None self._connection_lost = False @@ -306,8 +309,9 @@ # it also doubles as half the buffer limit. 
self._limit = limit if loop is None: - loop = events.get_event_loop() - self._loop = loop + self._loop = events.get_event_loop() + else: + self._loop = loop self._buffer = bytearray() self._eof = False # Whether we're done. self._waiter = None # A future. diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -496,9 +496,6 @@ def can_write_eof(self): return True - # TODO: Make the relationships between write_eof(), close(), - # abort(), _fatal_error() and _close() more straightforward. - def write_eof(self): if self._closing: return @@ -897,7 +894,7 @@ class _UnixDefaultEventLoopPolicy(events.BaseDefaultEventLoopPolicy): - """XXX""" + """UNIX event loop policy with a watcher for child processes.""" _loop_factory = _UnixSelectorEventLoop def __init__(self): diff --git a/Lib/test/test_asyncio/test_streams.py b/Lib/test/test_asyncio/test_streams.py --- a/Lib/test/test_asyncio/test_streams.py +++ b/Lib/test/test_asyncio/test_streams.py @@ -625,6 +625,25 @@ data = self.loop.run_until_complete(reader.read(-1)) self.assertEqual(data, b'data') + def test_streamreader_constructor(self): + self.addCleanup(asyncio.set_event_loop, None) + asyncio.set_event_loop(self.loop) + + # Tulip issue #184: Ensure that StreamReaderProtocol constructor + # retrieves the current loop if the loop parameter is not set + reader = asyncio.StreamReader() + self.assertIs(reader._loop, self.loop) + + def test_streamreaderprotocol_constructor(self): + self.addCleanup(asyncio.set_event_loop, None) + asyncio.set_event_loop(self.loop) + + # Tulip issue #184: Ensure that StreamReaderProtocol constructor + # retrieves the current loop if the loop parameter is not set + reader = mock.Mock() + protocol = asyncio.StreamReaderProtocol(reader) + self.assertIs(protocol._loop, self.loop) + if __name__ == '__main__': unittest.main() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 21:58:02 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 20:58:02 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMjA5?= =?utf-8?q?=3A_Revert_change_on_selectors=2C_test=5Fselectors_failed=2E?= Message-ID: <20150109205800.11581.70403@psf.io> https://hg.python.org/cpython/rev/7438f2e30908 changeset: 94096:7438f2e30908 branch: 3.4 parent: 94094:376c5398f28d user: Victor Stinner date: Fri Jan 09 21:56:28 2015 +0100 summary: Issue #23209: Revert change on selectors, test_selectors failed. 
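A minimal sketch of the observable difference involved (illustrative only, not from the patch; it assumes a default selector and one throwaway socket): after close(), get_map() still returns the now-empty mapping view, whereas the change being reverted would have made it return None, which is presumably what test_selectors tripped over.

import selectors
import socket

# Illustrative sketch: register one socket, close the selector, then
# inspect what get_map() hands back.
sel = selectors.DefaultSelector()
sock = socket.socket()
sel.register(sock, selectors.EVENT_READ)
sel.close()

mapping = sel.get_map()
# Without the reverted change this prints 0 (an empty mapping view);
# with self._map = None it would print the message instead.
print(len(mapping) if mapping is not None else "get_map() returned None")
sock.close()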
files: Lib/selectors.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/selectors.py b/Lib/selectors.py --- a/Lib/selectors.py +++ b/Lib/selectors.py @@ -256,7 +256,6 @@ def close(self): self._fd_to_key.clear() - self._map = None def get_map(self): return self._map -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 21:58:01 2015 From: python-checkins at python.org (victor.stinner) Date: Fri, 09 Jan 2015 20:58:01 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_=28Merge_3=2E4=29_Issue_=2323209=3A_Revert_change_on_sel?= =?utf-8?q?ectors=2C_test=5Fselectors_failed=2E?= Message-ID: <20150109205801.22409.4228@psf.io> https://hg.python.org/cpython/rev/27cbc877447b changeset: 94097:27cbc877447b parent: 94095:880a1febd6f5 parent: 94096:7438f2e30908 user: Victor Stinner date: Fri Jan 09 21:57:19 2015 +0100 summary: (Merge 3.4) Issue #23209: Revert change on selectors, test_selectors failed. files: Lib/selectors.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/selectors.py b/Lib/selectors.py --- a/Lib/selectors.py +++ b/Lib/selectors.py @@ -256,7 +256,6 @@ def close(self): self._fd_to_key.clear() - self._map = None def get_map(self): return self._map -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 22:34:20 2015 From: python-checkins at python.org (ned.deily) Date: Fri, 09 Jan 2015 21:34:20 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzIzMjEy?= =?utf-8?q?=3A_Update_OS_X_installer_build_OpenSSL_to_1=2E0=2E1k=2E?= Message-ID: <20150109213420.11577.75467@psf.io> https://hg.python.org/cpython/rev/a216f349771b changeset: 94098:a216f349771b branch: 2.7 parent: 94075:1c3f8d044589 user: Ned Deily date: Fri Jan 09 13:26:13 2015 -0800 summary: Issue #23212: Update OS X installer build OpenSSL to 1.0.1k. (currently only used for builds with <= 10.5 deployment targets) files: Mac/BuildScript/build-installer.py | 6 +- Mac/BuildScript/openssl_sdk_makedepend.patch | 16 +++++---- 2 files changed, 12 insertions(+), 10 deletions(-) diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -237,9 +237,9 @@ result.extend([ dict( - name="OpenSSL 1.0.1j", - url="https://www.openssl.org/source/openssl-1.0.1j.tar.gz", - checksum='f7175c9cd3c39bb1907ac8bba9df8ed3', + name="OpenSSL 1.0.1k", + url="https://www.openssl.org/source/openssl-1.0.1k.tar.gz", + checksum='d4f002bd22a56881340105028842ae1f', patches=[ "openssl_sdk_makedepend.patch", ], diff --git a/Mac/BuildScript/openssl_sdk_makedepend.patch b/Mac/BuildScript/openssl_sdk_makedepend.patch --- a/Mac/BuildScript/openssl_sdk_makedepend.patch +++ b/Mac/BuildScript/openssl_sdk_makedepend.patch @@ -1,13 +1,15 @@ # openssl_sdk_makedepend.patch # -# using openssl 1.0.1j +# using openssl 1.0.1k # # - support building with an OS X SDK # - allow "make depend" to use compilers with names other than "gcc" diff Configure ---- a/Configure Fri Dec 05 01:24:16 2014 -0800 -+++ b/Configure Fri Dec 05 01:52:29 2014 -0800 + +diff -r 99ae439a07f1 Configure +--- a/Configure Fri Jan 09 12:50:43 2015 -0800 ++++ b/Configure Fri Jan 09 12:53:52 2015 -0800 @@ -577,11 +577,11 @@ ##### MacOS X (a.k.a. 
Rhapsody or Darwin) setup @@ -25,7 +27,7 @@ "debug-darwin-ppc-cc","cc:-DBN_DEBUG -DREF_CHECK -DCONF_DEBUG -DCRYPTO_MDEBUG -DB_ENDIAN -g -Wall -O::-D_REENTRANT:MACOSX::BN_LLONG RC4_CHAR RC4_CHUNK DES_UNROLL BF_PTR:${ppc32_asm}:osx32:dlfcn:darwin-shared:-fPIC:-dynamiclib:.\$(SHLIB_MAJOR).\$(SHLIB_MINOR).dylib", # iPhoneOS/iOS "iphoneos-cross","llvm-gcc:-O3 -isysroot \$(CROSS_TOP)/SDKs/\$(CROSS_SDK) -fomit-frame-pointer -fno-common::-D_REENTRANT:iOS:-Wl,-search_paths_first%:BN_LLONG RC4_CHAR RC4_CHUNK DES_UNROLL BF_PTR:${no_asm}:dlfcn:darwin-shared:-fPIC -fno-common:-dynamiclib:.\$(SHLIB_MAJOR).\$(SHLIB_MINOR).dylib", -@@ -1624,7 +1624,7 @@ +@@ -1629,7 +1629,7 @@ s/^CC=.*$/CC= $cc/; s/^AR=\s*ar/AR= $ar/; s/^RANLIB=.*/RANLIB= $ranlib/; @@ -34,9 +36,9 @@ } s/^CFLAG=.*$/CFLAG= $cflags/; s/^DEPFLAG=.*$/DEPFLAG=$depflags/; -diff util/domd ---- a/util/domd Fri Dec 05 01:24:16 2014 -0800 -+++ b/util/domd Fri Dec 05 01:52:29 2014 -0800 +diff -r 99ae439a07f1 util/domd +--- a/util/domd Fri Jan 09 12:50:43 2015 -0800 ++++ b/util/domd Fri Jan 09 12:53:52 2015 -0800 @@ -14,7 +14,7 @@ cp Makefile Makefile.save # fake the presence of Kerberos -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 22:34:20 2015 From: python-checkins at python.org (ned.deily) Date: Fri, 09 Jan 2015 21:34:20 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzIzMjEy?= =?utf-8?q?=3A_2=2E7-specific_OS_X_installer_updates?= Message-ID: <20150109213420.8757.39544@psf.io> https://hg.python.org/cpython/rev/849ec86651b4 changeset: 94099:849ec86651b4 branch: 2.7 user: Ned Deily date: Fri Jan 09 13:29:04 2015 -0800 summary: Issue #23212: 2.7-specific OS X installer updates files: Mac/BuildScript/README.txt | 2 +- Mac/BuildScript/resources/ReadMe.rtf | 2 +- Misc/NEWS | 2 ++ 3 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -30,7 +30,7 @@ - builds the following third-party libraries - * libcrypto and libssl from OpenSSL 1.0.1j + * libcrypto and libssl from OpenSSL 1.0.1 * NCurses 5.9 * SQLite 3.7.13 * Oracle Sleepycat DB 4.8 (Python 2.x only) diff --git a/Mac/BuildScript/resources/ReadMe.rtf b/Mac/BuildScript/resources/ReadMe.rtf --- a/Mac/BuildScript/resources/ReadMe.rtf +++ b/Mac/BuildScript/resources/ReadMe.rtf @@ -111,7 +111,7 @@ \i0 . To solve this problem, as of 2.7.9 the \i 10.5+ 32-bit-only python.org variant \i0 is linked with a private copy of -\i OpenSSL 1.0.1j +\i OpenSSL 1.0.1 \i0 ; it consults the same default certificate directory, \f1 /System/Library/OpenSSL \f0 . As before, it is still necessary to manage certificates yourself when you use this Python variant and, with certificate verification now enabled by default, you may now need to take additional steps to ensure your Python programs have access to CA certificates you trust. If you use this Python variant to build standalone applications with third-party tools like {\field{\*\fldinst{HYPERLINK "https://pypi.python.org/pypi/py2app/"}}{\fldrslt diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -92,6 +92,8 @@ - Issue #23032: Fix installer build failures on OS X 10.4 Tiger by disabling assembly code in the OpenSSL build. +- Issue #23212: Update 10.5 OS X installer build to use OpenSSL 1.0.1k. + What's New in Python 2.7.9? 
=========================== -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 22:34:21 2015 From: python-checkins at python.org (ned.deily) Date: Fri, 09 Jan 2015 21:34:21 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2323212=3A_merge_from_3=2E4?= Message-ID: <20150109213421.11589.74242@psf.io> https://hg.python.org/cpython/rev/6d518aa5e1a2 changeset: 94102:6d518aa5e1a2 parent: 94097:27cbc877447b parent: 94101:726d67a7ebf2 user: Ned Deily date: Fri Jan 09 13:33:28 2015 -0800 summary: Issue #23212: merge from 3.4 files: Mac/BuildScript/build-installer.py | 6 +- Mac/BuildScript/openssl_sdk_makedepend.patch | 16 +++++---- 2 files changed, 12 insertions(+), 10 deletions(-) diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -237,9 +237,9 @@ result.extend([ dict( - name="OpenSSL 1.0.1j", - url="https://www.openssl.org/source/openssl-1.0.1j.tar.gz", - checksum='f7175c9cd3c39bb1907ac8bba9df8ed3', + name="OpenSSL 1.0.1k", + url="https://www.openssl.org/source/openssl-1.0.1k.tar.gz", + checksum='d4f002bd22a56881340105028842ae1f', patches=[ "openssl_sdk_makedepend.patch", ], diff --git a/Mac/BuildScript/openssl_sdk_makedepend.patch b/Mac/BuildScript/openssl_sdk_makedepend.patch --- a/Mac/BuildScript/openssl_sdk_makedepend.patch +++ b/Mac/BuildScript/openssl_sdk_makedepend.patch @@ -1,13 +1,15 @@ # openssl_sdk_makedepend.patch # -# using openssl 1.0.1j +# using openssl 1.0.1k # # - support building with an OS X SDK # - allow "make depend" to use compilers with names other than "gcc" diff Configure ---- a/Configure Fri Dec 05 01:24:16 2014 -0800 -+++ b/Configure Fri Dec 05 01:52:29 2014 -0800 + +diff -r 99ae439a07f1 Configure +--- a/Configure Fri Jan 09 12:50:43 2015 -0800 ++++ b/Configure Fri Jan 09 12:53:52 2015 -0800 @@ -577,11 +577,11 @@ ##### MacOS X (a.k.a. 
Rhapsody or Darwin) setup @@ -25,7 +27,7 @@ "debug-darwin-ppc-cc","cc:-DBN_DEBUG -DREF_CHECK -DCONF_DEBUG -DCRYPTO_MDEBUG -DB_ENDIAN -g -Wall -O::-D_REENTRANT:MACOSX::BN_LLONG RC4_CHAR RC4_CHUNK DES_UNROLL BF_PTR:${ppc32_asm}:osx32:dlfcn:darwin-shared:-fPIC:-dynamiclib:.\$(SHLIB_MAJOR).\$(SHLIB_MINOR).dylib", # iPhoneOS/iOS "iphoneos-cross","llvm-gcc:-O3 -isysroot \$(CROSS_TOP)/SDKs/\$(CROSS_SDK) -fomit-frame-pointer -fno-common::-D_REENTRANT:iOS:-Wl,-search_paths_first%:BN_LLONG RC4_CHAR RC4_CHUNK DES_UNROLL BF_PTR:${no_asm}:dlfcn:darwin-shared:-fPIC -fno-common:-dynamiclib:.\$(SHLIB_MAJOR).\$(SHLIB_MINOR).dylib", -@@ -1624,7 +1624,7 @@ +@@ -1629,7 +1629,7 @@ s/^CC=.*$/CC= $cc/; s/^AR=\s*ar/AR= $ar/; s/^RANLIB=.*/RANLIB= $ranlib/; @@ -34,9 +36,9 @@ } s/^CFLAG=.*$/CFLAG= $cflags/; s/^DEPFLAG=.*$/DEPFLAG=$depflags/; -diff util/domd ---- a/util/domd Fri Dec 05 01:24:16 2014 -0800 -+++ b/util/domd Fri Dec 05 01:52:29 2014 -0800 +diff -r 99ae439a07f1 util/domd +--- a/util/domd Fri Jan 09 12:50:43 2015 -0800 ++++ b/util/domd Fri Jan 09 12:53:52 2015 -0800 @@ -14,7 +14,7 @@ cp Makefile Makefile.save # fake the presence of Kerberos -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 22:34:20 2015 From: python-checkins at python.org (ned.deily) Date: Fri, 09 Jan 2015 21:34:20 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMjEy?= =?utf-8?q?=3A_Update_OS_X_installer_build_OpenSSL_to_1=2E0=2E1k=2E?= Message-ID: <20150109213420.11561.76308@psf.io> https://hg.python.org/cpython/rev/ce3028357f8b changeset: 94100:ce3028357f8b branch: 3.4 parent: 94096:7438f2e30908 user: Ned Deily date: Fri Jan 09 13:29:54 2015 -0800 summary: Issue #23212: Update OS X installer build OpenSSL to 1.0.1k. (currently only used for builds with <= 10.5 deployment targets) files: Mac/BuildScript/build-installer.py | 6 +- Mac/BuildScript/openssl_sdk_makedepend.patch | 16 +++++---- 2 files changed, 12 insertions(+), 10 deletions(-) diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -237,9 +237,9 @@ result.extend([ dict( - name="OpenSSL 1.0.1j", - url="https://www.openssl.org/source/openssl-1.0.1j.tar.gz", - checksum='f7175c9cd3c39bb1907ac8bba9df8ed3', + name="OpenSSL 1.0.1k", + url="https://www.openssl.org/source/openssl-1.0.1k.tar.gz", + checksum='d4f002bd22a56881340105028842ae1f', patches=[ "openssl_sdk_makedepend.patch", ], diff --git a/Mac/BuildScript/openssl_sdk_makedepend.patch b/Mac/BuildScript/openssl_sdk_makedepend.patch --- a/Mac/BuildScript/openssl_sdk_makedepend.patch +++ b/Mac/BuildScript/openssl_sdk_makedepend.patch @@ -1,13 +1,15 @@ # openssl_sdk_makedepend.patch # -# using openssl 1.0.1j +# using openssl 1.0.1k # # - support building with an OS X SDK # - allow "make depend" to use compilers with names other than "gcc" diff Configure ---- a/Configure Fri Dec 05 01:24:16 2014 -0800 -+++ b/Configure Fri Dec 05 01:52:29 2014 -0800 + +diff -r 99ae439a07f1 Configure +--- a/Configure Fri Jan 09 12:50:43 2015 -0800 ++++ b/Configure Fri Jan 09 12:53:52 2015 -0800 @@ -577,11 +577,11 @@ ##### MacOS X (a.k.a. 
Rhapsody or Darwin) setup @@ -25,7 +27,7 @@ "debug-darwin-ppc-cc","cc:-DBN_DEBUG -DREF_CHECK -DCONF_DEBUG -DCRYPTO_MDEBUG -DB_ENDIAN -g -Wall -O::-D_REENTRANT:MACOSX::BN_LLONG RC4_CHAR RC4_CHUNK DES_UNROLL BF_PTR:${ppc32_asm}:osx32:dlfcn:darwin-shared:-fPIC:-dynamiclib:.\$(SHLIB_MAJOR).\$(SHLIB_MINOR).dylib", # iPhoneOS/iOS "iphoneos-cross","llvm-gcc:-O3 -isysroot \$(CROSS_TOP)/SDKs/\$(CROSS_SDK) -fomit-frame-pointer -fno-common::-D_REENTRANT:iOS:-Wl,-search_paths_first%:BN_LLONG RC4_CHAR RC4_CHUNK DES_UNROLL BF_PTR:${no_asm}:dlfcn:darwin-shared:-fPIC -fno-common:-dynamiclib:.\$(SHLIB_MAJOR).\$(SHLIB_MINOR).dylib", -@@ -1624,7 +1624,7 @@ +@@ -1629,7 +1629,7 @@ s/^CC=.*$/CC= $cc/; s/^AR=\s*ar/AR= $ar/; s/^RANLIB=.*/RANLIB= $ranlib/; @@ -34,9 +36,9 @@ } s/^CFLAG=.*$/CFLAG= $cflags/; s/^DEPFLAG=.*$/DEPFLAG=$depflags/; -diff util/domd ---- a/util/domd Fri Dec 05 01:24:16 2014 -0800 -+++ b/util/domd Fri Dec 05 01:52:29 2014 -0800 +diff -r 99ae439a07f1 util/domd +--- a/util/domd Fri Jan 09 12:50:43 2015 -0800 ++++ b/util/domd Fri Jan 09 12:53:52 2015 -0800 @@ -14,7 +14,7 @@ cp Makefile Makefile.save # fake the presence of Kerberos -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 22:34:21 2015 From: python-checkins at python.org (ned.deily) Date: Fri, 09 Jan 2015 21:34:21 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMjEy?= =?utf-8?q?=3A_3=2E4-specific_OS_X_installer_updates?= Message-ID: <20150109213420.8761.38015@psf.io> https://hg.python.org/cpython/rev/726d67a7ebf2 changeset: 94101:726d67a7ebf2 branch: 3.4 user: Ned Deily date: Fri Jan 09 13:30:11 2015 -0800 summary: Issue #23212: 3.4-specific OS X installer updates files: Mac/BuildScript/README.txt | 46 +--------------- Mac/BuildScript/resources/ReadMe.rtf | 2 +- 2 files changed, 3 insertions(+), 45 deletions(-) diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -21,9 +21,6 @@ yet integrated into ``build-installer.py``. The steps prior to the flat package creation are the same as for 3.4.1 below. -For Python 3.4.0 and 3.4.1, PSF practice was to build two installer variants -for each release. - 1. 32-bit-only, i386 and PPC universal, capable on running on all machines supported by Mac OS X 10.5 through (at least) 10.9:: @@ -34,6 +31,7 @@ - builds the following third-party libraries + * libcrypto and libssl from OpenSSL 1.0.1 (new, as of 3.4.3) * NCurses 5.9 (http://bugs.python.org/issue15037) * SQLite 3.8.3.1 * XZ 5.0.5 @@ -75,6 +73,7 @@ - uses system-supplied versions of third-party libraries + * libcrypto and libssl from Apple OpenSSL 0.9.8 * readline module links with Apple BSD editline (libedit) - requires ActiveState Tcl/Tk 8.5.15.1 (or later) to be installed for building @@ -103,47 +102,6 @@ that the Xcode 3 gcc-4.2 compiler has had. -* For Python 2.7.x and 3.2.x, the 32-bit-only installer was configured to - support Mac OS X 10.3.9 through (at least) 10.6. Because it is - believed that there are few systems still running OS X 10.3 or 10.4 - and because it has become increasingly difficult to test and - support the differences in these earlier systems, as of Python 3.3.0 the PSF - 32-bit installer no longer supports them. 
For reference in building such - an installer yourself, the details are:: - - /usr/bin/python build-installer.py \ - --sdk-path=/Developer/SDKs/MacOSX10.4u.sdk \ - --universal-archs=32-bit \ - --dep-target=10.3 - - - builds the following third-party libraries - - * Bzip2 - * NCurses - * GNU Readline (GPL) - * SQLite 3 - * XZ - * Zlib 1.2.3 - * Oracle Sleepycat DB 4.8 (Python 2.x only) - - - requires ActiveState ``Tcl/Tk 8.4`` (currently 8.4.20) to be installed for building - - - recommended build environment: - - * Mac OS X 10.5.8 PPC or Intel - * Xcode 3.1.4 (or later) - * ``MacOSX10.4u`` SDK (later SDKs do not support PPC G3 processors) - * ``MACOSX_DEPLOYMENT_TARGET=10.3`` - * Apple ``gcc-4.0`` - * system Python 2.5 for documentation build with Sphinx - - - alternate build environments: - - * Mac OS X 10.6.8 with Xcode 3.2.6 - - need to change ``/System/Library/Frameworks/{Tcl,Tk}.framework/Version/Current`` to ``8.4`` - - - General Prerequisites --------------------- diff --git a/Mac/BuildScript/resources/ReadMe.rtf b/Mac/BuildScript/resources/ReadMe.rtf --- a/Mac/BuildScript/resources/ReadMe.rtf +++ b/Mac/BuildScript/resources/ReadMe.rtf @@ -125,7 +125,7 @@ \i0 . To solve this problem, as of 3.4.3 the \i 10.5+ 32-bit-only python.org variant \i0 is linked with a private copy of -\i OpenSSL 1.0.1j +\i OpenSSL 1.0.1 \i0 ; it consults the same default certificate directory, \f1 /System/Library/OpenSSL \f0 . As before, it is still necessary to manage certificates yourself when you use this Python variant and, with certificate verification now enabled by default, you may now need to take additional steps to ensure your Python programs have access to CA certificates you trust. If you use this Python variant to build standalone applications with third-party tools like {\field{\*\fldinst{HYPERLINK "https://pypi.python.org/pypi/py2app/"}}{\fldrslt -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 23:41:10 2015 From: python-checkins at python.org (benjamin.peterson) Date: Fri, 09 Jan 2015 22:41:10 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_remove_buzzwor?= =?utf-8?q?d_=28closes_=2323210=29?= Message-ID: <20150109224044.11571.87094@psf.io> https://hg.python.org/cpython/rev/79f33147949b changeset: 94103:79f33147949b branch: 3.4 parent: 94101:726d67a7ebf2 user: Benjamin Peterson date: Fri Jan 09 16:40:23 2015 -0600 summary: remove buzzword (closes #23210) files: Objects/rangeobject.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Objects/rangeobject.c b/Objects/rangeobject.c --- a/Objects/rangeobject.c +++ b/Objects/rangeobject.c @@ -139,7 +139,7 @@ "range(stop) -> range object\n\ range(start, stop[, step]) -> range object\n\ \n\ -Return a virtual sequence of numbers from start to stop by step."); +Return a sequence of numbers from start to stop by step."); static void range_dealloc(rangeobject *r) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Fri Jan 9 23:41:10 2015 From: python-checkins at python.org (benjamin.peterson) Date: Fri, 09 Jan 2015 22:41:10 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?b?KTogbWVyZ2UgMy40ICgjMjMyMTAp?= Message-ID: <20150109224044.125890.56985@psf.io> https://hg.python.org/cpython/rev/154ae3af0317 changeset: 94104:154ae3af0317 parent: 94102:6d518aa5e1a2 parent: 94103:79f33147949b user: Benjamin Peterson date: Fri Jan 09 16:40:38 2015 -0600 summary: merge 3.4 (#23210) files: Objects/rangeobject.c | 2 +- 1 
files changed, 1 insertions(+), 1 deletions(-) diff --git a/Objects/rangeobject.c b/Objects/rangeobject.c --- a/Objects/rangeobject.c +++ b/Objects/rangeobject.c @@ -139,7 +139,7 @@ "range(stop) -> range object\n\ range(start, stop[, step]) -> range object\n\ \n\ -Return a virtual sequence of numbers from start to stop by step."); +Return a sequence of numbers from start to stop by step."); static void range_dealloc(rangeobject *r) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sat Jan 10 00:21:38 2015 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 09 Jan 2015 23:21:38 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_PyCharm_reference=2E?= Message-ID: <20150109232129.8763.28265@psf.io> https://hg.python.org/peps/rev/0b4bc336a7fc changeset: 5669:0b4bc336a7fc user: Guido van Rossum date: Fri Jan 09 15:21:14 2015 -0800 summary: Add PyCharm reference. files: pep-0482.txt | 13 +++++++++++++ 1 files changed, 13 insertions(+), 0 deletions(-) diff --git a/pep-0482.txt b/pep-0482.txt --- a/pep-0482.txt +++ b/pep-0482.txt @@ -38,6 +38,16 @@ Vitousek with Jeremy Siek and Jim Baker (the latter of Jython fame). +PyCharm +------- + +PyCharm by JetBrains has been providing a way to specify and check +types for about four years. The type system suggested by PyCharm +[pycharm]_ grew from simple class types to tuple types, generic types, +function types, etc. based on feedback of many users who shared their +experience of using type hints in their code. + + Others ------ @@ -196,6 +206,9 @@ .. [reticulated-paper] http://wphomes.soic.indiana.edu/jsiek/files/2014/03/retic-python.pdf +.. [pycharm] + https://github.com/JetBrains/python-skeletons#types + .. [obiwan] http://pypi.python.org/pypi/obiwan -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Sat Jan 10 09:00:46 2015 From: python-checkins at python.org (victor.stinner) Date: Sat, 10 Jan 2015 08:00:46 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319776=3A_Fix_test?= =?utf-8?q?=5Fpathlib=2Etest=5Fexpanduser=28=29?= Message-ID: <20150110080029.72577.69712@psf.io> https://hg.python.org/cpython/rev/63dac5212552 changeset: 94105:63dac5212552 user: Victor Stinner date: Sat Jan 10 09:00:20 2015 +0100 summary: Issue #19776: Fix test_pathlib.test_expanduser() Skip users with an empty home directory. 
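A standalone sketch of the selection logic (illustrative only; looking up the current username via pwd/os here is an assumption, not code quoted from the test): an account whose pw_dir is empty gives "~othername" nothing sensible to expand to, so such entries are now skipped when picking the "other" user.

import os
import pwd

# Illustrative sketch (Unix only): pick some account other than the
# current user whose home directory is non-empty; system accounts with
# an empty pw_dir are skipped.
username = pwd.getpwuid(os.getuid()).pw_name
othername = otherhome = None
for pwdent in pwd.getpwall():
    name = pwdent.pw_name
    home = pwdent.pw_dir.rstrip('/')
    if name != username and home:
        othername, otherhome = name, home
        break
print(othername, otherhome)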
files: Lib/test/test_pathlib.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1983,7 +1983,7 @@ for pwdent in pwd.getpwall(): othername = pwdent.pw_name otherhome = pwdent.pw_dir.rstrip('/') - if othername != username: + if othername != username and otherhome: break p1 = P('~/Documents') -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Sat Jan 10 09:35:38 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 10 Jan 2015 09:35:38 +0100 Subject: [Python-checkins] Daily reference leaks (154ae3af0317): sum=6 Message-ID: results for 154ae3af0317 on branch "default" -------------------------------------------- test_asyncio leaked [3, 0, 0] memory blocks, sum=3 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogcGfLFP', '-x'] From solipsis at pitrou.net Sun Jan 11 09:47:52 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 11 Jan 2015 09:47:52 +0100 Subject: [Python-checkins] Daily reference leaks (63dac5212552): sum=1 Message-ID: results for 63dac5212552 on branch "default" -------------------------------------------- test_collections leaked [0, -2, 0] references, sum=-2 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogW9zElA', '-x'] From python-checkins at python.org Sun Jan 11 11:49:17 2015 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 11 Jan 2015 10:49:17 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Removed_duplicated_dict_en?= =?utf-8?q?tries=2E?= Message-ID: <20150111104856.125902.48174@psf.io> https://hg.python.org/cpython/rev/5b2212e0c350 changeset: 94106:5b2212e0c350 user: Serhiy Storchaka date: Sun Jan 11 12:48:17 2015 +0200 summary: Removed duplicated dict entries. files: Lib/tokenize.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/tokenize.py b/Lib/tokenize.py --- a/Lib/tokenize.py +++ b/Lib/tokenize.py @@ -186,7 +186,6 @@ "rB'''": Single3, 'rB"""': Double3, "RB'''": Single3, 'RB"""': Double3, "u'''": Single3, 'u"""': Double3, - "R'''": Single3, 'R"""': Double3, "U'''": Single3, 'U"""': Double3, 'r': None, 'R': None, 'b': None, 'B': None, 'u': None, 'U': None} -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 12:55:38 2015 From: python-checkins at python.org (mark.dickinson) Date: Sun, 11 Jan 2015 11:55:38 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2323185=3A_add_math?= =?utf-8?q?=2Einf_and_math=2Enan_constants=2E?= Message-ID: <20150111115537.125892.41308@psf.io> https://hg.python.org/cpython/rev/cf4bf577749c changeset: 94107:cf4bf577749c user: Mark Dickinson date: Sun Jan 11 11:55:29 2015 +0000 summary: Issue #23185: add math.inf and math.nan constants. files: Doc/library/math.rst | 16 ++++++++++++++ Doc/whatsnew/3.5.rst | 6 +++++ Lib/test/test_math.py | 11 +++++++++ Misc/NEWS | 2 + Modules/mathmodule.c | 35 ++++++++++++++++++++++++++++++- 5 files changed, 69 insertions(+), 1 deletions(-) diff --git a/Doc/library/math.rst b/Doc/library/math.rst --- a/Doc/library/math.rst +++ b/Doc/library/math.rst @@ -383,6 +383,22 @@ The mathematical constant e = 2.718281..., to available precision. +.. data:: inf + + A floating-point positive infinity. 
(For negative infinity, use + ``-math.inf``.) Equivalent to the output of ``float('inf')``. + + .. versionadded:: 3.5 + + +.. data:: nan + + A floating-point "not a number" (NaN) value. Equivalent to the output of + ``float('nan')``. + + .. versionadded:: 3.5 + + .. impl-detail:: The :mod:`math` module consists mostly of thin wrappers around the platform C diff --git a/Doc/whatsnew/3.5.rst b/Doc/whatsnew/3.5.rst --- a/Doc/whatsnew/3.5.rst +++ b/Doc/whatsnew/3.5.rst @@ -243,6 +243,12 @@ * Now unmatched groups are replaced with empty strings in :func:`re.sub` and :func:`re.subn`. (Contributed by Serhiy Storchaka in :issue:`1519638`.) +math +---- + +* :data:`math.inf` and :data:`math.nan` constants added. (Contributed by Mark + Dickinson in :issue:`23185`.) + shutil ------ diff --git a/Lib/test/test_math.py b/Lib/test/test_math.py --- a/Lib/test/test_math.py +++ b/Lib/test/test_math.py @@ -983,6 +983,17 @@ self.assertFalse(math.isinf(0.)) self.assertFalse(math.isinf(1.)) + @requires_IEEE_754 + def test_nan_constant(self): + self.assertTrue(math.isnan(math.nan)) + + @requires_IEEE_754 + def test_inf_constant(self): + self.assertTrue(math.isinf(math.inf)) + self.assertGreater(math.inf, 0.0) + self.assertEqual(math.inf, float("inf")) + self.assertEqual(-math.inf, float("-inf")) + # RED_FLAG 16-Oct-2000 Tim # While 2.0 is more consistent about exceptions than previous releases, it # still fails this part of the test on some platforms. For now, we only diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -203,6 +203,8 @@ Library ------- +- Issue #23185: Add math.inf and math.nan constants. + - Issue #23186: Add ssl.SSLObject.shared_ciphers() and ssl.SSLSocket.shared_ciphers() to fetch the client's list ciphers sent at handshake. diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -223,6 +223,35 @@ return num/den; } +/* Constant for +infinity, generated in the same way as float('inf'). */ + +static double +m_inf(void) +{ +#ifndef PY_NO_SHORT_FLOAT_REPR + return _Py_dg_infinity(0); +#else + return Py_HUGE_VAL; +#endif +} + +/* Constant nan value, generated in the same way as float('nan'). */ +/* We don't currently assume that Py_NAN is defined everywhere. */ + +#if !defined(PY_NO_SHORT_FLOAT_REPR) || defined(Py_NAN) + +static double +m_nan(void) +{ +#ifndef PY_NO_SHORT_FLOAT_REPR + return _Py_dg_stdnan(0); +#else + return Py_NAN; +#endif +} + +#endif + static double m_tgamma(double x) { @@ -2009,7 +2038,11 @@ PyModule_AddObject(m, "pi", PyFloat_FromDouble(Py_MATH_PI)); PyModule_AddObject(m, "e", PyFloat_FromDouble(Py_MATH_E)); + PyModule_AddObject(m, "inf", PyFloat_FromDouble(m_inf())); +#if !defined(PY_NO_SHORT_FLOAT_REPR) || defined(Py_NAN) + PyModule_AddObject(m, "nan", PyFloat_FromDouble(m_nan())); +#endif - finally: + finally: return m; } -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 14:03:52 2015 From: python-checkins at python.org (mark.dickinson) Date: Sun, 11 Jan 2015 13:03:52 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzIxOTAy?= =?utf-8?q?=3A_Replace_incorrect_=27hyperbolic_arc_sine=27_=28etc=2E=29_wi?= =?utf-8?q?th_=27inverse?= Message-ID: <20150111130344.125882.28696@psf.io> https://hg.python.org/cpython/rev/d81cabb39de3 changeset: 94108:d81cabb39de3 branch: 2.7 parent: 94099:849ec86651b4 user: Mark Dickinson date: Sun Jan 11 13:03:06 2015 +0000 summary: Issue #21902: Replace incorrect 'hyperbolic arc sine' (etc.) 
with 'inverse hyperbolic sine' (etc.). Remove meaningless reference to radians. files: Doc/library/cmath.rst | 8 ++++---- Modules/cmathmodule.c | 6 +++--- Modules/mathmodule.c | 6 +++--- 3 files changed, 10 insertions(+), 10 deletions(-) diff --git a/Doc/library/cmath.rst b/Doc/library/cmath.rst --- a/Doc/library/cmath.rst +++ b/Doc/library/cmath.rst @@ -161,13 +161,13 @@ .. function:: acosh(x) - Return the hyperbolic arc cosine of *x*. There is one branch cut, extending left - from 1 along the real axis to -?, continuous from above. + Return the inverse hyperbolic cosine of *x*. There is one branch cut, + extending left from 1 along the real axis to -?, continuous from above. .. function:: asinh(x) - Return the hyperbolic arc sine of *x*. There are two branch cuts: + Return the inverse hyperbolic sine of *x*. There are two branch cuts: One extends from ``1j`` along the imaginary axis to ``?j``, continuous from the right. The other extends from ``-1j`` along the imaginary axis to ``-?j``, continuous from the left. @@ -178,7 +178,7 @@ .. function:: atanh(x) - Return the hyperbolic arc tangent of *x*. There are two branch cuts: One + Return the inverse hyperbolic tangent of *x*. There are two branch cuts: One extends from ``1`` along the real axis to ``?``, continuous from below. The other extends from ``-1`` along the real axis to ``-?``, continuous from above. diff --git a/Modules/cmathmodule.c b/Modules/cmathmodule.c --- a/Modules/cmathmodule.c +++ b/Modules/cmathmodule.c @@ -192,7 +192,7 @@ PyDoc_STRVAR(c_acosh_doc, "acosh(x)\n" "\n" -"Return the hyperbolic arccosine of x."); +"Return the inverse hyperbolic cosine of x."); static Py_complex @@ -249,7 +249,7 @@ PyDoc_STRVAR(c_asinh_doc, "asinh(x)\n" "\n" -"Return the hyperbolic arc sine of x."); +"Return the inverse hyperbolic sine of x."); static Py_complex @@ -353,7 +353,7 @@ PyDoc_STRVAR(c_atanh_doc, "atanh(x)\n" "\n" -"Return the hyperbolic arc tangent of x."); +"Return the inverse hyperbolic tangent of x."); static Py_complex diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -809,18 +809,18 @@ FUNC1(acos, acos, 0, "acos(x)\n\nReturn the arc cosine (measured in radians) of x.") FUNC1(acosh, m_acosh, 0, - "acosh(x)\n\nReturn the hyperbolic arc cosine (measured in radians) of x.") + "acosh(x)\n\nReturn the inverse hyperbolic cosine of x.") FUNC1(asin, asin, 0, "asin(x)\n\nReturn the arc sine (measured in radians) of x.") FUNC1(asinh, m_asinh, 0, - "asinh(x)\n\nReturn the hyperbolic arc sine (measured in radians) of x.") + "asinh(x)\n\nReturn the inverse hyperbolic sine of x.") FUNC1(atan, atan, 0, "atan(x)\n\nReturn the arc tangent (measured in radians) of x.") FUNC2(atan2, m_atan2, "atan2(y, x)\n\nReturn the arc tangent (measured in radians) of y/x.\n" "Unlike atan(y/x), the signs of both x and y are considered.") FUNC1(atanh, m_atanh, 0, - "atanh(x)\n\nReturn the hyperbolic arc tangent (measured in radians) of x.") + "atanh(x)\n\nReturn the inverse hyperbolic tangent of x.") FUNC1(ceil, ceil, 0, "ceil(x)\n\nReturn the ceiling of x as a float.\n" "This is the smallest integral value >= x.") -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 14:23:44 2015 From: python-checkins at python.org (mark.dickinson) Date: Sun, 11 Jan 2015 13:23:44 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIxOTAy?= =?utf-8?q?=3A_Replace_incorrect_=27hyperbolic_arc_sine=27_=28etc=2E=29_wi?= =?utf-8?q?th_=27inverse?= Message-ID: 
<20150111132335.8745.27549@psf.io> https://hg.python.org/cpython/rev/5edfc6c929f9 changeset: 94109:5edfc6c929f9 branch: 3.4 parent: 94103:79f33147949b user: Mark Dickinson date: Sun Jan 11 13:08:05 2015 +0000 summary: Issue #21902: Replace incorrect 'hyperbolic arc sine' (etc.) with 'inverse hyperbolic sine' (etc.). Remove meaningless reference to radians. files: Doc/library/cmath.rst | 8 ++++---- Modules/cmathmodule.c | 6 +++--- Modules/mathmodule.c | 6 +++--- 3 files changed, 10 insertions(+), 10 deletions(-) diff --git a/Doc/library/cmath.rst b/Doc/library/cmath.rst --- a/Doc/library/cmath.rst +++ b/Doc/library/cmath.rst @@ -149,13 +149,13 @@ .. function:: acosh(x) - Return the hyperbolic arc cosine of *x*. There is one branch cut, extending left - from 1 along the real axis to -?, continuous from above. + Return the inverse hyperbolic cosine of *x*. There is one branch cut, + extending left from 1 along the real axis to -?, continuous from above. .. function:: asinh(x) - Return the hyperbolic arc sine of *x*. There are two branch cuts: + Return the inverse hyperbolic sine of *x*. There are two branch cuts: One extends from ``1j`` along the imaginary axis to ``?j``, continuous from the right. The other extends from ``-1j`` along the imaginary axis to ``-?j``, continuous from the left. @@ -163,7 +163,7 @@ .. function:: atanh(x) - Return the hyperbolic arc tangent of *x*. There are two branch cuts: One + Return the inverse hyperbolic tangent of *x*. There are two branch cuts: One extends from ``1`` along the real axis to ``?``, continuous from below. The other extends from ``-1`` along the real axis to ``-?``, continuous from above. diff --git a/Modules/cmathmodule.c b/Modules/cmathmodule.c --- a/Modules/cmathmodule.c +++ b/Modules/cmathmodule.c @@ -192,7 +192,7 @@ PyDoc_STRVAR(c_acosh_doc, "acosh(x)\n" "\n" -"Return the hyperbolic arccosine of x."); +"Return the inverse hyperbolic cosine of x."); static Py_complex @@ -249,7 +249,7 @@ PyDoc_STRVAR(c_asinh_doc, "asinh(x)\n" "\n" -"Return the hyperbolic arc sine of x."); +"Return the inverse hyperbolic sine of x."); static Py_complex @@ -353,7 +353,7 @@ PyDoc_STRVAR(c_atanh_doc, "atanh(x)\n" "\n" -"Return the hyperbolic arc tangent of x."); +"Return the inverse hyperbolic tangent of x."); static Py_complex diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -873,18 +873,18 @@ FUNC1(acos, acos, 0, "acos(x)\n\nReturn the arc cosine (measured in radians) of x.") FUNC1(acosh, m_acosh, 0, - "acosh(x)\n\nReturn the hyperbolic arc cosine (measured in radians) of x.") + "acosh(x)\n\nReturn the inverse hyperbolic cosine of x.") FUNC1(asin, asin, 0, "asin(x)\n\nReturn the arc sine (measured in radians) of x.") FUNC1(asinh, m_asinh, 0, - "asinh(x)\n\nReturn the hyperbolic arc sine (measured in radians) of x.") + "asinh(x)\n\nReturn the inverse hyperbolic sine of x.") FUNC1(atan, atan, 0, "atan(x)\n\nReturn the arc tangent (measured in radians) of x.") FUNC2(atan2, m_atan2, "atan2(y, x)\n\nReturn the arc tangent (measured in radians) of y/x.\n" "Unlike atan(y/x), the signs of both x and y are considered.") FUNC1(atanh, m_atanh, 0, - "atanh(x)\n\nReturn the hyperbolic arc tangent (measured in radians) of x.") + "atanh(x)\n\nReturn the inverse hyperbolic tangent of x.") static PyObject * math_ceil(PyObject *self, PyObject *number) { _Py_IDENTIFIER(__ceil__); -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 14:23:44 2015 From: python-checkins at 
python.org (mark.dickinson) Date: Sun, 11 Jan 2015 13:23:44 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2321092=3A_Merge_from_3=2E4=2E?= Message-ID: <20150111132335.125882.8030@psf.io> https://hg.python.org/cpython/rev/36099a05d76a changeset: 94110:36099a05d76a parent: 94107:cf4bf577749c parent: 94109:5edfc6c929f9 user: Mark Dickinson date: Sun Jan 11 13:22:44 2015 +0000 summary: Issue #21092: Merge from 3.4. files: Doc/library/cmath.rst | 8 ++++---- Modules/clinic/cmathmodule.c.h | 6 +++--- Modules/cmathmodule.c | 6 +++--- Modules/mathmodule.c | 6 +++--- 4 files changed, 13 insertions(+), 13 deletions(-) diff --git a/Doc/library/cmath.rst b/Doc/library/cmath.rst --- a/Doc/library/cmath.rst +++ b/Doc/library/cmath.rst @@ -149,13 +149,13 @@ .. function:: acosh(x) - Return the hyperbolic arc cosine of *x*. There is one branch cut, extending left - from 1 along the real axis to -?, continuous from above. + Return the inverse hyperbolic cosine of *x*. There is one branch cut, + extending left from 1 along the real axis to -?, continuous from above. .. function:: asinh(x) - Return the hyperbolic arc sine of *x*. There are two branch cuts: + Return the inverse hyperbolic sine of *x*. There are two branch cuts: One extends from ``1j`` along the imaginary axis to ``?j``, continuous from the right. The other extends from ``-1j`` along the imaginary axis to ``-?j``, continuous from the left. @@ -163,7 +163,7 @@ .. function:: atanh(x) - Return the hyperbolic arc tangent of *x*. There are two branch cuts: One + Return the inverse hyperbolic tangent of *x*. There are two branch cuts: One extends from ``1`` along the real axis to ``?``, continuous from below. The other extends from ``-1`` along the real axis to ``-?``, continuous from above. diff --git a/Modules/clinic/cmathmodule.c.h b/Modules/clinic/cmathmodule.c.h --- a/Modules/clinic/cmathmodule.c.h +++ b/Modules/clinic/cmathmodule.c.h @@ -49,7 +49,7 @@ "acosh($module, z, /)\n" "--\n" "\n" -"Return the hyperbolic arccosine of z."); +"Return the inverse hyperbolic cosine of z."); #define CMATH_ACOSH_METHODDEF \ {"acosh", (PyCFunction)cmath_acosh, METH_VARARGS, cmath_acosh__doc__}, @@ -135,7 +135,7 @@ "asinh($module, z, /)\n" "--\n" "\n" -"Return the hyperbolic arc sine of z."); +"Return the inverse hyperbolic sine of z."); #define CMATH_ASINH_METHODDEF \ {"asinh", (PyCFunction)cmath_asinh, METH_VARARGS, cmath_asinh__doc__}, @@ -221,7 +221,7 @@ "atanh($module, z, /)\n" "--\n" "\n" -"Return the hyperbolic arc tangent of z."); +"Return the inverse hyperbolic tangent of z."); #define CMATH_ATANH_METHODDEF \ {"atanh", (PyCFunction)cmath_atanh, METH_VARARGS, cmath_atanh__doc__}, diff --git a/Modules/cmathmodule.c b/Modules/cmathmodule.c --- a/Modules/cmathmodule.c +++ b/Modules/cmathmodule.c @@ -207,7 +207,7 @@ /*[clinic input] cmath.acosh = cmath.acos -Return the hyperbolic arccosine of z. +Return the inverse hyperbolic cosine of z. [clinic start generated code]*/ static Py_complex @@ -262,7 +262,7 @@ /*[clinic input] cmath.asinh = cmath.acos -Return the hyperbolic arc sine of z. +Return the inverse hyperbolic sine of z. [clinic start generated code]*/ static Py_complex @@ -353,7 +353,7 @@ /*[clinic input] cmath.atanh = cmath.acos -Return the hyperbolic arc tangent of z. +Return the inverse hyperbolic tangent of z. 
[clinic start generated code]*/ static Py_complex diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -902,18 +902,18 @@ FUNC1(acos, acos, 0, "acos(x)\n\nReturn the arc cosine (measured in radians) of x.") FUNC1(acosh, m_acosh, 0, - "acosh(x)\n\nReturn the hyperbolic arc cosine (measured in radians) of x.") + "acosh(x)\n\nReturn the inverse hyperbolic cosine of x.") FUNC1(asin, asin, 0, "asin(x)\n\nReturn the arc sine (measured in radians) of x.") FUNC1(asinh, m_asinh, 0, - "asinh(x)\n\nReturn the hyperbolic arc sine (measured in radians) of x.") + "asinh(x)\n\nReturn the inverse hyperbolic sine of x.") FUNC1(atan, atan, 0, "atan(x)\n\nReturn the arc tangent (measured in radians) of x.") FUNC2(atan2, m_atan2, "atan2(y, x)\n\nReturn the arc tangent (measured in radians) of y/x.\n" "Unlike atan(y/x), the signs of both x and y are considered.") FUNC1(atanh, m_atanh, 0, - "atanh(x)\n\nReturn the hyperbolic arc tangent (measured in radians) of x.") + "atanh(x)\n\nReturn the inverse hyperbolic tangent of x.") static PyObject * math_ceil(PyObject *self, PyObject *number) { _Py_IDENTIFIER(__ceil__); -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 15:07:08 2015 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 11 Jan 2015 14:07:08 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2322952=3A_improve_multiprocessing_doc_introducti?= =?utf-8?q?on_and_defer_notes_until?= Message-ID: <20150111140707.11591.10135@psf.io> https://hg.python.org/cpython/rev/a65c23ea5f9e changeset: 94112:a65c23ea5f9e parent: 94110:36099a05d76a parent: 94111:a9a9c71f8e15 user: Antoine Pitrou date: Sun Jan 11 15:06:39 2015 +0100 summary: Issue #22952: improve multiprocessing doc introduction and defer notes until appropriate. Patch by Davin Potts. files: Doc/library/multiprocessing.rst | 92 +++++++++++++------- 1 files changed, 57 insertions(+), 35 deletions(-) diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -16,41 +16,27 @@ leverage multiple processors on a given machine. It runs on both Unix and Windows. -.. note:: - - Some of this package's functionality requires a functioning shared semaphore - implementation on the host operating system. Without one, the - :mod:`multiprocessing.synchronize` module will be disabled, and attempts to - import it will result in an :exc:`ImportError`. See - :issue:`3770` for additional information. - -.. note:: - - Functionality within this package requires that the ``__main__`` module be - importable by the children. This is covered in :ref:`multiprocessing-programming` - however it is worth pointing out here. This means that some examples, such - as the :class:`multiprocessing.pool.Pool` examples will not work in the - interactive interpreter. For example:: - - >>> from multiprocessing import Pool - >>> p = Pool(5) - >>> def f(x): - ... return x*x - ... 
- >>> p.map(f, [1,2,3]) - Process PoolWorker-1: - Process PoolWorker-2: - Process PoolWorker-3: - Traceback (most recent call last): - Traceback (most recent call last): - Traceback (most recent call last): - AttributeError: 'module' object has no attribute 'f' - AttributeError: 'module' object has no attribute 'f' - AttributeError: 'module' object has no attribute 'f' - - (If you try this it will actually output three full tracebacks - interleaved in a semi-random fashion, and then you may have to - stop the master process somehow.) +The :mod:`multiprocessing` module also introduces APIs which do not have +analogs in the :mod:`threading` module. A prime example of this is the +:class:`~multiprocessing.pool.Pool` object which offers a convenient means of +parallelizing the execution of a function across multiple input values, +distributing the input data across processes (data parallelism). The following +example demonstrates the common practice of defining such functions in a module +so that child processes can successfully import that module. This basic example +of data parallelism using :class:`~multiprocessing.pool.Pool`, :: + + from multiprocessing import Pool + + def f(x): + return x*x + + if __name__ == '__main__': + with Pool(5) as p: + print(p.map(f, [1, 2, 3])) + +will print to standard output :: + + [1, 4, 9] The :class:`Process` class @@ -276,6 +262,14 @@ Without using the lock output from the different processes is liable to get all mixed up. +.. note:: + + Some of this package's functionality requires a functioning shared semaphore + implementation on the host operating system. Without one, the + :mod:`multiprocessing.synchronize` module will be disabled, and attempts to + import it will result in an :exc:`ImportError`. See + :issue:`3770` for additional information. + Sharing state between processes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -406,6 +400,34 @@ Note that the methods of a pool should only ever be used by the process which created it. +.. note:: + + Functionality within this package requires that the ``__main__`` module be + importable by the children. This is covered in :ref:`multiprocessing-programming` + however it is worth pointing out here. This means that some examples, such + as the :class:`multiprocessing.pool.Pool` examples will not work in the + interactive interpreter. For example:: + + >>> from multiprocessing import Pool + >>> p = Pool(5) + >>> def f(x): + ... return x*x + ... + >>> p.map(f, [1,2,3]) + Process PoolWorker-1: + Process PoolWorker-2: + Process PoolWorker-3: + Traceback (most recent call last): + Traceback (most recent call last): + Traceback (most recent call last): + AttributeError: 'module' object has no attribute 'f' + AttributeError: 'module' object has no attribute 'f' + AttributeError: 'module' object has no attribute 'f' + + (If you try this it will actually output three full tracebacks + interleaved in a semi-random fashion, and then you may have to + stop the master process somehow.) 
+ Reference --------- -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 15:07:08 2015 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 11 Jan 2015 14:07:08 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIyOTUy?= =?utf-8?q?=3A_improve_multiprocessing_doc_introduction_and_defer_notes_un?= =?utf-8?q?til?= Message-ID: <20150111140707.125904.97891@psf.io> https://hg.python.org/cpython/rev/a9a9c71f8e15 changeset: 94111:a9a9c71f8e15 branch: 3.4 parent: 94109:5edfc6c929f9 user: Antoine Pitrou date: Sun Jan 11 15:05:29 2015 +0100 summary: Issue #22952: improve multiprocessing doc introduction and defer notes until appropriate. Patch by Davin Potts. files: Doc/library/multiprocessing.rst | 92 +++++++++++++------- 1 files changed, 57 insertions(+), 35 deletions(-) diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -16,41 +16,27 @@ leverage multiple processors on a given machine. It runs on both Unix and Windows. -.. note:: - - Some of this package's functionality requires a functioning shared semaphore - implementation on the host operating system. Without one, the - :mod:`multiprocessing.synchronize` module will be disabled, and attempts to - import it will result in an :exc:`ImportError`. See - :issue:`3770` for additional information. - -.. note:: - - Functionality within this package requires that the ``__main__`` module be - importable by the children. This is covered in :ref:`multiprocessing-programming` - however it is worth pointing out here. This means that some examples, such - as the :class:`multiprocessing.pool.Pool` examples will not work in the - interactive interpreter. For example:: - - >>> from multiprocessing import Pool - >>> p = Pool(5) - >>> def f(x): - ... return x*x - ... - >>> p.map(f, [1,2,3]) - Process PoolWorker-1: - Process PoolWorker-2: - Process PoolWorker-3: - Traceback (most recent call last): - Traceback (most recent call last): - Traceback (most recent call last): - AttributeError: 'module' object has no attribute 'f' - AttributeError: 'module' object has no attribute 'f' - AttributeError: 'module' object has no attribute 'f' - - (If you try this it will actually output three full tracebacks - interleaved in a semi-random fashion, and then you may have to - stop the master process somehow.) +The :mod:`multiprocessing` module also introduces APIs which do not have +analogs in the :mod:`threading` module. A prime example of this is the +:class:`~multiprocessing.pool.Pool` object which offers a convenient means of +parallelizing the execution of a function across multiple input values, +distributing the input data across processes (data parallelism). The following +example demonstrates the common practice of defining such functions in a module +so that child processes can successfully import that module. This basic example +of data parallelism using :class:`~multiprocessing.pool.Pool`, :: + + from multiprocessing import Pool + + def f(x): + return x*x + + if __name__ == '__main__': + with Pool(5) as p: + print(p.map(f, [1, 2, 3])) + +will print to standard output :: + + [1, 4, 9] The :class:`Process` class @@ -276,6 +262,14 @@ Without using the lock output from the different processes is liable to get all mixed up. +.. note:: + + Some of this package's functionality requires a functioning shared semaphore + implementation on the host operating system. 
Without one, the + :mod:`multiprocessing.synchronize` module will be disabled, and attempts to + import it will result in an :exc:`ImportError`. See + :issue:`3770` for additional information. + Sharing state between processes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -406,6 +400,34 @@ Note that the methods of a pool should only ever be used by the process which created it. +.. note:: + + Functionality within this package requires that the ``__main__`` module be + importable by the children. This is covered in :ref:`multiprocessing-programming` + however it is worth pointing out here. This means that some examples, such + as the :class:`multiprocessing.pool.Pool` examples will not work in the + interactive interpreter. For example:: + + >>> from multiprocessing import Pool + >>> p = Pool(5) + >>> def f(x): + ... return x*x + ... + >>> p.map(f, [1,2,3]) + Process PoolWorker-1: + Process PoolWorker-2: + Process PoolWorker-3: + Traceback (most recent call last): + Traceback (most recent call last): + Traceback (most recent call last): + AttributeError: 'module' object has no attribute 'f' + AttributeError: 'module' object has no attribute 'f' + AttributeError: 'module' object has no attribute 'f' + + (If you try this it will actually output three full tracebacks + interleaved in a semi-random fashion, and then you may have to + stop the master process somehow.) + Reference --------- -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 15:09:38 2015 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 11 Jan 2015 14:09:38 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzIyOTUy?= =?utf-8?q?=3A_improve_multiprocessing_doc_introduction_and_defer_notes_un?= =?utf-8?q?til?= Message-ID: <20150111140935.125896.30028@psf.io> https://hg.python.org/cpython/rev/e7d03a33e675 changeset: 94113:e7d03a33e675 branch: 2.7 parent: 94108:d81cabb39de3 user: Antoine Pitrou date: Sun Jan 11 15:09:27 2015 +0100 summary: Issue #22952: improve multiprocessing doc introduction and defer notes until appropriate. Patch by Davin Potts. files: Doc/library/multiprocessing.rst | 93 ++++++++++++-------- 1 files changed, 57 insertions(+), 36 deletions(-) diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -18,41 +18,27 @@ leverage multiple processors on a given machine. It runs on both Unix and Windows. -.. warning:: - - Some of this package's functionality requires a functioning shared semaphore - implementation on the host operating system. Without one, the - :mod:`multiprocessing.synchronize` module will be disabled, and attempts to - import it will result in an :exc:`ImportError`. See - :issue:`3770` for additional information. - -.. note:: - - Functionality within this package requires that the ``__main__`` module be - importable by the children. This is covered in :ref:`multiprocessing-programming` - however it is worth pointing out here. This means that some examples, such - as the :class:`multiprocessing.Pool` examples will not work in the - interactive interpreter. For example:: - - >>> from multiprocessing import Pool - >>> p = Pool(5) - >>> def f(x): - ... return x*x - ... 
- >>> p.map(f, [1,2,3]) - Process PoolWorker-1: - Process PoolWorker-2: - Process PoolWorker-3: - Traceback (most recent call last): - Traceback (most recent call last): - Traceback (most recent call last): - AttributeError: 'module' object has no attribute 'f' - AttributeError: 'module' object has no attribute 'f' - AttributeError: 'module' object has no attribute 'f' - - (If you try this it will actually output three full tracebacks - interleaved in a semi-random fashion, and then you may have to - stop the master process somehow.) +The :mod:`multiprocessing` module also introduces APIs which do not have +analogs in the :mod:`threading` module. A prime example of this is the +:class:`Pool` object which offers a convenient means of parallelizing the +execution of a function across multiple input values, distributing the +input data across processes (data parallelism). The following example +demonstrates the common practice of defining such functions in a module so +that child processes can successfully import that module. This basic example +of data parallelism using :class:`Pool`, :: + + from multiprocessing import Pool + + def f(x): + return x*x + + if __name__ == '__main__': + p = Pool(5) + print(p.map(f, [1, 2, 3])) + +will print to standard output :: + + [1, 4, 9] The :class:`Process` class @@ -99,7 +85,6 @@ necessary, see :ref:`multiprocessing-programming`. - Exchanging objects between processes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -175,6 +160,14 @@ Without using the lock output from the different processes is liable to get all mixed up. +.. warning:: + + Some of this package's functionality requires a functioning shared semaphore + implementation on the host operating system. Without one, the + :mod:`multiprocessing.synchronize` module will be disabled, and attempts to + import it will result in an :exc:`ImportError`. See + :issue:`3770` for additional information. + Sharing state between processes ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -290,6 +283,34 @@ Note that the methods of a pool should only ever be used by the process which created it. +.. note:: + + Functionality within this package requires that the ``__main__`` module be + importable by the children. This is covered in :ref:`multiprocessing-programming` + however it is worth pointing out here. This means that some examples, such + as the :class:`Pool` examples will not work in the interactive interpreter. + For example:: + + >>> from multiprocessing import Pool + >>> p = Pool(5) + >>> def f(x): + ... return x*x + ... + >>> p.map(f, [1,2,3]) + Process PoolWorker-1: + Process PoolWorker-2: + Process PoolWorker-3: + Traceback (most recent call last): + Traceback (most recent call last): + Traceback (most recent call last): + AttributeError: 'module' object has no attribute 'f' + AttributeError: 'module' object has no attribute 'f' + AttributeError: 'module' object has no attribute 'f' + + (If you try this it will actually output three full tracebacks + interleaved in a semi-random fashion, and then you may have to + stop the master process somehow.) 
+ Reference --------- -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 21:22:26 2015 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 11 Jan 2015 20:22:26 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_remove_extra_definite_arti?= =?utf-8?q?cle?= Message-ID: <20150111202226.11591.86096@psf.io> https://hg.python.org/cpython/rev/ca5fe2148490 changeset: 94115:ca5fe2148490 user: Benjamin Peterson date: Sun Jan 11 15:22:07 2015 -0500 summary: remove extra definite article files: Lib/ssl.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -573,8 +573,8 @@ return self._sslobj.cipher() def shared_ciphers(self): - """Return the a list of ciphers shared by the client during the - handshake or None if this is not a valid server connection. + """Return a list of ciphers shared by the client during the handshake or + None if this is not a valid server connection. """ return self._sslobj.shared_ciphers() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 21:49:48 2015 From: python-checkins at python.org (donald.stufft) Date: Sun, 11 Jan 2015 20:49:48 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Bump_setuptool?= =?utf-8?q?s_to_11=2E3=2E1?= Message-ID: <20150111204943.8753.8140@psf.io> https://hg.python.org/cpython/rev/ad92f9b2fe04 changeset: 94116:ad92f9b2fe04 branch: 2.7 parent: 94113:e7d03a33e675 user: Donald Stufft date: Sun Jan 11 15:49:22 2015 -0500 summary: Bump setuptools to 11.3.1 files: Lib/ensurepip/__init__.py | 2 +- Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-11.3.1-py2.py3-none-any.whl | Bin 3 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py --- a/Lib/ensurepip/__init__.py +++ b/Lib/ensurepip/__init__.py @@ -12,7 +12,7 @@ __all__ = ["version", "bootstrap"] -_SETUPTOOLS_VERSION = "11.0" +_SETUPTOOLS_VERSION = "11.3.1" _PIP_VERSION = "6.0.6" diff --git a/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl deleted file mode 100644 index ceeaa3cd5c326495156cb90ed3934ee5b712ad2e..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-11.3.1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-11.3.1-py2.py3-none-any.whl new file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..6892ff89b4f916c40a0044c18d4b41dbf73340e3 GIT binary patch [stripped] -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 21:51:19 2015 From: python-checkins at python.org (donald.stufft) Date: Sun, 11 Jan 2015 20:51:19 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_Update_setupto?= =?utf-8?q?ols_to_11=2E3=2E1?= Message-ID: <20150111205118.72575.47284@psf.io> https://hg.python.org/cpython/rev/1b145e8ae4be changeset: 94117:1b145e8ae4be branch: 3.4 parent: 94111:a9a9c71f8e15 user: Donald Stufft date: Sun Jan 11 15:51:11 2015 -0500 summary: Update setuptools to 11.3.1 files: Lib/ensurepip/__init__.py | 2 +- Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-11.3.1-py2.py3-none-any.whl | Bin 3 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py --- a/Lib/ensurepip/__init__.py +++ 
b/Lib/ensurepip/__init__.py @@ -8,7 +8,7 @@ __all__ = ["version", "bootstrap"] -_SETUPTOOLS_VERSION = "11.0" +_SETUPTOOLS_VERSION = "11.3.1" _PIP_VERSION = "6.0.6" diff --git a/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl deleted file mode 100644 index ceeaa3cd5c326495156cb90ed3934ee5b712ad2e..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-11.3.1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-11.3.1-py2.py3-none-any.whl new file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..6892ff89b4f916c40a0044c18d4b41dbf73340e3 GIT binary patch [stripped] -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Sun Jan 11 21:53:26 2015 From: python-checkins at python.org (donald.stufft) Date: Sun, 11 Jan 2015 20:53:26 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merged_3=2E4_into_default?= Message-ID: <20150111205326.72575.34916@psf.io> https://hg.python.org/cpython/rev/8b3c609f3f73 changeset: 94118:8b3c609f3f73 parent: 94115:ca5fe2148490 parent: 94117:1b145e8ae4be user: Donald Stufft date: Sun Jan 11 15:53:02 2015 -0500 summary: Merged 3.4 into default files: Lib/ensurepip/__init__.py | 2 +- Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-11.3.1-py2.py3-none-any.whl | Bin 3 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py --- a/Lib/ensurepip/__init__.py +++ b/Lib/ensurepip/__init__.py @@ -8,7 +8,7 @@ __all__ = ["version", "bootstrap"] -_SETUPTOOLS_VERSION = "11.0" +_SETUPTOOLS_VERSION = "11.3.1" _PIP_VERSION = "6.0.6" diff --git a/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-11.0-py2.py3-none-any.whl deleted file mode 100644 index ceeaa3cd5c326495156cb90ed3934ee5b712ad2e..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-11.3.1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-11.3.1-py2.py3-none-any.whl new file mode 100644 index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..6892ff89b4f916c40a0044c18d4b41dbf73340e3 GIT binary patch [stripped] -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Mon Jan 12 08:53:57 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 12 Jan 2015 08:53:57 +0100 Subject: [Python-checkins] Daily reference leaks (8b3c609f3f73): sum=-3 Message-ID: results for 8b3c609f3f73 on branch "default" -------------------------------------------- test_collections leaked [-4, 0, 0] references, sum=-4 test_collections leaked [-2, 0, 0] memory blocks, sum=-2 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogKzw4VR', '-x'] From python-checkins at python.org Mon Jan 12 10:58:57 2015 From: python-checkins at python.org (larry.hastings) Date: Mon, 12 Jan 2015 09:58:57 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Removed_Ronald_Oussoren_from_?= =?utf-8?q?being_listed_as_Mac_PE_for_3=2E4_and_3=2E5=2E?= Message-ID: <20150112095856.72571.9080@psf.io> https://hg.python.org/peps/rev/59dfaaf19a14 changeset: 5670:59dfaaf19a14 user: Larry Hastings date: Mon Jan 12 01:58:52 2015 -0800 summary: Removed Ronald Oussoren from being listed as Mac PE for 3.4 and 3.5. 
Ned has been the only Mac PE for 3.4, and will be again for 3.5. files: pep-0429.txt | 2 +- pep-0478.txt | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/pep-0429.txt b/pep-0429.txt --- a/pep-0429.txt +++ b/pep-0429.txt @@ -27,7 +27,7 @@ - 3.4 Release Manager: Larry Hastings - Windows installers: Martin v. L?wis -- Mac installers: Ned Deily / Ronald Oussoren +- Mac installers: Ned Deily - Documentation: Georg Brandl diff --git a/pep-0478.txt b/pep-0478.txt --- a/pep-0478.txt +++ b/pep-0478.txt @@ -27,7 +27,7 @@ - 3.5 Release Manager: Larry Hastings - Windows installers: Steve Dower -- Mac installers: Ned Deily / Ronald Oussoren +- Mac installers: Ned Deily - Documentation: Georg Brandl -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Mon Jan 12 21:45:53 2015 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 12 Jan 2015 20:45:53 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319777=3A_Provide_?= =?utf-8?q?a_home=28=29_classmethod_on_Path_objects=2E?= Message-ID: <20150112204538.22399.41892@psf.io> https://hg.python.org/cpython/rev/4a55b98314cd changeset: 94119:4a55b98314cd user: Antoine Pitrou date: Mon Jan 12 21:03:41 2015 +0100 summary: Issue #19777: Provide a home() classmethod on Path objects. Contributed by Victor Salgado and Mayank Tripathi. files: Doc/library/pathlib.rst | 11 +++++++++++ Lib/pathlib.py | 7 +++++++ Lib/test/test_pathlib.py | 11 +++++++++++ Misc/ACKS | 2 ++ Misc/NEWS | 3 +++ 5 files changed, 34 insertions(+), 0 deletions(-) diff --git a/Doc/library/pathlib.rst b/Doc/library/pathlib.rst --- a/Doc/library/pathlib.rst +++ b/Doc/library/pathlib.rst @@ -628,6 +628,17 @@ PosixPath('/home/antoine/pathlib') +.. classmethod:: Path.home() + + Return a new path object representing the user's home directory (as + returned by :func:`os.path.expanduser` with ``~`` construct):: + + >>> Path.home() + PosixPath('/home/antoine') + + .. versionadded:: 3.5 + + .. method:: Path.stat() Return information about this path (similarly to :func:`os.stat`). diff --git a/Lib/pathlib.py b/Lib/pathlib.py --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -1008,6 +1008,13 @@ """ return cls(os.getcwd()) + @classmethod + def home(cls): + """Return a new path pointing to the user's home directory (as + returned by os.path.expanduser('~')). + """ + return cls(cls()._flavour.gethomedir(None)) + def samefile(self, other_path): """Return whether `other_file` is the same or not as this file. (as returned by os.path.samefile(file, other_file)). 
diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1261,6 +1261,17 @@ p = self.cls.cwd() self._test_cwd(p) + def _test_home(self, p): + q = self.cls(os.path.expanduser('~')) + self.assertEqual(p, q) + self.assertEqual(str(p), str(q)) + self.assertIs(type(p), type(q)) + self.assertTrue(p.is_absolute()) + + def test_home(self): + p = self.cls.home() + self._test_home(p) + def test_samefile(self): fileA_path = os.path.join(BASE, 'fileA') fileB_path = os.path.join(BASE, 'dirB', 'fileB') diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1201,6 +1201,7 @@ Suman Saha Hajime Saitou George Sakkis +Victor Salgado Rich Salz Kevin Samborn Adrian Sampson @@ -1390,6 +1391,7 @@ Nathan Trapuzzano Laurence Tratt Alberto Trevino +Mayank Tripathi Matthias Troffaes Tom Tromey John Tromp diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -203,6 +203,9 @@ Library ------- +- Issue #19777: Provide a home() classmethod on Path objects. Contributed + by Victor Salgado and Mayank Tripathi. + - Issue #23206: Make ``json.dumps(..., ensure_ascii=False)`` as fast as the default case of ``ensure_ascii=True``. Patch by Naoki Inada. -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Tue Jan 13 09:20:25 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 13 Jan 2015 09:20:25 +0100 Subject: [Python-checkins] Daily reference leaks (4a55b98314cd): sum=6 Message-ID: results for 4a55b98314cd on branch "default" -------------------------------------------- test_asyncio leaked [0, 3, 0] memory blocks, sum=3 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogAZ_5D3', '-x'] From python-checkins at python.org Tue Jan 13 10:01:58 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 13 Jan 2015 09:01:58 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2323209=2C_=2323225=3A_selectors=2EBaseSelector?= =?utf-8?q?=2Eget=5Fkey=28=29_now_raises_a?= Message-ID: <20150113090148.8745.98932@psf.io> https://hg.python.org/cpython/rev/6e7403bc906f changeset: 94121:6e7403bc906f parent: 94119:4a55b98314cd parent: 94120:1544bdc409be user: Victor Stinner date: Tue Jan 13 10:00:55 2015 +0100 summary: Issue #23209, #23225: selectors.BaseSelector.get_key() now raises a RuntimeError if the selector is closed. And selectors.BaseSelector.close() now clears its internal reference to the selector mapping to break a reference cycle. Initial patch written by Martin Richard. 
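(A minimal sketch of the merged behaviour described in this summary, not taken from the patch itself; the selector class and socket pair below are illustrative only:

    import selectors
    import socket

    rd, wr = socket.socketpair()          # assumes a Unix-style socketpair for brevity
    sel = selectors.DefaultSelector()
    sel.register(rd, selectors.EVENT_READ)
    mapping = sel.get_map()               # view backed by the selector's internal dict

    sel.close()                           # clears the internal dict and drops its mapping reference
    try:
        sel.get_key(rd)                   # now RuntimeError("Selector is closed"); was KeyError before
    except RuntimeError:
        pass
    # mapping[rd] would raise KeyError, since the underlying dict was cleared by close().
    rd.close(); wr.close()

This mirrors the updated test_selectors assertions in the diff below.)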
files: Lib/selectors.py | 3 +++ Lib/test/test_selectors.py | 11 +++++++---- Misc/NEWS | 5 +++++ 3 files changed, 15 insertions(+), 4 deletions(-) diff --git a/Lib/selectors.py b/Lib/selectors.py --- a/Lib/selectors.py +++ b/Lib/selectors.py @@ -174,6 +174,8 @@ SelectorKey for this file object """ mapping = self.get_map() + if mapping is None: + raise RuntimeError('Selector is closed') try: return mapping[fileobj] except KeyError: @@ -256,6 +258,7 @@ def close(self): self._fd_to_key.clear() + self._map = None def get_map(self): return self._map diff --git a/Lib/test/test_selectors.py b/Lib/test/test_selectors.py --- a/Lib/test/test_selectors.py +++ b/Lib/test/test_selectors.py @@ -178,14 +178,17 @@ s = self.SELECTOR() self.addCleanup(s.close) + mapping = s.get_map() rd, wr = self.make_socketpair() s.register(rd, selectors.EVENT_READ) s.register(wr, selectors.EVENT_WRITE) s.close() - self.assertRaises(KeyError, s.get_key, rd) - self.assertRaises(KeyError, s.get_key, wr) + self.assertRaises(RuntimeError, s.get_key, rd) + self.assertRaises(RuntimeError, s.get_key, wr) + self.assertRaises(KeyError, mapping.__getitem__, rd) + self.assertRaises(KeyError, mapping.__getitem__, wr) def test_get_key(self): s = self.SELECTOR() @@ -252,8 +255,8 @@ sel.register(rd, selectors.EVENT_READ) sel.register(wr, selectors.EVENT_WRITE) - self.assertRaises(KeyError, s.get_key, rd) - self.assertRaises(KeyError, s.get_key, wr) + self.assertRaises(RuntimeError, s.get_key, rd) + self.assertRaises(RuntimeError, s.get_key, wr) def test_fileno(self): s = self.SELECTOR() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -203,6 +203,11 @@ Library ------- +- Issue #23209, #23225: selectors.BaseSelector.get_key() now raises a + RuntimeError if the selector is closed. And selectors.BaseSelector.close() + now clears its internal reference to the selector mapping to break a + reference cycle. Initial patch written by Martin Richard. + - Issue #19777: Provide a home() classmethod on Path objects. Contributed by Victor Salgado and Mayank Tripathi. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 13 10:01:58 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 13 Jan 2015 09:01:58 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMjA5?= =?utf-8?q?=2C_=2323225=3A_selectors=2EBaseSelector=2Eclose=28=29_now_clea?= =?utf-8?q?rs_its_internal?= Message-ID: <20150113090148.125888.67105@psf.io> https://hg.python.org/cpython/rev/1544bdc409be changeset: 94120:1544bdc409be branch: 3.4 parent: 94117:1b145e8ae4be user: Victor Stinner date: Tue Jan 13 09:58:33 2015 +0100 summary: Issue #23209, #23225: selectors.BaseSelector.close() now clears its internal reference to the selector mapping to break a reference cycle. Initial patch written by Martin Richard. 
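(For the 3.4 branch the user-visible exception is unchanged; the point of the patch is that close() now drops the cached mapping so the selector -> mapping -> selector reference cycle can be collected. An illustrative sketch, assuming a Unix socketpair:

    import selectors
    import socket

    rd, wr = socket.socketpair()
    sel = selectors.DefaultSelector()
    sel.register(rd, selectors.EVENT_READ)

    sel.close()                  # on 3.4 this now also sets the cached map to None
    try:
        sel.get_key(rd)          # still KeyError("... is not registered") on 3.4
    except KeyError:
        pass
    rd.close(); wr.close()
)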
files: Lib/selectors.py | 3 +++ Lib/test/test_selectors.py | 3 +++ Misc/NEWS | 4 ++++ 3 files changed, 10 insertions(+), 0 deletions(-) diff --git a/Lib/selectors.py b/Lib/selectors.py --- a/Lib/selectors.py +++ b/Lib/selectors.py @@ -175,6 +175,8 @@ """ mapping = self.get_map() try: + if mapping is None: + raise KeyError return mapping[fileobj] except KeyError: raise KeyError("{!r} is not registered".format(fileobj)) from None @@ -256,6 +258,7 @@ def close(self): self._fd_to_key.clear() + self._map = None def get_map(self): return self._map diff --git a/Lib/test/test_selectors.py b/Lib/test/test_selectors.py --- a/Lib/test/test_selectors.py +++ b/Lib/test/test_selectors.py @@ -180,6 +180,7 @@ s = self.SELECTOR() self.addCleanup(s.close) + mapping = s.get_map() rd, wr = self.make_socketpair() s.register(rd, selectors.EVENT_READ) @@ -188,6 +189,8 @@ s.close() self.assertRaises(KeyError, s.get_key, rd) self.assertRaises(KeyError, s.get_key, wr) + self.assertRaises(KeyError, mapping.__getitem__, rd) + self.assertRaises(KeyError, mapping.__getitem__, wr) def test_get_key(self): s = self.SELECTOR() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -44,6 +44,10 @@ Library ------- +- Issue #23209, #23225: selectors.BaseSelector.close() now clears its internal + reference to the selector mapping to break a reference cycle. Initial patch + written by Martin Richard. + - Issue #21356: Make ssl.RAND_egd() optional to support LibreSSL. The availability of the function is checked during the compilation. Patch written by Bernard Spil. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 13 15:20:40 2015 From: python-checkins at python.org (benjamin.peterson) Date: Tue, 13 Jan 2015 14:20:40 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_fix_instances_?= =?utf-8?q?of_consecutive_articles_=28closes_=2323221=29?= Message-ID: <20150113142039.22413.56688@psf.io> https://hg.python.org/cpython/rev/b168c41f2e3f changeset: 94122:b168c41f2e3f branch: 3.4 parent: 94120:1544bdc409be user: Benjamin Peterson date: Tue Jan 13 09:17:24 2015 -0500 summary: fix instances of consecutive articles (closes #23221) Patch by Karan Goel. files: Doc/c-api/exceptions.rst | 2 +- Doc/c-api/init.rst | 2 +- Doc/c-api/structures.rst | 2 +- Doc/distutils/apiref.rst | 2 +- Doc/library/unittest.mock.rst | 2 +- Include/dynamic_annotations.h | 2 +- Include/unicodeobject.h | 2 +- Lib/distutils/dir_util.py | 2 +- Lib/http/cookiejar.py | 2 +- Lib/lib2to3/fixes/fix_exitfunc.py | 2 +- Lib/socket.py | 2 +- Lib/test/test_argparse.py | 2 +- Misc/ACKS | 1 + Modules/_ctypes/_ctypes.c | 2 +- 14 files changed, 14 insertions(+), 13 deletions(-) diff --git a/Doc/c-api/exceptions.rst b/Doc/c-api/exceptions.rst --- a/Doc/c-api/exceptions.rst +++ b/Doc/c-api/exceptions.rst @@ -64,7 +64,7 @@ Do not compare the return value to a specific exception; use :c:func:`PyErr_ExceptionMatches` instead, shown below. (The comparison could easily fail since the exception may be an instance instead of a class, in the - case of a class exception, or it may the a subclass of the expected exception.) + case of a class exception, or it may be a subclass of the expected exception.) .. c:function:: int PyErr_ExceptionMatches(PyObject *exc) diff --git a/Doc/c-api/init.rst b/Doc/c-api/init.rst --- a/Doc/c-api/init.rst +++ b/Doc/c-api/init.rst @@ -1182,7 +1182,7 @@ .. 
c:function:: PyThreadState * PyInterpreterState_ThreadHead(PyInterpreterState *interp) - Return the a pointer to the first :c:type:`PyThreadState` object in the list of + Return the pointer to the first :c:type:`PyThreadState` object in the list of threads associated with the interpreter *interp*. diff --git a/Doc/c-api/structures.rst b/Doc/c-api/structures.rst --- a/Doc/c-api/structures.rst +++ b/Doc/c-api/structures.rst @@ -131,7 +131,7 @@ types, but they always return :c:type:`PyObject\*`. If the function is not of the :c:type:`PyCFunction`, the compiler will require a cast in the method table. Even though :c:type:`PyCFunction` defines the first parameter as -:c:type:`PyObject\*`, it is common that the method implementation uses a the +:c:type:`PyObject\*`, it is common that the method implementation uses the specific C type of the *self* object. The :attr:`ml_flags` field is a bitfield which can include the following flags. diff --git a/Doc/distutils/apiref.rst b/Doc/distutils/apiref.rst --- a/Doc/distutils/apiref.rst +++ b/Doc/distutils/apiref.rst @@ -964,7 +964,7 @@ .. function:: create_tree(base_dir, files[, mode=0o777, verbose=0, dry_run=0]) Create all the empty directories under *base_dir* needed to put *files* there. - *base_dir* is just the a name of a directory which doesn't necessarily exist + *base_dir* is just the name of a directory which doesn't necessarily exist yet; *files* is a list of filenames to be interpreted relative to *base_dir*. *base_dir* + the directory portion of every file in *files* will be created if it doesn't already exist. *mode*, *verbose* and *dry_run* flags are as for diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -1499,7 +1499,7 @@ However, consider the alternative scenario where instead of ``from a import SomeClass`` module b does ``import a`` and ``some_function`` uses ``a.SomeClass``. Both of these import forms are common. In this case the class we want to patch is -being looked up on the a module and so we have to patch ``a.SomeClass`` instead:: +being looked up in the module and so we have to patch ``a.SomeClass`` instead:: @patch('a.SomeClass') diff --git a/Include/dynamic_annotations.h b/Include/dynamic_annotations.h --- a/Include/dynamic_annotations.h +++ b/Include/dynamic_annotations.h @@ -150,7 +150,7 @@ /* Report that a new memory at "address" of size "size" has been allocated. This might be used when the memory has been retrieved from a free list and - is about to be reused, or when a the locking discipline for a variable + is about to be reused, or when the locking discipline for a variable changes. */ #define _Py_ANNOTATE_NEW_MEMORY(address, size) \ AnnotateNewMemory(__FILE__, __LINE__, address, size) diff --git a/Include/unicodeobject.h b/Include/unicodeobject.h --- a/Include/unicodeobject.h +++ b/Include/unicodeobject.h @@ -605,7 +605,7 @@ ); #endif -/* Initializes the canonical string representation from a the deprecated +/* Initializes the canonical string representation from the deprecated wstr/Py_UNICODE representation. This function is used to convert Unicode objects which were created using the old API to the new flexible format introduced with PEP 393. diff --git a/Lib/distutils/dir_util.py b/Lib/distutils/dir_util.py --- a/Lib/distutils/dir_util.py +++ b/Lib/distutils/dir_util.py @@ -81,7 +81,7 @@ """Create all the empty directories under 'base_dir' needed to put 'files' there. 
- 'base_dir' is just the a name of a directory which doesn't necessarily + 'base_dir' is just the name of a directory which doesn't necessarily exist yet; 'files' is a list of filenames to be interpreted relative to 'base_dir'. 'base_dir' + the directory portion of every file in 'files' will be created if it doesn't already exist. 'mode', 'verbose' and diff --git a/Lib/http/cookiejar.py b/Lib/http/cookiejar.py --- a/Lib/http/cookiejar.py +++ b/Lib/http/cookiejar.py @@ -1792,7 +1792,7 @@ def lwp_cookie_str(cookie): - """Return string representation of Cookie in an the LWP cookie file format. + """Return string representation of Cookie in the LWP cookie file format. Actually, the format is extended a bit -- see module docstring. diff --git a/Lib/lib2to3/fixes/fix_exitfunc.py b/Lib/lib2to3/fixes/fix_exitfunc.py --- a/Lib/lib2to3/fixes/fix_exitfunc.py +++ b/Lib/lib2to3/fixes/fix_exitfunc.py @@ -35,7 +35,7 @@ self.sys_import = None def transform(self, node, results): - # First, find a the sys import. We'll just hope it's global scope. + # First, find the sys import. We'll just hope it's global scope. if "sys_import" in results: if self.sys_import is None: self.sys_import = results["sys_import"] diff --git a/Lib/socket.py b/Lib/socket.py --- a/Lib/socket.py +++ b/Lib/socket.py @@ -299,7 +299,7 @@ def fromshare(info): """ fromshare(info) -> socket object - Create a socket object from a the bytes object returned by + Create a socket object from the bytes object returned by socket.share(pid). """ return socket(0, 0, 0, info) diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -644,7 +644,7 @@ class TestOptionalsRequired(ParserTestCase): - """Tests the an optional action that is required""" + """Tests an optional action that is required""" argument_signatures = [ Sig('-x', type=int, required=True), diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -483,6 +483,7 @@ Matt Giuca Wim Glenn Michael Goderbauer +Karan Goel Jeroen Van Goey Christoph Gohlke Tim Golden diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -4830,7 +4830,7 @@ *(void **)self->b_ptr = dst->b_ptr; /* - A Pointer instance must keep a the value it points to alive. So, a + A Pointer instance must keep the value it points to alive. So, a pointer instance has b_length set to 2 instead of 1, and we set 'value' itself as the second item of the b_objects list, additionally. */ -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 13 15:20:39 2015 From: python-checkins at python.org (benjamin.peterson) Date: Tue, 13 Jan 2015 14:20:39 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_fix_instances_?= =?utf-8?q?of_consecutive_articles_=28closes_=2323221=29?= Message-ID: <20150113142039.11573.19012@psf.io> https://hg.python.org/cpython/rev/6a19e37ce94d changeset: 94123:6a19e37ce94d branch: 2.7 parent: 94116:ad92f9b2fe04 user: Benjamin Peterson date: Tue Jan 13 09:17:24 2015 -0500 summary: fix instances of consecutive articles (closes #23221) Patch by Karan Goel. 
files: Doc/c-api/exceptions.rst | 2 +- Doc/c-api/init.rst | 2 +- Doc/c-api/structures.rst | 2 +- Doc/distutils/apiref.rst | 2 +- Lib/_LWPCookieJar.py | 2 +- Lib/distutils/dir_util.py | 2 +- Lib/lib2to3/fixes/fix_exitfunc.py | 2 +- Lib/test/test_argparse.py | 2 +- Misc/ACKS | 1 + Modules/_ctypes/_ctypes.c | 2 +- 10 files changed, 10 insertions(+), 9 deletions(-) diff --git a/Doc/c-api/exceptions.rst b/Doc/c-api/exceptions.rst --- a/Doc/c-api/exceptions.rst +++ b/Doc/c-api/exceptions.rst @@ -70,7 +70,7 @@ Do not compare the return value to a specific exception; use :c:func:`PyErr_ExceptionMatches` instead, shown below. (The comparison could easily fail since the exception may be an instance instead of a class, in the - case of a class exception, or it may the a subclass of the expected exception.) + case of a class exception, or it may be a subclass of the expected exception.) .. c:function:: int PyErr_ExceptionMatches(PyObject *exc) diff --git a/Doc/c-api/init.rst b/Doc/c-api/init.rst --- a/Doc/c-api/init.rst +++ b/Doc/c-api/init.rst @@ -1136,7 +1136,7 @@ .. c:function:: PyThreadState * PyInterpreterState_ThreadHead(PyInterpreterState *interp) - Return the a pointer to the first :c:type:`PyThreadState` object in the list of + Return the pointer to the first :c:type:`PyThreadState` object in the list of threads associated with the interpreter *interp*. .. versionadded:: 2.2 diff --git a/Doc/c-api/structures.rst b/Doc/c-api/structures.rst --- a/Doc/c-api/structures.rst +++ b/Doc/c-api/structures.rst @@ -122,7 +122,7 @@ types, but they always return :c:type:`PyObject\*`. If the function is not of the :c:type:`PyCFunction`, the compiler will require a cast in the method table. Even though :c:type:`PyCFunction` defines the first parameter as -:c:type:`PyObject\*`, it is common that the method implementation uses a the +:c:type:`PyObject\*`, it is common that the method implementation uses the specific C type of the *self* object. The :attr:`ml_flags` field is a bitfield which can include the following flags. diff --git a/Doc/distutils/apiref.rst b/Doc/distutils/apiref.rst --- a/Doc/distutils/apiref.rst +++ b/Doc/distutils/apiref.rst @@ -970,7 +970,7 @@ .. function:: create_tree(base_dir, files[, mode=0777, verbose=0, dry_run=0]) Create all the empty directories under *base_dir* needed to put *files* there. - *base_dir* is just the a name of a directory which doesn't necessarily exist + *base_dir* is just the name of a directory which doesn't necessarily exist yet; *files* is a list of filenames to be interpreted relative to *base_dir*. *base_dir* + the directory portion of every file in *files* will be created if it doesn't already exist. *mode*, *verbose* and *dry_run* flags are as for diff --git a/Lib/_LWPCookieJar.py b/Lib/_LWPCookieJar.py --- a/Lib/_LWPCookieJar.py +++ b/Lib/_LWPCookieJar.py @@ -18,7 +18,7 @@ iso2time, time2isoz) def lwp_cookie_str(cookie): - """Return string representation of Cookie in an the LWP cookie file format. + """Return string representation of Cookie in the LWP cookie file format. Actually, the format is extended a bit -- see module docstring. diff --git a/Lib/distutils/dir_util.py b/Lib/distutils/dir_util.py --- a/Lib/distutils/dir_util.py +++ b/Lib/distutils/dir_util.py @@ -83,7 +83,7 @@ """Create all the empty directories under 'base_dir' needed to put 'files' there. 
- 'base_dir' is just the a name of a directory which doesn't necessarily + 'base_dir' is just the name of a directory which doesn't necessarily exist yet; 'files' is a list of filenames to be interpreted relative to 'base_dir'. 'base_dir' + the directory portion of every file in 'files' will be created if it doesn't already exist. 'mode', 'verbose' and diff --git a/Lib/lib2to3/fixes/fix_exitfunc.py b/Lib/lib2to3/fixes/fix_exitfunc.py --- a/Lib/lib2to3/fixes/fix_exitfunc.py +++ b/Lib/lib2to3/fixes/fix_exitfunc.py @@ -35,7 +35,7 @@ self.sys_import = None def transform(self, node, results): - # First, find a the sys import. We'll just hope it's global scope. + # First, find the sys import. We'll just hope it's global scope. if "sys_import" in results: if self.sys_import is None: self.sys_import = results["sys_import"] diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -646,7 +646,7 @@ class TestOptionalsRequired(ParserTestCase): - """Tests the an optional action that is required""" + """Tests an optional action that is required""" argument_signatures = [ Sig('-x', type=int, required=True), diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -475,6 +475,7 @@ Matt Giuca Wim Glenn Michael Goderbauer +Karan Goel Jeroen Van Goey Christoph Gohlke Tim Golden diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -5060,7 +5060,7 @@ *(void **)self->b_ptr = dst->b_ptr; /* - A Pointer instance must keep a the value it points to alive. So, a + A Pointer instance must keep the value it points to alive. So, a pointer instance has b_length set to 2 instead of 1, and we set 'value' itself as the second item of the b_objects list, additionally. */ -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 13 15:20:39 2015 From: python-checkins at python.org (benjamin.peterson) Date: Tue, 13 Jan 2015 14:20:39 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?b?KTogbWVyZ2UgMy40ICgjMjMyMjEp?= Message-ID: <20150113142039.125906.62520@psf.io> https://hg.python.org/cpython/rev/c16b76c7c8ba changeset: 94124:c16b76c7c8ba parent: 94121:6e7403bc906f parent: 94122:b168c41f2e3f user: Benjamin Peterson date: Tue Jan 13 09:20:31 2015 -0500 summary: merge 3.4 (#23221) files: Doc/c-api/exceptions.rst | 2 +- Doc/c-api/init.rst | 2 +- Doc/c-api/structures.rst | 2 +- Doc/distutils/apiref.rst | 2 +- Doc/library/unittest.mock.rst | 2 +- Include/dynamic_annotations.h | 2 +- Include/unicodeobject.h | 2 +- Lib/distutils/dir_util.py | 2 +- Lib/http/cookiejar.py | 2 +- Lib/lib2to3/fixes/fix_exitfunc.py | 2 +- Lib/socket.py | 2 +- Lib/test/test_argparse.py | 2 +- Misc/ACKS | 1 + Modules/_ctypes/_ctypes.c | 2 +- 14 files changed, 14 insertions(+), 13 deletions(-) diff --git a/Doc/c-api/exceptions.rst b/Doc/c-api/exceptions.rst --- a/Doc/c-api/exceptions.rst +++ b/Doc/c-api/exceptions.rst @@ -350,7 +350,7 @@ Do not compare the return value to a specific exception; use :c:func:`PyErr_ExceptionMatches` instead, shown below. (The comparison could easily fail since the exception may be an instance instead of a class, in the - case of a class exception, or it may the a subclass of the expected exception.) + case of a class exception, or it may be a subclass of the expected exception.) .. 
c:function:: int PyErr_ExceptionMatches(PyObject *exc) diff --git a/Doc/c-api/init.rst b/Doc/c-api/init.rst --- a/Doc/c-api/init.rst +++ b/Doc/c-api/init.rst @@ -1197,7 +1197,7 @@ .. c:function:: PyThreadState * PyInterpreterState_ThreadHead(PyInterpreterState *interp) - Return the a pointer to the first :c:type:`PyThreadState` object in the list of + Return the pointer to the first :c:type:`PyThreadState` object in the list of threads associated with the interpreter *interp*. diff --git a/Doc/c-api/structures.rst b/Doc/c-api/structures.rst --- a/Doc/c-api/structures.rst +++ b/Doc/c-api/structures.rst @@ -131,7 +131,7 @@ types, but they always return :c:type:`PyObject\*`. If the function is not of the :c:type:`PyCFunction`, the compiler will require a cast in the method table. Even though :c:type:`PyCFunction` defines the first parameter as -:c:type:`PyObject\*`, it is common that the method implementation uses a the +:c:type:`PyObject\*`, it is common that the method implementation uses the specific C type of the *self* object. The :attr:`ml_flags` field is a bitfield which can include the following flags. diff --git a/Doc/distutils/apiref.rst b/Doc/distutils/apiref.rst --- a/Doc/distutils/apiref.rst +++ b/Doc/distutils/apiref.rst @@ -964,7 +964,7 @@ .. function:: create_tree(base_dir, files[, mode=0o777, verbose=0, dry_run=0]) Create all the empty directories under *base_dir* needed to put *files* there. - *base_dir* is just the a name of a directory which doesn't necessarily exist + *base_dir* is just the name of a directory which doesn't necessarily exist yet; *files* is a list of filenames to be interpreted relative to *base_dir*. *base_dir* + the directory portion of every file in *files* will be created if it doesn't already exist. *mode*, *verbose* and *dry_run* flags are as for diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -1541,7 +1541,7 @@ However, consider the alternative scenario where instead of ``from a import SomeClass`` module b does ``import a`` and ``some_function`` uses ``a.SomeClass``. Both of these import forms are common. In this case the class we want to patch is -being looked up on the a module and so we have to patch ``a.SomeClass`` instead:: +being looked up in the module and so we have to patch ``a.SomeClass`` instead:: @patch('a.SomeClass') diff --git a/Include/dynamic_annotations.h b/Include/dynamic_annotations.h --- a/Include/dynamic_annotations.h +++ b/Include/dynamic_annotations.h @@ -150,7 +150,7 @@ /* Report that a new memory at "address" of size "size" has been allocated. This might be used when the memory has been retrieved from a free list and - is about to be reused, or when a the locking discipline for a variable + is about to be reused, or when the locking discipline for a variable changes. */ #define _Py_ANNOTATE_NEW_MEMORY(address, size) \ AnnotateNewMemory(__FILE__, __LINE__, address, size) diff --git a/Include/unicodeobject.h b/Include/unicodeobject.h --- a/Include/unicodeobject.h +++ b/Include/unicodeobject.h @@ -605,7 +605,7 @@ ); #endif -/* Initializes the canonical string representation from a the deprecated +/* Initializes the canonical string representation from the deprecated wstr/Py_UNICODE representation. This function is used to convert Unicode objects which were created using the old API to the new flexible format introduced with PEP 393. 
diff --git a/Lib/distutils/dir_util.py b/Lib/distutils/dir_util.py --- a/Lib/distutils/dir_util.py +++ b/Lib/distutils/dir_util.py @@ -81,7 +81,7 @@ """Create all the empty directories under 'base_dir' needed to put 'files' there. - 'base_dir' is just the a name of a directory which doesn't necessarily + 'base_dir' is just the name of a directory which doesn't necessarily exist yet; 'files' is a list of filenames to be interpreted relative to 'base_dir'. 'base_dir' + the directory portion of every file in 'files' will be created if it doesn't already exist. 'mode', 'verbose' and diff --git a/Lib/http/cookiejar.py b/Lib/http/cookiejar.py --- a/Lib/http/cookiejar.py +++ b/Lib/http/cookiejar.py @@ -1792,7 +1792,7 @@ def lwp_cookie_str(cookie): - """Return string representation of Cookie in an the LWP cookie file format. + """Return string representation of Cookie in the LWP cookie file format. Actually, the format is extended a bit -- see module docstring. diff --git a/Lib/lib2to3/fixes/fix_exitfunc.py b/Lib/lib2to3/fixes/fix_exitfunc.py --- a/Lib/lib2to3/fixes/fix_exitfunc.py +++ b/Lib/lib2to3/fixes/fix_exitfunc.py @@ -35,7 +35,7 @@ self.sys_import = None def transform(self, node, results): - # First, find a the sys import. We'll just hope it's global scope. + # First, find the sys import. We'll just hope it's global scope. if "sys_import" in results: if self.sys_import is None: self.sys_import = results["sys_import"] diff --git a/Lib/socket.py b/Lib/socket.py --- a/Lib/socket.py +++ b/Lib/socket.py @@ -450,7 +450,7 @@ def fromshare(info): """ fromshare(info) -> socket object - Create a socket object from a the bytes object returned by + Create a socket object from the bytes object returned by socket.share(pid). """ return socket(0, 0, 0, info) diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -635,7 +635,7 @@ class TestOptionalsRequired(ParserTestCase): - """Tests the an optional action that is required""" + """Tests an optional action that is required""" argument_signatures = [ Sig('-x', type=int, required=True), diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -488,6 +488,7 @@ Matt Giuca Wim Glenn Michael Goderbauer +Karan Goel Jeroen Van Goey Christoph Gohlke Tim Golden diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -4830,7 +4830,7 @@ *(void **)self->b_ptr = dst->b_ptr; /* - A Pointer instance must keep a the value it points to alive. So, a + A Pointer instance must keep the value it points to alive. So, a pointer instance has b_length set to 2 instead of 1, and we set 'value' itself as the second item of the b_objects list, additionally. */ -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 13 16:15:12 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 13 Jan 2015 15:15:12 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_Tulip_issue_18?= =?utf-8?q?4=3A_Fix_test=5Fpipe=28=29_on_Windows?= Message-ID: <20150113151507.8757.23143@psf.io> https://hg.python.org/cpython/rev/0387862a5675 changeset: 94126:0387862a5675 branch: 3.4 user: Victor Stinner date: Tue Jan 13 16:13:06 2015 +0100 summary: Tulip issue 184: Fix test_pipe() on Windows Pass explicitly the event loop to StreamReaderProtocol. 
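(The gist of the fix, sketched outside the test: when a test uses its own event loop rather than the global default, the loop has to be handed to both the reader and the protocol, otherwise StreamReaderProtocol falls back to get_event_loop(). The names below are illustrative; in the test the loop is a ProactorEventLoop:

    import asyncio

    loop = asyncio.new_event_loop()                 # stands in for the test's ProactorEventLoop
    reader = asyncio.StreamReader(loop=loop)
    # Passing loop= explicitly keeps the protocol bound to this loop,
    # not to whatever the global default loop happens to be.
    protocol = asyncio.StreamReaderProtocol(reader, loop=loop)
    loop.close()
)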
files: Lib/test/test_asyncio/test_windows_events.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_asyncio/test_windows_events.py b/Lib/test/test_asyncio/test_windows_events.py --- a/Lib/test/test_asyncio/test_windows_events.py +++ b/Lib/test/test_asyncio/test_windows_events.py @@ -67,7 +67,8 @@ clients = [] for i in range(5): stream_reader = asyncio.StreamReader(loop=self.loop) - protocol = asyncio.StreamReaderProtocol(stream_reader) + protocol = asyncio.StreamReaderProtocol(stream_reader, + loop=self.loop) trans, proto = yield from self.loop.create_pipe_connection( lambda: protocol, ADDRESS) self.assertIsInstance(trans, asyncio.Transport) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 13 16:15:12 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 13 Jan 2015 15:15:12 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150113151507.11589.71651@psf.io> https://hg.python.org/cpython/rev/56f717235c45 changeset: 94127:56f717235c45 parent: 94124:c16b76c7c8ba parent: 94126:0387862a5675 user: Victor Stinner date: Tue Jan 13 16:13:36 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/proactor_events.py | 8 +++++++- Lib/test/test_asyncio/test_windows_events.py | 3 ++- 2 files changed, 9 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -387,13 +387,19 @@ raise RuntimeError("Cannot close a running event loop") if self.is_closed(): return + + # Call these methods before closing the event loop (before calling + # BaseEventLoop.close), because they can schedule callbacks with + # call_soon(), which is forbidden when the event loop is closed. self._stop_accept_futures() self._close_self_pipe() - super().close() self._proactor.close() self._proactor = None self._selector = None + # Close the event loop + super().close() + def sock_recv(self, sock, n): return self._proactor.recv(sock, n) diff --git a/Lib/test/test_asyncio/test_windows_events.py b/Lib/test/test_asyncio/test_windows_events.py --- a/Lib/test/test_asyncio/test_windows_events.py +++ b/Lib/test/test_asyncio/test_windows_events.py @@ -67,7 +67,8 @@ clients = [] for i in range(5): stream_reader = asyncio.StreamReader(loop=self.loop) - protocol = asyncio.StreamReaderProtocol(stream_reader) + protocol = asyncio.StreamReaderProtocol(stream_reader, + loop=self.loop) trans, proto = yield from self.loop.create_pipe_connection( lambda: protocol, ADDRESS) self.assertIsInstance(trans, asyncio.Transport) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Tue Jan 13 16:15:12 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 13 Jan 2015 15:15:12 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIyOTIy?= =?utf-8?q?=3A_Fix_ProactorEventLoop=2Eclose=28=29?= Message-ID: <20150113151506.8757.25307@psf.io> https://hg.python.org/cpython/rev/6c473f82309d changeset: 94125:6c473f82309d branch: 3.4 parent: 94122:b168c41f2e3f user: Victor Stinner date: Tue Jan 13 16:11:19 2015 +0100 summary: Issue #22922: Fix ProactorEventLoop.close() Close the IocpProactor before closing the event loop. IocpProactor.close() can call loop.call_soon(), which is forbidden when the event loop is closed. 
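(A hedged illustration of why the ordering matters, not taken from the patch: once BaseEventLoop.close() has run, call_soon() refuses to schedule callbacks, so the proactor must be shut down before the base close() is invoked:

    import asyncio

    loop = asyncio.new_event_loop()      # stands in for ProactorEventLoop, which is Windows-only
    loop.close()
    try:
        loop.call_soon(print, 'too late')
    except RuntimeError:
        pass    # "Event loop is closed" -- the error IocpProactor.close() would otherwise trigger
)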
files: Lib/asyncio/proactor_events.py | 8 +++++++- 1 files changed, 7 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -387,13 +387,19 @@ raise RuntimeError("Cannot close a running event loop") if self.is_closed(): return + + # Call these methods before closing the event loop (before calling + # BaseEventLoop.close), because they can schedule callbacks with + # call_soon(), which is forbidden when the event loop is closed. self._stop_accept_futures() self._close_self_pipe() - super().close() self._proactor.close() self._proactor = None self._selector = None + # Close the event loop + super().close() + def sock_recv(self, sock, n): return self._proactor.recv(sock, n) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 00:21:43 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 13 Jan 2015 23:21:43 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=3A_new_SSL_implementation=29?= Message-ID: <20150113232143.11587.50003@psf.io> https://hg.python.org/cpython/rev/dfe532dd0138 changeset: 94129:dfe532dd0138 parent: 94127:56f717235c45 parent: 94128:432b817611f2 user: Victor Stinner date: Wed Jan 14 00:19:55 2015 +0100 summary: Merge 3.4 (asyncio: new SSL implementation) files: Lib/asyncio/proactor_events.py | 31 +- Lib/asyncio/selector_events.py | 45 +- Lib/asyncio/sslproto.py | 640 ++++++++++ Lib/asyncio/test_utils.py | 5 + Lib/test/test_asyncio/test_events.py | 66 +- Lib/test/test_asyncio/test_selector_events.py | 6 +- 6 files changed, 751 insertions(+), 42 deletions(-) diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -11,6 +11,7 @@ from . import base_events from . import constants from . import futures +from . import sslproto from . 
import transports from .log import logger @@ -367,6 +368,20 @@ return _ProactorSocketTransport(self, sock, protocol, waiter, extra, server) + def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, + *, server_side=False, server_hostname=None, + extra=None, server=None): + if not sslproto._is_sslproto_available(): + raise NotImplementedError("Proactor event loop requires Python 3.5" + " or newer (ssl.MemoryBIO) to support " + "SSL") + + ssl_protocol = sslproto.SSLProtocol(self, protocol, sslcontext, waiter, + server_side, server_hostname) + _ProactorSocketTransport(self, rawsock, ssl_protocol, + extra=extra, server=server) + return ssl_protocol._app_transport + def _make_duplex_pipe_transport(self, sock, protocol, waiter=None, extra=None): return _ProactorDuplexPipeTransport(self, @@ -455,9 +470,8 @@ def _write_to_self(self): self._csock.send(b'\0') - def _start_serving(self, protocol_factory, sock, ssl=None, server=None): - if ssl: - raise ValueError('IocpEventLoop is incompatible with SSL.') + def _start_serving(self, protocol_factory, sock, + sslcontext=None, server=None): def loop(f=None): try: @@ -467,9 +481,14 @@ logger.debug("%r got a new connection from %r: %r", server, addr, conn) protocol = protocol_factory() - self._make_socket_transport( - conn, protocol, - extra={'peername': addr}, server=server) + if sslcontext is not None: + self._make_ssl_transport( + conn, protocol, sslcontext, server_side=True, + extra={'peername': addr}, server=server) + else: + self._make_socket_transport( + conn, protocol, + extra={'peername': addr}, server=server) if self.is_closed(): return f = self._proactor.accept(sock) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -10,6 +10,7 @@ import errno import functools import socket +import sys try: import ssl except ImportError: # pragma: no cover @@ -21,6 +22,7 @@ from . import futures from . import selectors from . import transports +from . import sslproto from .log import logger @@ -58,6 +60,24 @@ def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, server_side=False, server_hostname=None, extra=None, server=None): + if not sslproto._is_sslproto_available(): + return self._make_legacy_ssl_transport( + rawsock, protocol, sslcontext, waiter, + server_side=server_side, server_hostname=server_hostname, + extra=extra, server=server) + + ssl_protocol = sslproto.SSLProtocol(self, protocol, sslcontext, waiter, + server_side, server_hostname) + _SelectorSocketTransport(self, rawsock, ssl_protocol, + extra=extra, server=server) + return ssl_protocol._app_transport + + def _make_legacy_ssl_transport(self, rawsock, protocol, sslcontext, + waiter, *, + server_side=False, server_hostname=None, + extra=None, server=None): + # Use the legacy API: SSL_write, SSL_read, etc. The legacy API is used + # on Python 3.4 and older, when ssl.MemoryBIO is not available. return _SelectorSslTransport( self, rawsock, protocol, sslcontext, waiter, server_side, server_hostname, extra, server) @@ -508,7 +528,8 @@ def _fatal_error(self, exc, message='Fatal error on transport'): # Should be called from exception handler only. 
- if isinstance(exc, (BrokenPipeError, ConnectionResetError)): + if isinstance(exc, (BrokenPipeError, + ConnectionResetError, ConnectionAbortedError)): if self._loop.get_debug(): logger.debug("%r: %s", self, message, exc_info=True) else: @@ -683,26 +704,8 @@ if ssl is None: raise RuntimeError('stdlib ssl module not available') - if server_side: - if not sslcontext: - raise ValueError('Server side ssl needs a valid SSLContext') - else: - if not sslcontext: - # Client side may pass ssl=True to use a default - # context; in that case the sslcontext passed is None. - # The default is secure for client connections. - if hasattr(ssl, 'create_default_context'): - # Python 3.4+: use up-to-date strong settings. - sslcontext = ssl.create_default_context() - if not server_hostname: - sslcontext.check_hostname = False - else: - # Fallback for Python 3.3. - sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - sslcontext.options |= ssl.OP_NO_SSLv2 - sslcontext.options |= ssl.OP_NO_SSLv3 - sslcontext.set_default_verify_paths() - sslcontext.verify_mode = ssl.CERT_REQUIRED + if not sslcontext: + sslcontext = sslproto._create_transport_context(server_side, server_hostname) wrap_kwargs = { 'server_side': server_side, diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py new file mode 100644 --- /dev/null +++ b/Lib/asyncio/sslproto.py @@ -0,0 +1,640 @@ +import collections +try: + import ssl +except ImportError: # pragma: no cover + ssl = None + +from . import protocols +from . import transports +from .log import logger + + +def _create_transport_context(server_side, server_hostname): + if server_side: + raise ValueError('Server side SSL needs a valid SSLContext') + + # Client side may pass ssl=True to use a default + # context; in that case the sslcontext passed is None. + # The default is secure for client connections. + if hasattr(ssl, 'create_default_context'): + # Python 3.4+: use up-to-date strong settings. + sslcontext = ssl.create_default_context() + if not server_hostname: + sslcontext.check_hostname = False + else: + # Fallback for Python 3.3. + sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + sslcontext.options |= ssl.OP_NO_SSLv2 + sslcontext.options |= ssl.OP_NO_SSLv3 + sslcontext.set_default_verify_paths() + sslcontext.verify_mode = ssl.CERT_REQUIRED + return sslcontext + + +def _is_sslproto_available(): + return hasattr(ssl, "MemoryBIO") + + +# States of an _SSLPipe. +_UNWRAPPED = "UNWRAPPED" +_DO_HANDSHAKE = "DO_HANDSHAKE" +_WRAPPED = "WRAPPED" +_SHUTDOWN = "SHUTDOWN" + + +class _SSLPipe(object): + """An SSL "Pipe". + + An SSL pipe allows you to communicate with an SSL/TLS protocol instance + through memory buffers. It can be used to implement a security layer for an + existing connection where you don't have access to the connection's file + descriptor, or for some reason you don't want to use it. + + An SSL pipe can be in "wrapped" and "unwrapped" mode. In unwrapped mode, + data is passed through untransformed. In wrapped mode, application level + data is encrypted to SSL record level data and vice versa. The SSL record + level is the lowest level in the SSL protocol suite and is what travels + as-is over the wire. + + An SslPipe initially is in "unwrapped" mode. To start SSL, call + do_handshake(). To shutdown SSL again, call unwrap(). + """ + + max_size = 256 * 1024 # Buffer size passed to read() + + def __init__(self, context, server_side, server_hostname=None): + """ + The *context* argument specifies the ssl.SSLContext to use. 
+ + The *server_side* argument indicates whether this is a server side or + client side transport. + + The optional *server_hostname* argument can be used to specify the + hostname you are connecting to. You may only specify this parameter if + the _ssl module supports Server Name Indication (SNI). + """ + self._context = context + self._server_side = server_side + self._server_hostname = server_hostname + self._state = _UNWRAPPED + self._incoming = ssl.MemoryBIO() + self._outgoing = ssl.MemoryBIO() + self._sslobj = None + self._need_ssldata = False + self._handshake_cb = None + self._shutdown_cb = None + + @property + def context(self): + """The SSL context passed to the constructor.""" + return self._context + + @property + def ssl_object(self): + """The internal ssl.SSLObject instance. + + Return None if the pipe is not wrapped. + """ + return self._sslobj + + @property + def need_ssldata(self): + """Whether more record level data is needed to complete a handshake + that is currently in progress.""" + return self._need_ssldata + + @property + def wrapped(self): + """ + Whether a security layer is currently in effect. + + Return False during handshake. + """ + return self._state == _WRAPPED + + def do_handshake(self, callback=None): + """Start the SSL handshake. + + Return a list of ssldata. A ssldata element is a list of buffers + + The optional *callback* argument can be used to install a callback that + will be called when the handshake is complete. The callback will be + called with None if successful, else an exception instance. + """ + if self._state != _UNWRAPPED: + raise RuntimeError('handshake in progress or completed') + self._sslobj = self._context.wrap_bio( + self._incoming, self._outgoing, + server_side=self._server_side, + server_hostname=self._server_hostname) + self._state = _DO_HANDSHAKE + self._handshake_cb = callback + ssldata, appdata = self.feed_ssldata(b'', only_handshake=True) + assert len(appdata) == 0 + return ssldata + + def shutdown(self, callback=None): + """Start the SSL shutdown sequence. + + Return a list of ssldata. A ssldata element is a list of buffers + + The optional *callback* argument can be used to install a callback that + will be called when the shutdown is complete. The callback will be + called without arguments. + """ + if self._state == _UNWRAPPED: + raise RuntimeError('no security layer present') + if self._state == _SHUTDOWN: + raise RuntimeError('shutdown in progress') + assert self._state in (_WRAPPED, _DO_HANDSHAKE) + self._state = _SHUTDOWN + self._shutdown_cb = callback + ssldata, appdata = self.feed_ssldata(b'') + assert appdata == [] or appdata == [b''] + return ssldata + + def feed_eof(self): + """Send a potentially "ragged" EOF. + + This method will raise an SSL_ERROR_EOF exception if the EOF is + unexpected. + """ + self._incoming.write_eof() + ssldata, appdata = self.feed_ssldata(b'') + assert appdata == [] or appdata == [b''] + + def feed_ssldata(self, data, only_handshake=False): + """Feed SSL record level data into the pipe. + + The data must be a bytes instance. It is OK to send an empty bytes + instance. This can be used to get ssldata for a handshake initiated by + this endpoint. + + Return a (ssldata, appdata) tuple. The ssldata element is a list of + buffers containing SSL data that needs to be sent to the remote SSL. + + The appdata element is a list of buffers containing plaintext data that + needs to be forwarded to the application. The appdata list may contain + an empty buffer indicating an SSL "close_notify" alert. 
This alert must + be acknowledged by calling shutdown(). + """ + if self._state == _UNWRAPPED: + # If unwrapped, pass plaintext data straight through. + if data: + appdata = [data] + else: + appdata = [] + return ([], appdata) + + self._need_ssldata = False + if data: + self._incoming.write(data) + + ssldata = [] + appdata = [] + try: + if self._state == _DO_HANDSHAKE: + # Call do_handshake() until it doesn't raise anymore. + self._sslobj.do_handshake() + self._state = _WRAPPED + if self._handshake_cb: + self._handshake_cb(None) + if only_handshake: + return (ssldata, appdata) + # Handshake done: execute the wrapped block + + if self._state == _WRAPPED: + # Main state: read data from SSL until close_notify + while True: + chunk = self._sslobj.read(self.max_size) + appdata.append(chunk) + if not chunk: # close_notify + break + + elif self._state == _SHUTDOWN: + # Call shutdown() until it doesn't raise anymore. + self._sslobj.unwrap() + self._sslobj = None + self._state = _UNWRAPPED + if self._shutdown_cb: + self._shutdown_cb() + + elif self._state == _UNWRAPPED: + # Drain possible plaintext data after close_notify. + appdata.append(self._incoming.read()) + except (ssl.SSLError, ssl.CertificateError) as exc: + if getattr(exc, 'errno', None) not in ( + ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE, + ssl.SSL_ERROR_SYSCALL): + if self._state == _DO_HANDSHAKE and self._handshake_cb: + self._handshake_cb(exc) + raise + self._need_ssldata = (exc.errno == ssl.SSL_ERROR_WANT_READ) + + # Check for record level data that needs to be sent back. + # Happens for the initial handshake and renegotiations. + if self._outgoing.pending: + ssldata.append(self._outgoing.read()) + return (ssldata, appdata) + + def feed_appdata(self, data, offset=0): + """Feed plaintext data into the pipe. + + Return an (ssldata, offset) tuple. The ssldata element is a list of + buffers containing record level data that needs to be sent to the + remote SSL instance. The offset is the number of plaintext bytes that + were processed, which may be less than the length of data. + + NOTE: In case of short writes, this call MUST be retried with the SAME + buffer passed into the *data* argument (i.e. the id() must be the + same). This is an OpenSSL requirement. A further particularity is that + a short write will always have offset == 0, because the _ssl module + does not enable partial writes. And even though the offset is zero, + there will still be encrypted data in ssldata. + """ + assert 0 <= offset <= len(data) + if self._state == _UNWRAPPED: + # pass through data in unwrapped mode + if offset < len(data): + ssldata = [data[offset:]] + else: + ssldata = [] + return (ssldata, len(data)) + + ssldata = [] + view = memoryview(data) + while True: + self._need_ssldata = False + try: + if offset < len(view): + offset += self._sslobj.write(view[offset:]) + except ssl.SSLError as exc: + # It is not allowed to call write() after unwrap() until the + # close_notify is acknowledged. We return the condition to the + # caller as a short write. + if exc.reason == 'PROTOCOL_IS_SHUTDOWN': + exc.errno = ssl.SSL_ERROR_WANT_READ + if exc.errno not in (ssl.SSL_ERROR_WANT_READ, + ssl.SSL_ERROR_WANT_WRITE, + ssl.SSL_ERROR_SYSCALL): + raise + self._need_ssldata = (exc.errno == ssl.SSL_ERROR_WANT_READ) + + # See if there's any record level data back for us. 
+ if self._outgoing.pending: + ssldata.append(self._outgoing.read()) + if offset == len(view) or self._need_ssldata: + break + return (ssldata, offset) + + +class _SSLProtocolTransport(transports._FlowControlMixin, + transports.Transport): + + def __init__(self, loop, ssl_protocol, app_protocol): + self._loop = loop + self._ssl_protocol = ssl_protocol + self._app_protocol = app_protocol + + def get_extra_info(self, name, default=None): + """Get optional transport information.""" + return self._ssl_protocol._get_extra_info(name, default) + + def close(self): + """Close the transport. + + Buffered data will be flushed asynchronously. No more data + will be received. After all buffered data is flushed, the + protocol's connection_lost() method will (eventually) called + with None as its argument. + """ + self._ssl_protocol._start_shutdown() + + def pause_reading(self): + """Pause the receiving end. + + No data will be passed to the protocol's data_received() + method until resume_reading() is called. + """ + self._ssl_protocol._transport.pause_reading() + + def resume_reading(self): + """Resume the receiving end. + + Data received will once again be passed to the protocol's + data_received() method. + """ + self._ssl_protocol._transport.resume_reading() + + def set_write_buffer_limits(self, high=None, low=None): + """Set the high- and low-water limits for write flow control. + + These two values control when to call the protocol's + pause_writing() and resume_writing() methods. If specified, + the low-water limit must be less than or equal to the + high-water limit. Neither value can be negative. + + The defaults are implementation-specific. If only the + high-water limit is given, the low-water limit defaults to a + implementation-specific value less than or equal to the + high-water limit. Setting high to zero forces low to zero as + well, and causes pause_writing() to be called whenever the + buffer becomes non-empty. Setting low to zero causes + resume_writing() to be called only once the buffer is empty. + Use of zero for either limit is generally sub-optimal as it + reduces opportunities for doing I/O and computation + concurrently. + """ + self._ssl_protocol._transport.set_write_buffer_limits(high, low) + + def get_write_buffer_size(self): + """Return the current size of the write buffer.""" + return self._ssl_protocol._transport.get_write_buffer_size() + + def write(self, data): + """Write some data bytes to the transport. + + This does not block; it buffers the data and arranges for it + to be sent out asynchronously. + """ + if not isinstance(data, (bytes, bytearray, memoryview)): + raise TypeError("data: expecting a bytes-like instance, got {!r}" + .format(type(data).__name__)) + if not data: + return + self._ssl_protocol._write_appdata(data) + + def can_write_eof(self): + """Return True if this transport supports write_eof(), False if not.""" + return False + + def abort(self): + """Close the transport immediately. + + Buffered data will be lost. No more data will be received. + The protocol's connection_lost() method will (eventually) be + called with None as its argument. + """ + self._ssl_protocol._abort() + + +class SSLProtocol(protocols.Protocol): + """SSL protocol. + + Implementation of SSL on top of a socket using incoming and outgoing + buffers which are ssl.MemoryBIO objects. 
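An application never touches SSLProtocol or the memory BIOs directly: it is handed the _SSLProtocolTransport defined above, while SSLProtocol shuttles the encrypted records to and from the real socket transport. From the caller's side this stays behind the usual API; a minimal sketch (the hostname is a placeholder, a working network connection is assumed, and ssl=True lets the loop build a default client context via _create_transport_context()):

    import asyncio

    loop = asyncio.new_event_loop()
    coro = loop.create_connection(asyncio.Protocol, 'example.org', 443, ssl=True)
    transport, protocol = loop.run_until_complete(coro)
    # 'transport' is the SSL-aware application transport; the extra info is
    # filled in once the handshake has completed.
    print(transport.get_extra_info('cipher'))
    print(transport.get_extra_info('peercert'))
    transport.close()
    loop.close()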
+ """ + + def __init__(self, loop, app_protocol, sslcontext, waiter, + server_side=False, server_hostname=None): + if ssl is None: + raise RuntimeError('stdlib ssl module not available') + + if not sslcontext: + sslcontext = _create_transport_context(server_side, server_hostname) + + self._server_side = server_side + if server_hostname and not server_side: + self._server_hostname = server_hostname + else: + self._server_hostname = None + self._sslcontext = sslcontext + # SSL-specific extra info. More info are set when the handshake + # completes. + self._extra = dict(sslcontext=sslcontext) + + # App data write buffering + self._write_backlog = collections.deque() + self._write_buffer_size = 0 + + self._waiter = waiter + self._closing = False + self._loop = loop + self._app_protocol = app_protocol + self._app_transport = _SSLProtocolTransport(self._loop, + self, self._app_protocol) + self._sslpipe = None + self._session_established = False + self._in_handshake = False + self._in_shutdown = False + + def connection_made(self, transport): + """Called when the low-level connection is made. + + Start the SSL handshake. + """ + self._transport = transport + self._sslpipe = _SSLPipe(self._sslcontext, + self._server_side, + self._server_hostname) + self._start_handshake() + + def connection_lost(self, exc): + """Called when the low-level connection is lost or closed. + + The argument is an exception object or None (the latter + meaning a regular EOF is received or the connection was + aborted or closed). + """ + if self._session_established: + self._session_established = False + self._loop.call_soon(self._app_protocol.connection_lost, exc) + self._transport = None + self._app_transport = None + + def pause_writing(self): + """Called when the low-level transport's buffer goes over + the high-water mark. + """ + self._app_protocol.pause_writing() + + def resume_writing(self): + """Called when the low-level transport's buffer drains below + the low-water mark. + """ + self._app_protocol.resume_writing() + + def data_received(self, data): + """Called when some SSL data is received. + + The argument is a bytes object. + """ + try: + ssldata, appdata = self._sslpipe.feed_ssldata(data) + except ssl.SSLError as e: + if self._loop.get_debug(): + logger.warning('%r: SSL error %s (reason %s)', + self, e.errno, e.reason) + self._abort() + return + + for chunk in ssldata: + self._transport.write(chunk) + + for chunk in appdata: + if chunk: + self._app_protocol.data_received(chunk) + else: + self._start_shutdown() + break + + def eof_received(self): + """Called when the other end of the low-level stream + is half-closed. + + If this returns a false value (including None), the transport + will close itself. If it returns a true value, closing the + transport is up to the protocol. 
+ """ + try: + if self._loop.get_debug(): + logger.debug("%r received EOF", self) + if not self._in_handshake: + keep_open = self._app_protocol.eof_received() + if keep_open: + logger.warning('returning true from eof_received() ' + 'has no effect when using ssl') + finally: + self._transport.close() + + def _get_extra_info(self, name, default=None): + if name in self._extra: + return self._extra[name] + else: + return self._transport.get_extra_info(name, default) + + def _start_shutdown(self): + if self._in_shutdown: + return + self._in_shutdown = True + self._write_appdata(b'') + + def _write_appdata(self, data): + self._write_backlog.append((data, 0)) + self._write_buffer_size += len(data) + self._process_write_backlog() + + def _start_handshake(self): + if self._loop.get_debug(): + logger.debug("%r starts SSL handshake", self) + self._handshake_start_time = self._loop.time() + else: + self._handshake_start_time = None + self._in_handshake = True + # (b'', 1) is a special value in _process_write_backlog() to do + # the SSL handshake + self._write_backlog.append((b'', 1)) + self._loop.call_soon(self._process_write_backlog) + + def _on_handshake_complete(self, handshake_exc): + self._in_handshake = False + + sslobj = self._sslpipe.ssl_object + peercert = None if handshake_exc else sslobj.getpeercert() + try: + if handshake_exc is not None: + raise handshake_exc + if not hasattr(self._sslcontext, 'check_hostname'): + # Verify hostname if requested, Python 3.4+ uses check_hostname + # and checks the hostname in do_handshake() + if (self._server_hostname + and self._sslcontext.verify_mode != ssl.CERT_NONE): + ssl.match_hostname(peercert, self._server_hostname) + except BaseException as exc: + if self._loop.get_debug(): + if isinstance(exc, ssl.CertificateError): + logger.warning("%r: SSL handshake failed " + "on verifying the certificate", + self, exc_info=True) + else: + logger.warning("%r: SSL handshake failed", + self, exc_info=True) + self._transport.close() + if isinstance(exc, Exception): + if self._waiter is not None: + self._waiter.set_exception(exc) + return + else: + raise + + if self._loop.get_debug(): + dt = self._loop.time() - self._handshake_start_time + logger.debug("%r: SSL handshake took %.1f ms", self, dt * 1e3) + + # Add extra info that becomes available after handshake. + self._extra.update(peercert=peercert, + cipher=sslobj.cipher(), + compression=sslobj.compression(), + ) + self._app_protocol.connection_made(self._app_transport) + if self._waiter is not None: + # wait until protocol.connection_made() has been called + self._waiter._set_result_unless_cancelled(None) + self._session_established = True + # In case transport.write() was already called + self._process_write_backlog() + + def _process_write_backlog(self): + # Try to make progress on the write backlog. + if self._transport is None: + return + + try: + for i in range(len(self._write_backlog)): + data, offset = self._write_backlog[0] + if data: + ssldata, offset = self._sslpipe.feed_appdata(data, offset) + elif offset: + ssldata = self._sslpipe.do_handshake(self._on_handshake_complete) + offset = 1 + else: + ssldata = self._sslpipe.shutdown(self._finalize) + offset = 1 + + for chunk in ssldata: + self._transport.write(chunk) + + if offset < len(data): + self._write_backlog[0] = (data, offset) + # A short write means that a write is blocked on a read + # We need to enable reading if it is paused! 
+ assert self._sslpipe.need_ssldata + if self._transport._paused: + self._transport.resume_reading() + break + + # An entire chunk from the backlog was processed. We can + # delete it and reduce the outstanding buffer size. + del self._write_backlog[0] + self._write_buffer_size -= len(data) + except BaseException as exc: + if self._in_handshake: + self._on_handshake_complete(exc) + else: + self._fatal_error(exc, 'Fatal error on SSL transport') + + def _fatal_error(self, exc, message='Fatal error on transport'): + # Should be called from exception handler only. + if isinstance(exc, (BrokenPipeError, ConnectionResetError)): + if self._loop.get_debug(): + logger.debug("%r: %s", self, message, exc_info=True) + else: + self._loop.call_exception_handler({ + 'message': message, + 'exception': exc, + 'transport': self._transport, + 'protocol': self, + }) + if self._transport: + self._transport._force_close(exc) + + def _finalize(self): + if self._transport is not None: + self._transport.close() + + def _abort(self): + if self._transport is not None: + try: + self._transport.abort() + finally: + self._finalize() diff --git a/Lib/asyncio/test_utils.py b/Lib/asyncio/test_utils.py --- a/Lib/asyncio/test_utils.py +++ b/Lib/asyncio/test_utils.py @@ -434,3 +434,8 @@ sock = mock.Mock(socket.socket) sock.gettimeout.return_value = 0.0 return sock + + +def force_legacy_ssl_support(): + return mock.patch('asyncio.sslproto._is_sslproto_available', + return_value=False) diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -650,6 +650,10 @@ *httpd.address) self._test_create_ssl_connection(httpd, create_connection) + def test_legacy_create_ssl_connection(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_ssl_connection() + @unittest.skipIf(ssl is None, 'No ssl module') @unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'No UNIX Sockets') def test_create_ssl_unix_connection(self): @@ -666,6 +670,10 @@ self._test_create_ssl_connection(httpd, create_connection, check_sockname) + def test_legacy_create_ssl_unix_connection(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_ssl_unix_connection() + def test_create_connection_local_addr(self): with test_utils.run_test_server() as httpd: port = support.find_unused_port() @@ -826,6 +834,10 @@ # stop serving server.close() + def test_legacy_create_server_ssl(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_server_ssl() + @unittest.skipIf(ssl is None, 'No ssl module') @unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'No UNIX Sockets') def test_create_unix_server_ssl(self): @@ -857,6 +869,10 @@ # stop serving server.close() + def test_legacy_create_unix_server_ssl(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_unix_server_ssl() + @unittest.skipIf(ssl is None, 'No ssl module') def test_create_server_ssl_verify_failed(self): proto = MyProto(loop=self.loop) @@ -881,6 +897,10 @@ self.assertIsNone(proto.transport) server.close() + def test_legacy_create_server_ssl_verify_failed(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_server_ssl_verify_failed() + @unittest.skipIf(ssl is None, 'No ssl module') @unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'No UNIX Sockets') def test_create_unix_server_ssl_verify_failed(self): @@ -907,6 +927,10 @@ self.assertIsNone(proto.transport) server.close() + def test_legacy_create_unix_server_ssl_verify_failed(self): 
+ with test_utils.force_legacy_ssl_support(): + self.test_create_unix_server_ssl_verify_failed() + @unittest.skipIf(ssl is None, 'No ssl module') def test_create_server_ssl_match_failed(self): proto = MyProto(loop=self.loop) @@ -934,6 +958,10 @@ proto.transport.close() server.close() + def test_legacy_create_server_ssl_match_failed(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_server_ssl_match_failed() + @unittest.skipIf(ssl is None, 'No ssl module') @unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'No UNIX Sockets') def test_create_unix_server_ssl_verified(self): @@ -958,6 +986,11 @@ proto.transport.close() client.close() server.close() + self.loop.run_until_complete(proto.done) + + def test_legacy_create_unix_server_ssl_verified(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_unix_server_ssl_verified() @unittest.skipIf(ssl is None, 'No ssl module') def test_create_server_ssl_verified(self): @@ -982,6 +1015,11 @@ proto.transport.close() client.close() server.close() + self.loop.run_until_complete(proto.done) + + def test_legacy_create_server_ssl_verified(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_server_ssl_verified() def test_create_server_sock(self): proto = asyncio.Future(loop=self.loop) @@ -1746,20 +1784,20 @@ def create_event_loop(self): return asyncio.ProactorEventLoop() - def test_create_ssl_connection(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") - - def test_create_server_ssl(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") - - def test_create_server_ssl_verify_failed(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") - - def test_create_server_ssl_match_failed(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") - - def test_create_server_ssl_verified(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") + def test_legacy_create_ssl_connection(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") + + def test_legacy_create_server_ssl(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") + + def test_legacy_create_server_ssl_verify_failed(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") + + def test_legacy_create_server_ssl_match_failed(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") + + def test_legacy_create_server_ssl_verified(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") def test_reader_callback(self): raise unittest.SkipTest("IocpEventLoop does not have add_reader()") diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -59,9 +59,13 @@ with test_utils.disable_logger(): transport = self.loop._make_ssl_transport( m, asyncio.Protocol(), m, waiter) - self.assertIsInstance(transport, _SelectorSslTransport) + # Sanity check + class_name = transport.__class__.__name__ + self.assertIn("ssl", class_name.lower()) + self.assertIn("transport", class_name.lower()) @mock.patch('asyncio.selector_events.ssl', None) + @mock.patch('asyncio.sslproto.ssl', None) def test_make_ssl_transport_without_ssl_error(self): m = mock.Mock() self.loop.add_reader = mock.Mock() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 00:21:43 2015 From: python-checkins at python.org 
(victor.stinner) Date: Tue, 13 Jan 2015 23:21:43 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIyNTYw?= =?utf-8?q?=3A_New_SSL_implementation_based_on_ssl=2EMemoryBIO?= Message-ID: <20150113232142.125888.89020@psf.io> https://hg.python.org/cpython/rev/432b817611f2 changeset: 94128:432b817611f2 branch: 3.4 parent: 94126:0387862a5675 user: Victor Stinner date: Wed Jan 14 00:19:09 2015 +0100 summary: Issue #22560: New SSL implementation based on ssl.MemoryBIO The new SSL implementation is based on the new ssl.MemoryBIO which is only available on Python 3.5. On Python 3.4 and older, the legacy SSL implementation (using SSL_write, SSL_read, etc.) is used. The proactor event loop only supports the new implementation. The new asyncio.sslproto module adds _SSLPipe, SSLProtocol and _SSLProtocolTransport classes. _SSLPipe allows to "wrap" or "unwrap" a socket (switch between cleartext and SSL/TLS). Patch written by Antoine Pitrou. sslproto.py is based on gruvi/ssl.py of the gruvi project written by Geert Jansen. This change adds SSL support to ProactorEventLoop on Python 3.5 and newer! It becomes also possible to implement STARTTTLS: switch a cleartext socket to SSL. files: Lib/asyncio/proactor_events.py | 31 +- Lib/asyncio/selector_events.py | 45 +- Lib/asyncio/sslproto.py | 640 ++++++++++ Lib/asyncio/test_utils.py | 5 + Lib/test/test_asyncio/test_events.py | 66 +- Lib/test/test_asyncio/test_selector_events.py | 6 +- 6 files changed, 751 insertions(+), 42 deletions(-) diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -11,6 +11,7 @@ from . import base_events from . import constants from . import futures +from . import sslproto from . 
import transports from .log import logger @@ -367,6 +368,20 @@ return _ProactorSocketTransport(self, sock, protocol, waiter, extra, server) + def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, + *, server_side=False, server_hostname=None, + extra=None, server=None): + if not sslproto._is_sslproto_available(): + raise NotImplementedError("Proactor event loop requires Python 3.5" + " or newer (ssl.MemoryBIO) to support " + "SSL") + + ssl_protocol = sslproto.SSLProtocol(self, protocol, sslcontext, waiter, + server_side, server_hostname) + _ProactorSocketTransport(self, rawsock, ssl_protocol, + extra=extra, server=server) + return ssl_protocol._app_transport + def _make_duplex_pipe_transport(self, sock, protocol, waiter=None, extra=None): return _ProactorDuplexPipeTransport(self, @@ -455,9 +470,8 @@ def _write_to_self(self): self._csock.send(b'\0') - def _start_serving(self, protocol_factory, sock, ssl=None, server=None): - if ssl: - raise ValueError('IocpEventLoop is incompatible with SSL.') + def _start_serving(self, protocol_factory, sock, + sslcontext=None, server=None): def loop(f=None): try: @@ -467,9 +481,14 @@ logger.debug("%r got a new connection from %r: %r", server, addr, conn) protocol = protocol_factory() - self._make_socket_transport( - conn, protocol, - extra={'peername': addr}, server=server) + if sslcontext is not None: + self._make_ssl_transport( + conn, protocol, sslcontext, server_side=True, + extra={'peername': addr}, server=server) + else: + self._make_socket_transport( + conn, protocol, + extra={'peername': addr}, server=server) if self.is_closed(): return f = self._proactor.accept(sock) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -10,6 +10,7 @@ import errno import functools import socket +import sys try: import ssl except ImportError: # pragma: no cover @@ -21,6 +22,7 @@ from . import futures from . import selectors from . import transports +from . import sslproto from .log import logger @@ -58,6 +60,24 @@ def _make_ssl_transport(self, rawsock, protocol, sslcontext, waiter=None, *, server_side=False, server_hostname=None, extra=None, server=None): + if not sslproto._is_sslproto_available(): + return self._make_legacy_ssl_transport( + rawsock, protocol, sslcontext, waiter, + server_side=server_side, server_hostname=server_hostname, + extra=extra, server=server) + + ssl_protocol = sslproto.SSLProtocol(self, protocol, sslcontext, waiter, + server_side, server_hostname) + _SelectorSocketTransport(self, rawsock, ssl_protocol, + extra=extra, server=server) + return ssl_protocol._app_transport + + def _make_legacy_ssl_transport(self, rawsock, protocol, sslcontext, + waiter, *, + server_side=False, server_hostname=None, + extra=None, server=None): + # Use the legacy API: SSL_write, SSL_read, etc. The legacy API is used + # on Python 3.4 and older, when ssl.MemoryBIO is not available. return _SelectorSslTransport( self, rawsock, protocol, sslcontext, waiter, server_side, server_hostname, extra, server) @@ -508,7 +528,8 @@ def _fatal_error(self, exc, message='Fatal error on transport'): # Should be called from exception handler only. 
- if isinstance(exc, (BrokenPipeError, ConnectionResetError)): + if isinstance(exc, (BrokenPipeError, + ConnectionResetError, ConnectionAbortedError)): if self._loop.get_debug(): logger.debug("%r: %s", self, message, exc_info=True) else: @@ -683,26 +704,8 @@ if ssl is None: raise RuntimeError('stdlib ssl module not available') - if server_side: - if not sslcontext: - raise ValueError('Server side ssl needs a valid SSLContext') - else: - if not sslcontext: - # Client side may pass ssl=True to use a default - # context; in that case the sslcontext passed is None. - # The default is secure for client connections. - if hasattr(ssl, 'create_default_context'): - # Python 3.4+: use up-to-date strong settings. - sslcontext = ssl.create_default_context() - if not server_hostname: - sslcontext.check_hostname = False - else: - # Fallback for Python 3.3. - sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - sslcontext.options |= ssl.OP_NO_SSLv2 - sslcontext.options |= ssl.OP_NO_SSLv3 - sslcontext.set_default_verify_paths() - sslcontext.verify_mode = ssl.CERT_REQUIRED + if not sslcontext: + sslcontext = sslproto._create_transport_context(server_side, server_hostname) wrap_kwargs = { 'server_side': server_side, diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py new file mode 100644 --- /dev/null +++ b/Lib/asyncio/sslproto.py @@ -0,0 +1,640 @@ +import collections +try: + import ssl +except ImportError: # pragma: no cover + ssl = None + +from . import protocols +from . import transports +from .log import logger + + +def _create_transport_context(server_side, server_hostname): + if server_side: + raise ValueError('Server side SSL needs a valid SSLContext') + + # Client side may pass ssl=True to use a default + # context; in that case the sslcontext passed is None. + # The default is secure for client connections. + if hasattr(ssl, 'create_default_context'): + # Python 3.4+: use up-to-date strong settings. + sslcontext = ssl.create_default_context() + if not server_hostname: + sslcontext.check_hostname = False + else: + # Fallback for Python 3.3. + sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + sslcontext.options |= ssl.OP_NO_SSLv2 + sslcontext.options |= ssl.OP_NO_SSLv3 + sslcontext.set_default_verify_paths() + sslcontext.verify_mode = ssl.CERT_REQUIRED + return sslcontext + + +def _is_sslproto_available(): + return hasattr(ssl, "MemoryBIO") + + +# States of an _SSLPipe. +_UNWRAPPED = "UNWRAPPED" +_DO_HANDSHAKE = "DO_HANDSHAKE" +_WRAPPED = "WRAPPED" +_SHUTDOWN = "SHUTDOWN" + + +class _SSLPipe(object): + """An SSL "Pipe". + + An SSL pipe allows you to communicate with an SSL/TLS protocol instance + through memory buffers. It can be used to implement a security layer for an + existing connection where you don't have access to the connection's file + descriptor, or for some reason you don't want to use it. + + An SSL pipe can be in "wrapped" and "unwrapped" mode. In unwrapped mode, + data is passed through untransformed. In wrapped mode, application level + data is encrypted to SSL record level data and vice versa. The SSL record + level is the lowest level in the SSL protocol suite and is what travels + as-is over the wire. + + An SslPipe initially is in "unwrapped" mode. To start SSL, call + do_handshake(). To shutdown SSL again, call unwrap(). + """ + + max_size = 256 * 1024 # Buffer size passed to read() + + def __init__(self, context, server_side, server_hostname=None): + """ + The *context* argument specifies the ssl.SSLContext to use. 
+ + The *server_side* argument indicates whether this is a server side or + client side transport. + + The optional *server_hostname* argument can be used to specify the + hostname you are connecting to. You may only specify this parameter if + the _ssl module supports Server Name Indication (SNI). + """ + self._context = context + self._server_side = server_side + self._server_hostname = server_hostname + self._state = _UNWRAPPED + self._incoming = ssl.MemoryBIO() + self._outgoing = ssl.MemoryBIO() + self._sslobj = None + self._need_ssldata = False + self._handshake_cb = None + self._shutdown_cb = None + + @property + def context(self): + """The SSL context passed to the constructor.""" + return self._context + + @property + def ssl_object(self): + """The internal ssl.SSLObject instance. + + Return None if the pipe is not wrapped. + """ + return self._sslobj + + @property + def need_ssldata(self): + """Whether more record level data is needed to complete a handshake + that is currently in progress.""" + return self._need_ssldata + + @property + def wrapped(self): + """ + Whether a security layer is currently in effect. + + Return False during handshake. + """ + return self._state == _WRAPPED + + def do_handshake(self, callback=None): + """Start the SSL handshake. + + Return a list of ssldata. A ssldata element is a list of buffers + + The optional *callback* argument can be used to install a callback that + will be called when the handshake is complete. The callback will be + called with None if successful, else an exception instance. + """ + if self._state != _UNWRAPPED: + raise RuntimeError('handshake in progress or completed') + self._sslobj = self._context.wrap_bio( + self._incoming, self._outgoing, + server_side=self._server_side, + server_hostname=self._server_hostname) + self._state = _DO_HANDSHAKE + self._handshake_cb = callback + ssldata, appdata = self.feed_ssldata(b'', only_handshake=True) + assert len(appdata) == 0 + return ssldata + + def shutdown(self, callback=None): + """Start the SSL shutdown sequence. + + Return a list of ssldata. A ssldata element is a list of buffers + + The optional *callback* argument can be used to install a callback that + will be called when the shutdown is complete. The callback will be + called without arguments. + """ + if self._state == _UNWRAPPED: + raise RuntimeError('no security layer present') + if self._state == _SHUTDOWN: + raise RuntimeError('shutdown in progress') + assert self._state in (_WRAPPED, _DO_HANDSHAKE) + self._state = _SHUTDOWN + self._shutdown_cb = callback + ssldata, appdata = self.feed_ssldata(b'') + assert appdata == [] or appdata == [b''] + return ssldata + + def feed_eof(self): + """Send a potentially "ragged" EOF. + + This method will raise an SSL_ERROR_EOF exception if the EOF is + unexpected. + """ + self._incoming.write_eof() + ssldata, appdata = self.feed_ssldata(b'') + assert appdata == [] or appdata == [b''] + + def feed_ssldata(self, data, only_handshake=False): + """Feed SSL record level data into the pipe. + + The data must be a bytes instance. It is OK to send an empty bytes + instance. This can be used to get ssldata for a handshake initiated by + this endpoint. + + Return a (ssldata, appdata) tuple. The ssldata element is a list of + buffers containing SSL data that needs to be sent to the remote SSL. + + The appdata element is a list of buffers containing plaintext data that + needs to be forwarded to the application. The appdata list may contain + an empty buffer indicating an SSL "close_notify" alert. 
This alert must + be acknowledged by calling shutdown(). + """ + if self._state == _UNWRAPPED: + # If unwrapped, pass plaintext data straight through. + if data: + appdata = [data] + else: + appdata = [] + return ([], appdata) + + self._need_ssldata = False + if data: + self._incoming.write(data) + + ssldata = [] + appdata = [] + try: + if self._state == _DO_HANDSHAKE: + # Call do_handshake() until it doesn't raise anymore. + self._sslobj.do_handshake() + self._state = _WRAPPED + if self._handshake_cb: + self._handshake_cb(None) + if only_handshake: + return (ssldata, appdata) + # Handshake done: execute the wrapped block + + if self._state == _WRAPPED: + # Main state: read data from SSL until close_notify + while True: + chunk = self._sslobj.read(self.max_size) + appdata.append(chunk) + if not chunk: # close_notify + break + + elif self._state == _SHUTDOWN: + # Call shutdown() until it doesn't raise anymore. + self._sslobj.unwrap() + self._sslobj = None + self._state = _UNWRAPPED + if self._shutdown_cb: + self._shutdown_cb() + + elif self._state == _UNWRAPPED: + # Drain possible plaintext data after close_notify. + appdata.append(self._incoming.read()) + except (ssl.SSLError, ssl.CertificateError) as exc: + if getattr(exc, 'errno', None) not in ( + ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE, + ssl.SSL_ERROR_SYSCALL): + if self._state == _DO_HANDSHAKE and self._handshake_cb: + self._handshake_cb(exc) + raise + self._need_ssldata = (exc.errno == ssl.SSL_ERROR_WANT_READ) + + # Check for record level data that needs to be sent back. + # Happens for the initial handshake and renegotiations. + if self._outgoing.pending: + ssldata.append(self._outgoing.read()) + return (ssldata, appdata) + + def feed_appdata(self, data, offset=0): + """Feed plaintext data into the pipe. + + Return an (ssldata, offset) tuple. The ssldata element is a list of + buffers containing record level data that needs to be sent to the + remote SSL instance. The offset is the number of plaintext bytes that + were processed, which may be less than the length of data. + + NOTE: In case of short writes, this call MUST be retried with the SAME + buffer passed into the *data* argument (i.e. the id() must be the + same). This is an OpenSSL requirement. A further particularity is that + a short write will always have offset == 0, because the _ssl module + does not enable partial writes. And even though the offset is zero, + there will still be encrypted data in ssldata. + """ + assert 0 <= offset <= len(data) + if self._state == _UNWRAPPED: + # pass through data in unwrapped mode + if offset < len(data): + ssldata = [data[offset:]] + else: + ssldata = [] + return (ssldata, len(data)) + + ssldata = [] + view = memoryview(data) + while True: + self._need_ssldata = False + try: + if offset < len(view): + offset += self._sslobj.write(view[offset:]) + except ssl.SSLError as exc: + # It is not allowed to call write() after unwrap() until the + # close_notify is acknowledged. We return the condition to the + # caller as a short write. + if exc.reason == 'PROTOCOL_IS_SHUTDOWN': + exc.errno = ssl.SSL_ERROR_WANT_READ + if exc.errno not in (ssl.SSL_ERROR_WANT_READ, + ssl.SSL_ERROR_WANT_WRITE, + ssl.SSL_ERROR_SYSCALL): + raise + self._need_ssldata = (exc.errno == ssl.SSL_ERROR_WANT_READ) + + # See if there's any record level data back for us. 
+ if self._outgoing.pending: + ssldata.append(self._outgoing.read()) + if offset == len(view) or self._need_ssldata: + break + return (ssldata, offset) + + +class _SSLProtocolTransport(transports._FlowControlMixin, + transports.Transport): + + def __init__(self, loop, ssl_protocol, app_protocol): + self._loop = loop + self._ssl_protocol = ssl_protocol + self._app_protocol = app_protocol + + def get_extra_info(self, name, default=None): + """Get optional transport information.""" + return self._ssl_protocol._get_extra_info(name, default) + + def close(self): + """Close the transport. + + Buffered data will be flushed asynchronously. No more data + will be received. After all buffered data is flushed, the + protocol's connection_lost() method will (eventually) called + with None as its argument. + """ + self._ssl_protocol._start_shutdown() + + def pause_reading(self): + """Pause the receiving end. + + No data will be passed to the protocol's data_received() + method until resume_reading() is called. + """ + self._ssl_protocol._transport.pause_reading() + + def resume_reading(self): + """Resume the receiving end. + + Data received will once again be passed to the protocol's + data_received() method. + """ + self._ssl_protocol._transport.resume_reading() + + def set_write_buffer_limits(self, high=None, low=None): + """Set the high- and low-water limits for write flow control. + + These two values control when to call the protocol's + pause_writing() and resume_writing() methods. If specified, + the low-water limit must be less than or equal to the + high-water limit. Neither value can be negative. + + The defaults are implementation-specific. If only the + high-water limit is given, the low-water limit defaults to a + implementation-specific value less than or equal to the + high-water limit. Setting high to zero forces low to zero as + well, and causes pause_writing() to be called whenever the + buffer becomes non-empty. Setting low to zero causes + resume_writing() to be called only once the buffer is empty. + Use of zero for either limit is generally sub-optimal as it + reduces opportunities for doing I/O and computation + concurrently. + """ + self._ssl_protocol._transport.set_write_buffer_limits(high, low) + + def get_write_buffer_size(self): + """Return the current size of the write buffer.""" + return self._ssl_protocol._transport.get_write_buffer_size() + + def write(self, data): + """Write some data bytes to the transport. + + This does not block; it buffers the data and arranges for it + to be sent out asynchronously. + """ + if not isinstance(data, (bytes, bytearray, memoryview)): + raise TypeError("data: expecting a bytes-like instance, got {!r}" + .format(type(data).__name__)) + if not data: + return + self._ssl_protocol._write_appdata(data) + + def can_write_eof(self): + """Return True if this transport supports write_eof(), False if not.""" + return False + + def abort(self): + """Close the transport immediately. + + Buffered data will be lost. No more data will be received. + The protocol's connection_lost() method will (eventually) be + called with None as its argument. + """ + self._ssl_protocol._abort() + + +class SSLProtocol(protocols.Protocol): + """SSL protocol. + + Implementation of SSL on top of a socket using incoming and outgoing + buffers which are ssl.MemoryBIO objects. 
+ """ + + def __init__(self, loop, app_protocol, sslcontext, waiter, + server_side=False, server_hostname=None): + if ssl is None: + raise RuntimeError('stdlib ssl module not available') + + if not sslcontext: + sslcontext = _create_transport_context(server_side, server_hostname) + + self._server_side = server_side + if server_hostname and not server_side: + self._server_hostname = server_hostname + else: + self._server_hostname = None + self._sslcontext = sslcontext + # SSL-specific extra info. More info are set when the handshake + # completes. + self._extra = dict(sslcontext=sslcontext) + + # App data write buffering + self._write_backlog = collections.deque() + self._write_buffer_size = 0 + + self._waiter = waiter + self._closing = False + self._loop = loop + self._app_protocol = app_protocol + self._app_transport = _SSLProtocolTransport(self._loop, + self, self._app_protocol) + self._sslpipe = None + self._session_established = False + self._in_handshake = False + self._in_shutdown = False + + def connection_made(self, transport): + """Called when the low-level connection is made. + + Start the SSL handshake. + """ + self._transport = transport + self._sslpipe = _SSLPipe(self._sslcontext, + self._server_side, + self._server_hostname) + self._start_handshake() + + def connection_lost(self, exc): + """Called when the low-level connection is lost or closed. + + The argument is an exception object or None (the latter + meaning a regular EOF is received or the connection was + aborted or closed). + """ + if self._session_established: + self._session_established = False + self._loop.call_soon(self._app_protocol.connection_lost, exc) + self._transport = None + self._app_transport = None + + def pause_writing(self): + """Called when the low-level transport's buffer goes over + the high-water mark. + """ + self._app_protocol.pause_writing() + + def resume_writing(self): + """Called when the low-level transport's buffer drains below + the low-water mark. + """ + self._app_protocol.resume_writing() + + def data_received(self, data): + """Called when some SSL data is received. + + The argument is a bytes object. + """ + try: + ssldata, appdata = self._sslpipe.feed_ssldata(data) + except ssl.SSLError as e: + if self._loop.get_debug(): + logger.warning('%r: SSL error %s (reason %s)', + self, e.errno, e.reason) + self._abort() + return + + for chunk in ssldata: + self._transport.write(chunk) + + for chunk in appdata: + if chunk: + self._app_protocol.data_received(chunk) + else: + self._start_shutdown() + break + + def eof_received(self): + """Called when the other end of the low-level stream + is half-closed. + + If this returns a false value (including None), the transport + will close itself. If it returns a true value, closing the + transport is up to the protocol. 
+ """ + try: + if self._loop.get_debug(): + logger.debug("%r received EOF", self) + if not self._in_handshake: + keep_open = self._app_protocol.eof_received() + if keep_open: + logger.warning('returning true from eof_received() ' + 'has no effect when using ssl') + finally: + self._transport.close() + + def _get_extra_info(self, name, default=None): + if name in self._extra: + return self._extra[name] + else: + return self._transport.get_extra_info(name, default) + + def _start_shutdown(self): + if self._in_shutdown: + return + self._in_shutdown = True + self._write_appdata(b'') + + def _write_appdata(self, data): + self._write_backlog.append((data, 0)) + self._write_buffer_size += len(data) + self._process_write_backlog() + + def _start_handshake(self): + if self._loop.get_debug(): + logger.debug("%r starts SSL handshake", self) + self._handshake_start_time = self._loop.time() + else: + self._handshake_start_time = None + self._in_handshake = True + # (b'', 1) is a special value in _process_write_backlog() to do + # the SSL handshake + self._write_backlog.append((b'', 1)) + self._loop.call_soon(self._process_write_backlog) + + def _on_handshake_complete(self, handshake_exc): + self._in_handshake = False + + sslobj = self._sslpipe.ssl_object + peercert = None if handshake_exc else sslobj.getpeercert() + try: + if handshake_exc is not None: + raise handshake_exc + if not hasattr(self._sslcontext, 'check_hostname'): + # Verify hostname if requested, Python 3.4+ uses check_hostname + # and checks the hostname in do_handshake() + if (self._server_hostname + and self._sslcontext.verify_mode != ssl.CERT_NONE): + ssl.match_hostname(peercert, self._server_hostname) + except BaseException as exc: + if self._loop.get_debug(): + if isinstance(exc, ssl.CertificateError): + logger.warning("%r: SSL handshake failed " + "on verifying the certificate", + self, exc_info=True) + else: + logger.warning("%r: SSL handshake failed", + self, exc_info=True) + self._transport.close() + if isinstance(exc, Exception): + if self._waiter is not None: + self._waiter.set_exception(exc) + return + else: + raise + + if self._loop.get_debug(): + dt = self._loop.time() - self._handshake_start_time + logger.debug("%r: SSL handshake took %.1f ms", self, dt * 1e3) + + # Add extra info that becomes available after handshake. + self._extra.update(peercert=peercert, + cipher=sslobj.cipher(), + compression=sslobj.compression(), + ) + self._app_protocol.connection_made(self._app_transport) + if self._waiter is not None: + # wait until protocol.connection_made() has been called + self._waiter._set_result_unless_cancelled(None) + self._session_established = True + # In case transport.write() was already called + self._process_write_backlog() + + def _process_write_backlog(self): + # Try to make progress on the write backlog. + if self._transport is None: + return + + try: + for i in range(len(self._write_backlog)): + data, offset = self._write_backlog[0] + if data: + ssldata, offset = self._sslpipe.feed_appdata(data, offset) + elif offset: + ssldata = self._sslpipe.do_handshake(self._on_handshake_complete) + offset = 1 + else: + ssldata = self._sslpipe.shutdown(self._finalize) + offset = 1 + + for chunk in ssldata: + self._transport.write(chunk) + + if offset < len(data): + self._write_backlog[0] = (data, offset) + # A short write means that a write is blocked on a read + # We need to enable reading if it is paused! 
+ assert self._sslpipe.need_ssldata + if self._transport._paused: + self._transport.resume_reading() + break + + # An entire chunk from the backlog was processed. We can + # delete it and reduce the outstanding buffer size. + del self._write_backlog[0] + self._write_buffer_size -= len(data) + except BaseException as exc: + if self._in_handshake: + self._on_handshake_complete(exc) + else: + self._fatal_error(exc, 'Fatal error on SSL transport') + + def _fatal_error(self, exc, message='Fatal error on transport'): + # Should be called from exception handler only. + if isinstance(exc, (BrokenPipeError, ConnectionResetError)): + if self._loop.get_debug(): + logger.debug("%r: %s", self, message, exc_info=True) + else: + self._loop.call_exception_handler({ + 'message': message, + 'exception': exc, + 'transport': self._transport, + 'protocol': self, + }) + if self._transport: + self._transport._force_close(exc) + + def _finalize(self): + if self._transport is not None: + self._transport.close() + + def _abort(self): + if self._transport is not None: + try: + self._transport.abort() + finally: + self._finalize() diff --git a/Lib/asyncio/test_utils.py b/Lib/asyncio/test_utils.py --- a/Lib/asyncio/test_utils.py +++ b/Lib/asyncio/test_utils.py @@ -434,3 +434,8 @@ sock = mock.Mock(socket.socket) sock.gettimeout.return_value = 0.0 return sock + + +def force_legacy_ssl_support(): + return mock.patch('asyncio.sslproto._is_sslproto_available', + return_value=False) diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -650,6 +650,10 @@ *httpd.address) self._test_create_ssl_connection(httpd, create_connection) + def test_legacy_create_ssl_connection(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_ssl_connection() + @unittest.skipIf(ssl is None, 'No ssl module') @unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'No UNIX Sockets') def test_create_ssl_unix_connection(self): @@ -666,6 +670,10 @@ self._test_create_ssl_connection(httpd, create_connection, check_sockname) + def test_legacy_create_ssl_unix_connection(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_ssl_unix_connection() + def test_create_connection_local_addr(self): with test_utils.run_test_server() as httpd: port = support.find_unused_port() @@ -826,6 +834,10 @@ # stop serving server.close() + def test_legacy_create_server_ssl(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_server_ssl() + @unittest.skipIf(ssl is None, 'No ssl module') @unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'No UNIX Sockets') def test_create_unix_server_ssl(self): @@ -857,6 +869,10 @@ # stop serving server.close() + def test_legacy_create_unix_server_ssl(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_unix_server_ssl() + @unittest.skipIf(ssl is None, 'No ssl module') def test_create_server_ssl_verify_failed(self): proto = MyProto(loop=self.loop) @@ -881,6 +897,10 @@ self.assertIsNone(proto.transport) server.close() + def test_legacy_create_server_ssl_verify_failed(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_server_ssl_verify_failed() + @unittest.skipIf(ssl is None, 'No ssl module') @unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'No UNIX Sockets') def test_create_unix_server_ssl_verify_failed(self): @@ -907,6 +927,10 @@ self.assertIsNone(proto.transport) server.close() + def test_legacy_create_unix_server_ssl_verify_failed(self): 
+ with test_utils.force_legacy_ssl_support(): + self.test_create_unix_server_ssl_verify_failed() + @unittest.skipIf(ssl is None, 'No ssl module') def test_create_server_ssl_match_failed(self): proto = MyProto(loop=self.loop) @@ -934,6 +958,10 @@ proto.transport.close() server.close() + def test_legacy_create_server_ssl_match_failed(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_server_ssl_match_failed() + @unittest.skipIf(ssl is None, 'No ssl module') @unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'No UNIX Sockets') def test_create_unix_server_ssl_verified(self): @@ -958,6 +986,11 @@ proto.transport.close() client.close() server.close() + self.loop.run_until_complete(proto.done) + + def test_legacy_create_unix_server_ssl_verified(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_unix_server_ssl_verified() @unittest.skipIf(ssl is None, 'No ssl module') def test_create_server_ssl_verified(self): @@ -982,6 +1015,11 @@ proto.transport.close() client.close() server.close() + self.loop.run_until_complete(proto.done) + + def test_legacy_create_server_ssl_verified(self): + with test_utils.force_legacy_ssl_support(): + self.test_create_server_ssl_verified() def test_create_server_sock(self): proto = asyncio.Future(loop=self.loop) @@ -1746,20 +1784,20 @@ def create_event_loop(self): return asyncio.ProactorEventLoop() - def test_create_ssl_connection(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") - - def test_create_server_ssl(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") - - def test_create_server_ssl_verify_failed(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") - - def test_create_server_ssl_match_failed(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") - - def test_create_server_ssl_verified(self): - raise unittest.SkipTest("IocpEventLoop incompatible with SSL") + def test_legacy_create_ssl_connection(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") + + def test_legacy_create_server_ssl(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") + + def test_legacy_create_server_ssl_verify_failed(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") + + def test_legacy_create_server_ssl_match_failed(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") + + def test_legacy_create_server_ssl_verified(self): + raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") def test_reader_callback(self): raise unittest.SkipTest("IocpEventLoop does not have add_reader()") diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -59,9 +59,13 @@ with test_utils.disable_logger(): transport = self.loop._make_ssl_transport( m, asyncio.Protocol(), m, waiter) - self.assertIsInstance(transport, _SelectorSslTransport) + # Sanity check + class_name = transport.__class__.__name__ + self.assertIn("ssl", class_name.lower()) + self.assertIn("transport", class_name.lower()) @mock.patch('asyncio.selector_events.ssl', None) + @mock.patch('asyncio.sslproto.ssl', None) def test_make_ssl_transport_without_ssl_error(self): m = mock.Mock() self.loop.add_reader = mock.Mock() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 00:30:48 2015 From: python-checkins at python.org 
(victor.stinner) Date: Tue, 13 Jan 2015 23:30:48 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2322560=2C_asyncio_?= =?utf-8?q?doc=3A_ProactorEventLoop_now_supports_SSL!?= Message-ID: <20150113233029.72575.56307@psf.io> https://hg.python.org/cpython/rev/b9fbbe7103e7 changeset: 94130:b9fbbe7103e7 user: Victor Stinner date: Wed Jan 14 00:30:22 2015 +0100 summary: Issue #22560, asyncio doc: ProactorEventLoop now supports SSL! files: Doc/library/asyncio-eventloops.rst | 6 ++++-- 1 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Doc/library/asyncio-eventloops.rst b/Doc/library/asyncio-eventloops.rst --- a/Doc/library/asyncio-eventloops.rst +++ b/Doc/library/asyncio-eventloops.rst @@ -100,8 +100,6 @@ :class:`ProactorEventLoop` specific limits: -- SSL is not supported: :meth:`~BaseEventLoop.create_connection` and - :meth:`~BaseEventLoop.create_server` cannot be used with SSL for example - :meth:`~BaseEventLoop.create_datagram_endpoint` (UDP) is not supported - :meth:`~BaseEventLoop.add_reader` and :meth:`~BaseEventLoop.add_writer` are not supported @@ -112,6 +110,10 @@ `_) and on the Windows configuration. See :ref:`asyncio delayed calls `. +.. versionchanged:: 3.5 + + :class:`ProactorEventLoop` now supports SSL. + Mac OS X ^^^^^^^^ -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 00:54:45 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 13 Jan 2015 23:54:45 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150113235437.22417.55706@psf.io> https://hg.python.org/cpython/rev/9e5d35ee0903 changeset: 94132:9e5d35ee0903 parent: 94130:b9fbbe7103e7 parent: 94131:94a6f9a3580e user: Victor Stinner date: Wed Jan 14 00:54:00 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/streams.py | 47 ++++++++++++++--------------- 1 files changed, 22 insertions(+), 25 deletions(-) diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -313,8 +313,8 @@ else: self._loop = loop self._buffer = bytearray() - self._eof = False # Whether we're done. - self._waiter = None # A future. + self._eof = False # Whether we're done. 
+ self._waiter = None # A future used by _wait_for_data() self._exception = None self._transport = None self._paused = False @@ -331,6 +331,14 @@ if not waiter.cancelled(): waiter.set_exception(exc) + def _wakeup_waiter(self): + """Wakeup read() or readline() function waiting for data or EOF.""" + waiter = self._waiter + if waiter is not None: + self._waiter = None + if not waiter.cancelled(): + waiter.set_result(None) + def set_transport(self, transport): assert self._transport is None, 'Transport already set' self._transport = transport @@ -342,11 +350,7 @@ def feed_eof(self): self._eof = True - waiter = self._waiter - if waiter is not None: - self._waiter = None - if not waiter.cancelled(): - waiter.set_result(True) + self._wakeup_waiter() def at_eof(self): """Return True if the buffer is empty and 'feed_eof' was called.""" @@ -359,12 +363,7 @@ return self._buffer.extend(data) - - waiter = self._waiter - if waiter is not None: - self._waiter = None - if not waiter.cancelled(): - waiter.set_result(False) + self._wakeup_waiter() if (self._transport is not None and not self._paused and @@ -379,7 +378,8 @@ else: self._paused = True - def _create_waiter(self, func_name): + def _wait_for_data(self, func_name): + """Wait until feed_data() or feed_eof() is called.""" # StreamReader uses a future to link the protocol feed_data() method # to a read coroutine. Running two read coroutines at the same time # would have an unexpected behaviour. It would not possible to know @@ -387,7 +387,12 @@ if self._waiter is not None: raise RuntimeError('%s() called while another coroutine is ' 'already waiting for incoming data' % func_name) - return futures.Future(loop=self._loop) + + self._waiter = futures.Future(loop=self._loop) + try: + yield from self._waiter + finally: + self._waiter = None @coroutine def readline(self): @@ -417,11 +422,7 @@ break if not_enough: - self._waiter = self._create_waiter('readline') - try: - yield from self._waiter - finally: - self._waiter = None + yield from self._wait_for_data('readline') self._maybe_resume_transport() return bytes(line) @@ -448,11 +449,7 @@ return b''.join(blocks) else: if not self._buffer and not self._eof: - self._waiter = self._create_waiter('read') - try: - yield from self._waiter - finally: - self._waiter = None + yield from self._wait_for_data('read') if n < 0 or len(self._buffer) <= n: data = bytes(self._buffer) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 00:54:45 2015 From: python-checkins at python.org (victor.stinner) Date: Tue, 13 Jan 2015 23:54:45 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMTk4?= =?utf-8?q?=3A_Reactor_asyncio=2EStreamReader?= Message-ID: <20150113235437.22405.80279@psf.io> https://hg.python.org/cpython/rev/94a6f9a3580e changeset: 94131:94a6f9a3580e branch: 3.4 parent: 94128:432b817611f2 user: Victor Stinner date: Wed Jan 14 00:53:37 2015 +0100 summary: Issue #23198: Reactor asyncio.StreamReader - Add a new _wakeup_waiter() method - Replace _create_waiter() method with a _wait_for_data() coroutine function - Use the value None instead of True or False to wake up the waiter files: Lib/asyncio/streams.py | 47 ++++++++++++++--------------- 1 files changed, 22 insertions(+), 25 deletions(-) diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -313,8 +313,8 @@ else: self._loop = loop self._buffer = bytearray() - self._eof = False # Whether we're done. - self._waiter = None # A future. 
+ self._eof = False # Whether we're done. + self._waiter = None # A future used by _wait_for_data() self._exception = None self._transport = None self._paused = False @@ -331,6 +331,14 @@ if not waiter.cancelled(): waiter.set_exception(exc) + def _wakeup_waiter(self): + """Wakeup read() or readline() function waiting for data or EOF.""" + waiter = self._waiter + if waiter is not None: + self._waiter = None + if not waiter.cancelled(): + waiter.set_result(None) + def set_transport(self, transport): assert self._transport is None, 'Transport already set' self._transport = transport @@ -342,11 +350,7 @@ def feed_eof(self): self._eof = True - waiter = self._waiter - if waiter is not None: - self._waiter = None - if not waiter.cancelled(): - waiter.set_result(True) + self._wakeup_waiter() def at_eof(self): """Return True if the buffer is empty and 'feed_eof' was called.""" @@ -359,12 +363,7 @@ return self._buffer.extend(data) - - waiter = self._waiter - if waiter is not None: - self._waiter = None - if not waiter.cancelled(): - waiter.set_result(False) + self._wakeup_waiter() if (self._transport is not None and not self._paused and @@ -379,7 +378,8 @@ else: self._paused = True - def _create_waiter(self, func_name): + def _wait_for_data(self, func_name): + """Wait until feed_data() or feed_eof() is called.""" # StreamReader uses a future to link the protocol feed_data() method # to a read coroutine. Running two read coroutines at the same time # would have an unexpected behaviour. It would not possible to know @@ -387,7 +387,12 @@ if self._waiter is not None: raise RuntimeError('%s() called while another coroutine is ' 'already waiting for incoming data' % func_name) - return futures.Future(loop=self._loop) + + self._waiter = futures.Future(loop=self._loop) + try: + yield from self._waiter + finally: + self._waiter = None @coroutine def readline(self): @@ -417,11 +422,7 @@ break if not_enough: - self._waiter = self._create_waiter('readline') - try: - yield from self._waiter - finally: - self._waiter = None + yield from self._wait_for_data('readline') self._maybe_resume_transport() return bytes(line) @@ -448,11 +449,7 @@ return b''.join(blocks) else: if not self._buffer and not self._eof: - self._waiter = self._create_waiter('read') - try: - yield from self._waiter - finally: - self._waiter = None + yield from self._wait_for_data('read') if n < 0 or len(self._buffer) <= n: data = bytes(self._buffer) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 02:15:03 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 01:15:03 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150114011446.125880.57496@psf.io> https://hg.python.org/cpython/rev/c47fe739f9dd changeset: 94134:c47fe739f9dd parent: 94132:9e5d35ee0903 parent: 94133:1eae3b6fbec6 user: Victor Stinner date: Wed Jan 14 02:13:51 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/base_subprocess.py | 73 ++++++++--- Lib/asyncio/subprocess.py | 16 ++- Lib/test/test_asyncio/test_subprocess.py | 36 +++++ 3 files changed, 100 insertions(+), 25 deletions(-) diff --git a/Lib/asyncio/base_subprocess.py b/Lib/asyncio/base_subprocess.py --- a/Lib/asyncio/base_subprocess.py +++ b/Lib/asyncio/base_subprocess.py @@ -96,32 +96,61 @@ def kill(self): self._proc.kill() + def _kill_wait(self): + """Close pipes, kill the subprocess and read its return status. 
+ + Function called when an exception is raised during the creation + of a subprocess. + """ + if self._loop.get_debug(): + logger.warning('Exception during subprocess creation, ' + 'kill the subprocess %r', + self, + exc_info=True) + + proc = self._proc + if proc.stdout: + proc.stdout.close() + if proc.stderr: + proc.stderr.close() + if proc.stdin: + proc.stdin.close() + try: + proc.kill() + except ProcessLookupError: + pass + proc.wait() + @coroutine def _post_init(self): - proc = self._proc - loop = self._loop - if proc.stdin is not None: - _, pipe = yield from loop.connect_write_pipe( - lambda: WriteSubprocessPipeProto(self, 0), - proc.stdin) - self._pipes[0] = pipe - if proc.stdout is not None: - _, pipe = yield from loop.connect_read_pipe( - lambda: ReadSubprocessPipeProto(self, 1), - proc.stdout) - self._pipes[1] = pipe - if proc.stderr is not None: - _, pipe = yield from loop.connect_read_pipe( - lambda: ReadSubprocessPipeProto(self, 2), - proc.stderr) - self._pipes[2] = pipe + try: + proc = self._proc + loop = self._loop + if proc.stdin is not None: + _, pipe = yield from loop.connect_write_pipe( + lambda: WriteSubprocessPipeProto(self, 0), + proc.stdin) + self._pipes[0] = pipe + if proc.stdout is not None: + _, pipe = yield from loop.connect_read_pipe( + lambda: ReadSubprocessPipeProto(self, 1), + proc.stdout) + self._pipes[1] = pipe + if proc.stderr is not None: + _, pipe = yield from loop.connect_read_pipe( + lambda: ReadSubprocessPipeProto(self, 2), + proc.stderr) + self._pipes[2] = pipe - assert self._pending_calls is not None + assert self._pending_calls is not None - self._loop.call_soon(self._protocol.connection_made, self) - for callback, data in self._pending_calls: - self._loop.call_soon(callback, *data) - self._pending_calls = None + self._loop.call_soon(self._protocol.connection_made, self) + for callback, data in self._pending_calls: + self._loop.call_soon(callback, *data) + self._pending_calls = None + except: + self._kill_wait() + raise def _call(self, cb, *data): if self._pending_calls is not None: diff --git a/Lib/asyncio/subprocess.py b/Lib/asyncio/subprocess.py --- a/Lib/asyncio/subprocess.py +++ b/Lib/asyncio/subprocess.py @@ -60,7 +60,9 @@ protocol=self, reader=None, loop=self._loop) - self.waiter.set_result(None) + + if not self.waiter.cancelled(): + self.waiter.set_result(None) def pipe_data_received(self, fd, data): if fd == 1: @@ -216,7 +218,11 @@ protocol_factory, cmd, stdin=stdin, stdout=stdout, stderr=stderr, **kwds) - yield from protocol.waiter + try: + yield from protocol.waiter + except: + transport._kill_wait() + raise return Process(transport, protocol, loop) @coroutine @@ -232,5 +238,9 @@ program, *args, stdin=stdin, stdout=stdout, stderr=stderr, **kwds) - yield from protocol.waiter + try: + yield from protocol.waiter + except: + transport._kill_wait() + raise return Process(transport, protocol, loop) diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -251,6 +251,42 @@ self.loop.run_until_complete(cancel_wait()) + def test_cancel_make_subprocess_transport_exec(self): + @asyncio.coroutine + def cancel_make_transport(): + coro = asyncio.create_subprocess_exec(*PROGRAM_BLOCKED, + loop=self.loop) + task = self.loop.create_task(coro) + + self.loop.call_soon(task.cancel) + try: + yield from task + except asyncio.CancelledError: + pass + + # ignore the log: + # "Exception during subprocess creation, kill the 
subprocess" + with test_utils.disable_logger(): + self.loop.run_until_complete(cancel_make_transport()) + + def test_cancel_post_init(self): + @asyncio.coroutine + def cancel_make_transport(): + coro = self.loop.subprocess_exec(asyncio.SubprocessProtocol, + *PROGRAM_BLOCKED) + task = self.loop.create_task(coro) + + self.loop.call_soon(task.cancel) + try: + yield from task + except asyncio.CancelledError: + pass + + # ignore the log: + # "Exception during subprocess creation, kill the subprocess" + with test_utils.disable_logger(): + self.loop.run_until_complete(cancel_make_transport()) + if sys.platform != 'win32': # Unix -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 02:15:03 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 01:15:03 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_Python_issue_?= =?utf-8?q?=2323173=3A_sync_with_Tulip?= Message-ID: <20150114011446.125886.76755@psf.io> https://hg.python.org/cpython/rev/1eae3b6fbec6 changeset: 94133:1eae3b6fbec6 branch: 3.4 parent: 94131:94a6f9a3580e user: Victor Stinner date: Wed Jan 14 02:10:33 2015 +0100 summary: Python issue #23173: sync with Tulip * If an exception is raised during the creation of a subprocess, kill the subprocess (close pipes, kill and read the return status). Log an error in such case. * Fix SubprocessStreamProtocol.connection_made() to handle cancelled waiter. Add unit test cancelling subprocess methods. files: Lib/asyncio/base_subprocess.py | 73 ++++++++--- Lib/asyncio/subprocess.py | 16 ++- Lib/test/test_asyncio/test_subprocess.py | 36 +++++ 3 files changed, 100 insertions(+), 25 deletions(-) diff --git a/Lib/asyncio/base_subprocess.py b/Lib/asyncio/base_subprocess.py --- a/Lib/asyncio/base_subprocess.py +++ b/Lib/asyncio/base_subprocess.py @@ -96,32 +96,61 @@ def kill(self): self._proc.kill() + def _kill_wait(self): + """Close pipes, kill the subprocess and read its return status. + + Function called when an exception is raised during the creation + of a subprocess. 
+ """ + if self._loop.get_debug(): + logger.warning('Exception during subprocess creation, ' + 'kill the subprocess %r', + self, + exc_info=True) + + proc = self._proc + if proc.stdout: + proc.stdout.close() + if proc.stderr: + proc.stderr.close() + if proc.stdin: + proc.stdin.close() + try: + proc.kill() + except ProcessLookupError: + pass + proc.wait() + @coroutine def _post_init(self): - proc = self._proc - loop = self._loop - if proc.stdin is not None: - _, pipe = yield from loop.connect_write_pipe( - lambda: WriteSubprocessPipeProto(self, 0), - proc.stdin) - self._pipes[0] = pipe - if proc.stdout is not None: - _, pipe = yield from loop.connect_read_pipe( - lambda: ReadSubprocessPipeProto(self, 1), - proc.stdout) - self._pipes[1] = pipe - if proc.stderr is not None: - _, pipe = yield from loop.connect_read_pipe( - lambda: ReadSubprocessPipeProto(self, 2), - proc.stderr) - self._pipes[2] = pipe + try: + proc = self._proc + loop = self._loop + if proc.stdin is not None: + _, pipe = yield from loop.connect_write_pipe( + lambda: WriteSubprocessPipeProto(self, 0), + proc.stdin) + self._pipes[0] = pipe + if proc.stdout is not None: + _, pipe = yield from loop.connect_read_pipe( + lambda: ReadSubprocessPipeProto(self, 1), + proc.stdout) + self._pipes[1] = pipe + if proc.stderr is not None: + _, pipe = yield from loop.connect_read_pipe( + lambda: ReadSubprocessPipeProto(self, 2), + proc.stderr) + self._pipes[2] = pipe - assert self._pending_calls is not None + assert self._pending_calls is not None - self._loop.call_soon(self._protocol.connection_made, self) - for callback, data in self._pending_calls: - self._loop.call_soon(callback, *data) - self._pending_calls = None + self._loop.call_soon(self._protocol.connection_made, self) + for callback, data in self._pending_calls: + self._loop.call_soon(callback, *data) + self._pending_calls = None + except: + self._kill_wait() + raise def _call(self, cb, *data): if self._pending_calls is not None: diff --git a/Lib/asyncio/subprocess.py b/Lib/asyncio/subprocess.py --- a/Lib/asyncio/subprocess.py +++ b/Lib/asyncio/subprocess.py @@ -60,7 +60,9 @@ protocol=self, reader=None, loop=self._loop) - self.waiter.set_result(None) + + if not self.waiter.cancelled(): + self.waiter.set_result(None) def pipe_data_received(self, fd, data): if fd == 1: @@ -216,7 +218,11 @@ protocol_factory, cmd, stdin=stdin, stdout=stdout, stderr=stderr, **kwds) - yield from protocol.waiter + try: + yield from protocol.waiter + except: + transport._kill_wait() + raise return Process(transport, protocol, loop) @coroutine @@ -232,5 +238,9 @@ program, *args, stdin=stdin, stdout=stdout, stderr=stderr, **kwds) - yield from protocol.waiter + try: + yield from protocol.waiter + except: + transport._kill_wait() + raise return Process(transport, protocol, loop) diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -251,6 +251,42 @@ self.loop.run_until_complete(cancel_wait()) + def test_cancel_make_subprocess_transport_exec(self): + @asyncio.coroutine + def cancel_make_transport(): + coro = asyncio.create_subprocess_exec(*PROGRAM_BLOCKED, + loop=self.loop) + task = self.loop.create_task(coro) + + self.loop.call_soon(task.cancel) + try: + yield from task + except asyncio.CancelledError: + pass + + # ignore the log: + # "Exception during subprocess creation, kill the subprocess" + with test_utils.disable_logger(): + 
self.loop.run_until_complete(cancel_make_transport()) + + def test_cancel_post_init(self): + @asyncio.coroutine + def cancel_make_transport(): + coro = self.loop.subprocess_exec(asyncio.SubprocessProtocol, + *PROGRAM_BLOCKED) + task = self.loop.create_task(coro) + + self.loop.call_soon(task.cancel) + try: + yield from task + except asyncio.CancelledError: + pass + + # ignore the log: + # "Exception during subprocess creation, kill the subprocess" + with test_utils.disable_logger(): + self.loop.run_until_complete(cancel_make_transport()) + if sys.platform != 'win32': # Unix -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 07:57:51 2015 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 14 Jan 2015 06:57:51 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_No_need_to_rebuild_a_const?= =?utf-8?q?ant_dictionary_on_every_call=2E__Move_convert_mapping?= Message-ID: <20150114065744.72555.53906@psf.io> https://hg.python.org/cpython/rev/f99ed1331952 changeset: 94135:f99ed1331952 user: Raymond Hettinger date: Tue Jan 13 22:57:35 2015 -0800 summary: No need to rebuild a constant dictionary on every call. Move convert mapping to module level. files: Lib/functools.py | 33 +++++++++++++++++---------------- 1 files changed, 17 insertions(+), 16 deletions(-) diff --git a/Lib/functools.py b/Lib/functools.py --- a/Lib/functools.py +++ b/Lib/functools.py @@ -174,28 +174,29 @@ return op_result return not op_result +_convert = { + '__lt__': [('__gt__', _gt_from_lt), + ('__le__', _le_from_lt), + ('__ge__', _ge_from_lt)], + '__le__': [('__ge__', _ge_from_le), + ('__lt__', _lt_from_le), + ('__gt__', _gt_from_le)], + '__gt__': [('__lt__', _lt_from_gt), + ('__ge__', _ge_from_gt), + ('__le__', _le_from_gt)], + '__ge__': [('__le__', _le_from_ge), + ('__gt__', _gt_from_ge), + ('__lt__', _lt_from_ge)] +} + def total_ordering(cls): """Class decorator that fills in missing ordering methods""" - convert = { - '__lt__': [('__gt__', _gt_from_lt), - ('__le__', _le_from_lt), - ('__ge__', _ge_from_lt)], - '__le__': [('__ge__', _ge_from_le), - ('__lt__', _lt_from_le), - ('__gt__', _gt_from_le)], - '__gt__': [('__lt__', _lt_from_gt), - ('__ge__', _ge_from_gt), - ('__le__', _le_from_gt)], - '__ge__': [('__le__', _le_from_ge), - ('__gt__', _gt_from_ge), - ('__lt__', _lt_from_ge)] - } # Find user-defined comparisons (not those inherited from object). 
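For context on the function being touched: total_ordering fills in the comparison methods a class does not define, starting from the one it does define plus __eq__, and the _convert table above is the lookup it uses. A small usage sketch (not from the patch; the Version class is invented for illustration):

from functools import total_ordering

@total_ordering
class Version:
    def __init__(self, major, minor):
        self.key = (major, minor)
    def __eq__(self, other):
        return self.key == other.key
    def __lt__(self, other):          # the "root" operation; the rest is derived from it
        return self.key < other.key

print(Version(3, 4) <= Version(3, 5))   # True, __le__ was filled in
print(Version(3, 5) >= Version(3, 4))   # True, __ge__ was filled in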
- roots = [op for op in convert if getattr(cls, op, None) is not getattr(object, op, None)] + roots = [op for op in _convert if getattr(cls, op, None) is not getattr(object, op, None)] if not roots: raise ValueError('must define at least one ordering operation: < > <= >=') root = max(roots) # prefer __lt__ to __le__ to __gt__ to __ge__ - for opname, opfunc in convert[root]: + for opname, opfunc in _convert[root]: if opname not in roots: opfunc.__name__ = opname setattr(cls, opname, opfunc) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 08:26:46 2015 From: python-checkins at python.org (georg.brandl) Date: Wed, 14 Jan 2015 07:26:46 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Closes_=2323181=3A_codepoi?= =?utf-8?q?nt_-=3E_code_point?= Message-ID: <20150114072635.125880.66551@psf.io> https://hg.python.org/cpython/rev/c917ba25c007 changeset: 94136:c917ba25c007 user: Georg Brandl date: Wed Jan 14 08:26:30 2015 +0100 summary: Closes #23181: codepoint -> code point files: Doc/c-api/unicode.rst | 2 +- Doc/library/codecs.rst | 12 ++++++------ Doc/library/email.mime.rst | 2 +- Doc/library/functions.rst | 2 +- Doc/library/html.entities.rst | 4 ++-- Doc/tutorial/datastructures.rst | 2 +- Doc/whatsnew/3.3.rst | 18 +++++++++--------- 7 files changed, 21 insertions(+), 21 deletions(-) diff --git a/Doc/c-api/unicode.rst b/Doc/c-api/unicode.rst --- a/Doc/c-api/unicode.rst +++ b/Doc/c-api/unicode.rst @@ -1141,7 +1141,7 @@ mark (U+FEFF). In the other two modes, no BOM mark is prepended. If *Py_UNICODE_WIDE* is not defined, surrogate pairs will be output - as a single codepoint. + as a single code point. Return *NULL* if an exception was raised by the codec. diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -841,7 +841,7 @@ Encodings and Unicode --------------------- -Strings are stored internally as sequences of codepoints in +Strings are stored internally as sequences of code points in range ``0x0``-``0x10FFFF``. (See :pep:`393` for more details about the implementation.) Once a string object is used outside of CPU and memory, endianness @@ -852,23 +852,23 @@ collectivity referred to as :term:`text encodings `. The simplest text encoding (called ``'latin-1'`` or ``'iso-8859-1'``) maps -the codepoints 0-255 to the bytes ``0x0``-``0xff``, which means that a string -object that contains codepoints above ``U+00FF`` can't be encoded with this +the code points 0-255 to the bytes ``0x0``-``0xff``, which means that a string +object that contains code points above ``U+00FF`` can't be encoded with this codec. Doing so will raise a :exc:`UnicodeEncodeError` that looks like the following (although the details of the error message may differ): ``UnicodeEncodeError: 'latin-1' codec can't encode character '\u1234' in position 3: ordinal not in range(256)``. There's another group of encodings (the so called charmap encodings) that choose -a different subset of all Unicode code points and how these codepoints are +a different subset of all Unicode code points and how these code points are mapped to the bytes ``0x0``-``0xff``. To see how this is done simply open e.g. :file:`encodings/cp1252.py` (which is an encoding that is used primarily on Windows). There's a string constant with 256 characters that shows you which character is mapped to which byte value. 
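The distinction the documentation fix is making is easy to show: a code point is the integer identity of a character, independent of how, or whether, it can be encoded to bytes in a given codec. A quick illustration (not part of the doc change):

s = "\u03c0"                       # GREEK SMALL LETTER PI
print(ord(s))                      # 960, the code point U+03C0
print(s.encode("utf-8"))           # b'\xcf\x80', two bytes in this encoding
try:
    s.encode("latin-1")            # latin-1 only covers code points 0-255
except UnicodeEncodeError as exc:
    print(exc)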
-All of these encodings can only encode 256 of the 1114112 codepoints +All of these encodings can only encode 256 of the 1114112 code points defined in Unicode. A simple and straightforward way that can store each Unicode -code point, is to store each codepoint as four consecutive bytes. There are two +code point, is to store each code point as four consecutive bytes. There are two possibilities: store the bytes in big endian or in little endian order. These two encodings are called ``UTF-32-BE`` and ``UTF-32-LE`` respectively. Their disadvantage is that if e.g. you use ``UTF-32-BE`` on a little endian machine you diff --git a/Doc/library/email.mime.rst b/Doc/library/email.mime.rst --- a/Doc/library/email.mime.rst +++ b/Doc/library/email.mime.rst @@ -194,7 +194,7 @@ minor type and defaults to :mimetype:`plain`. *_charset* is the character set of the text and is passed as an argument to the :class:`~email.mime.nonmultipart.MIMENonMultipart` constructor; it defaults - to ``us-ascii`` if the string contains only ``ascii`` codepoints, and + to ``us-ascii`` if the string contains only ``ascii`` code points, and ``utf-8`` otherwise. The *_charset* parameter accepts either a string or a :class:`~email.charset.Charset` instance. diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -156,7 +156,7 @@ .. function:: chr(i) - Return the string representing a character whose Unicode codepoint is the + Return the string representing a character whose Unicode code point is the integer *i*. For example, ``chr(97)`` returns the string ``'a'``, while ``chr(931)`` returns the string ``'?'``. This is the inverse of :func:`ord`. diff --git a/Doc/library/html.entities.rst b/Doc/library/html.entities.rst --- a/Doc/library/html.entities.rst +++ b/Doc/library/html.entities.rst @@ -33,12 +33,12 @@ .. data:: name2codepoint - A dictionary that maps HTML entity names to the Unicode codepoints. + A dictionary that maps HTML entity names to the Unicode code points. .. data:: codepoint2name - A dictionary that maps Unicode codepoints to HTML entity names. + A dictionary that maps Unicode code points to HTML entity names. .. rubric:: Footnotes diff --git a/Doc/tutorial/datastructures.rst b/Doc/tutorial/datastructures.rst --- a/Doc/tutorial/datastructures.rst +++ b/Doc/tutorial/datastructures.rst @@ -685,7 +685,7 @@ all items of two sequences compare equal, the sequences are considered equal. If one sequence is an initial sub-sequence of the other, the shorter sequence is the smaller (lesser) one. Lexicographical ordering for strings uses the Unicode -codepoint number to order individual characters. Some examples of comparisons +code point number to order individual characters. Some examples of comparisons between sequences of the same type:: (1, 2, 3) < (1, 2, 4) diff --git a/Doc/whatsnew/3.3.rst b/Doc/whatsnew/3.3.rst --- a/Doc/whatsnew/3.3.rst +++ b/Doc/whatsnew/3.3.rst @@ -228,7 +228,7 @@ Changes introduced by :pep:`393` are the following: -* Python now always supports the full range of Unicode codepoints, including +* Python now always supports the full range of Unicode code points, including non-BMP ones (i.e. from ``U+0000`` to ``U+10FFFF``). The distinction between narrow and wide builds no longer exists and Python now behaves like a wide build, even under Windows. 
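The storage claim in the whatsnew text can be observed directly with sys.getsizeof(); the absolute numbers include per-object overhead and vary between CPython builds, so only the growth pattern matters in this sketch (not part of the patch):

import sys

# 1, 2 or 4 bytes per character, depending on the largest code point present
for s in ("abcd", "abc\u00ff", "abc\u1234", "abc\U00010000"):
    print("U+%04X" % ord(max(s)), sys.getsizeof(s))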
@@ -246,7 +246,7 @@ so ``'\U0010FFFF'[0]`` now returns ``'\U0010FFFF'`` and not ``'\uDBFF'``; * all other functions in the standard library now correctly handle - non-BMP codepoints. + non-BMP code points. * The value of :data:`sys.maxunicode` is now always ``1114111`` (``0x10FFFF`` in hexadecimal). The :c:func:`PyUnicode_GetMax` function still returns @@ -258,13 +258,13 @@ Performance and resource usage ------------------------------ -The storage of Unicode strings now depends on the highest codepoint in the string: - -* pure ASCII and Latin1 strings (``U+0000-U+00FF``) use 1 byte per codepoint; - -* BMP strings (``U+0000-U+FFFF``) use 2 bytes per codepoint; - -* non-BMP strings (``U+10000-U+10FFFF``) use 4 bytes per codepoint. +The storage of Unicode strings now depends on the highest code point in the string: + +* pure ASCII and Latin1 strings (``U+0000-U+00FF``) use 1 byte per code point; + +* BMP strings (``U+0000-U+FFFF``) use 2 bytes per code point; + +* non-BMP strings (``U+10000-U+10FFFF``) use 4 bytes per code point. The net effect is that for most applications, memory usage of string storage should decrease significantly - especially compared to former -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 08:41:19 2015 From: python-checkins at python.org (georg.brandl) Date: Wed, 14 Jan 2015 07:41:19 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogQ2xvc2VzICMyMzE4?= =?utf-8?q?1=3A_codepoint_-=3E_code_point?= Message-ID: <20150114074118.72571.59437@psf.io> https://hg.python.org/cpython/rev/e280a04625cc changeset: 94137:e280a04625cc branch: 2.7 parent: 94123:6a19e37ce94d user: Georg Brandl date: Wed Jan 14 08:26:30 2015 +0100 summary: Closes #23181: codepoint -> code point files: Doc/c-api/unicode.rst | 4 ++-- Doc/library/codecs.rst | 12 ++++++------ Doc/library/htmllib.rst | 4 ++-- Doc/library/json.rst | 2 +- Doc/tutorial/interpreter.rst | 2 +- 5 files changed, 12 insertions(+), 12 deletions(-) diff --git a/Doc/c-api/unicode.rst b/Doc/c-api/unicode.rst --- a/Doc/c-api/unicode.rst +++ b/Doc/c-api/unicode.rst @@ -547,7 +547,7 @@ After completion, *\*byteorder* is set to the current byte order at the end of input data. - In a narrow build codepoints outside the BMP will be decoded as surrogate pairs. + In a narrow build code points outside the BMP will be decoded as surrogate pairs. If *byteorder* is *NULL*, the codec starts in native order mode. @@ -580,7 +580,7 @@ mark (U+FEFF). In the other two modes, no BOM mark is prepended. If *Py_UNICODE_WIDE* is not defined, surrogate pairs will be output - as a single codepoint. + as a single code point. Return *NULL* if an exception was raised by the codec. diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -787,7 +787,7 @@ Encodings and Unicode --------------------- -Unicode strings are stored internally as sequences of codepoints (to be precise +Unicode strings are stored internally as sequences of code points (to be precise as :c:type:`Py_UNICODE` arrays). Depending on the way Python is compiled (either via ``--enable-unicode=ucs2`` or ``--enable-unicode=ucs4``, with the former being the default) :c:type:`Py_UNICODE` is either a 16-bit or 32-bit data @@ -796,24 +796,24 @@ unicode object into a sequence of bytes is called encoding and recreating the unicode object from the sequence of bytes is known as decoding. 
There are many different methods for how this transformation can be done (these methods are -also called encodings). The simplest method is to map the codepoints 0-255 to +also called encodings). The simplest method is to map the code points 0-255 to the bytes ``0x0``-``0xff``. This means that a unicode object that contains -codepoints above ``U+00FF`` can't be encoded with this method (which is called +code points above ``U+00FF`` can't be encoded with this method (which is called ``'latin-1'`` or ``'iso-8859-1'``). :func:`unicode.encode` will raise a :exc:`UnicodeEncodeError` that looks like this: ``UnicodeEncodeError: 'latin-1' codec can't encode character u'\u1234' in position 3: ordinal not in range(256)``. There's another group of encodings (the so called charmap encodings) that choose -a different subset of all unicode code points and how these codepoints are +a different subset of all unicode code points and how these code points are mapped to the bytes ``0x0``-``0xff``. To see how this is done simply open e.g. :file:`encodings/cp1252.py` (which is an encoding that is used primarily on Windows). There's a string constant with 256 characters that shows you which character is mapped to which byte value. -All of these encodings can only encode 256 of the 1114112 codepoints +All of these encodings can only encode 256 of the 1114112 code points defined in unicode. A simple and straightforward way that can store each Unicode -code point, is to store each codepoint as four consecutive bytes. There are two +code point, is to store each code point as four consecutive bytes. There are two possibilities: store the bytes in big endian or in little endian order. These two encodings are called ``UTF-32-BE`` and ``UTF-32-LE`` respectively. Their disadvantage is that if e.g. you use ``UTF-32-BE`` on a little endian machine you diff --git a/Doc/library/htmllib.rst b/Doc/library/htmllib.rst --- a/Doc/library/htmllib.rst +++ b/Doc/library/htmllib.rst @@ -185,14 +185,14 @@ .. data:: name2codepoint - A dictionary that maps HTML entity names to the Unicode codepoints. + A dictionary that maps HTML entity names to the Unicode code points. .. versionadded:: 2.3 .. data:: codepoint2name - A dictionary that maps Unicode codepoints to HTML entity names. + A dictionary that maps Unicode code points to HTML entity names. .. versionadded:: 2.3 diff --git a/Doc/library/json.rst b/Doc/library/json.rst --- a/Doc/library/json.rst +++ b/Doc/library/json.rst @@ -533,7 +533,7 @@ that don't correspond to valid Unicode characters (e.g. unpaired UTF-16 surrogates), but it does note that they may cause interoperability problems. By default, this module accepts and outputs (when present in the original -:class:`str`) codepoints for such sequences. +:class:`str`) code points for such sequences. Infinite and NaN Number Values diff --git a/Doc/tutorial/interpreter.rst b/Doc/tutorial/interpreter.rst --- a/Doc/tutorial/interpreter.rst +++ b/Doc/tutorial/interpreter.rst @@ -140,7 +140,7 @@ For example, to write Unicode literals including the Euro currency symbol, the ISO-8859-15 encoding can be used, with the Euro symbol having the ordinal value 164. 
This script, when saved in the ISO-8859-15 encoding, will print the value -8364 (the Unicode codepoint corresponding to the Euro symbol) and then exit:: +8364 (the Unicode code point corresponding to the Euro symbol) and then exit:: # -*- coding: iso-8859-15 -*- -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Wed Jan 14 10:06:38 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 14 Jan 2015 10:06:38 +0100 Subject: [Python-checkins] Daily reference leaks (c47fe739f9dd): sum=12 Message-ID: results for c47fe739f9dd on branch "default" -------------------------------------------- test_asyncio leaked [3, 0, 0] memory blocks, sum=3 test_collections leaked [4, 0, 0] references, sum=4 test_collections leaked [2, 0, 0] memory blocks, sum=2 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflog43SFzV', '-x'] From python-checkins at python.org Wed Jan 14 16:04:35 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 15:04:35 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2322038=2C_configur?= =?utf-8?q?e=3A_HAVE=5FSTD=5FATOMIC_now_also_check_that_=22atomic=5Fint=22?= =?utf-8?q?_and?= Message-ID: <20150114150246.125906.80051@psf.io> https://hg.python.org/cpython/rev/dacc944641b1 changeset: 94138:dacc944641b1 parent: 94136:c917ba25c007 user: Victor Stinner date: Wed Jan 14 16:01:46 2015 +0100 summary: Issue #22038, configure: HAVE_STD_ATOMIC now also check that "atomic_int" and "_Atomic void*" types work. Change needed on FreeBSD 10 where stdatomic.h is available but the compiler fails on "_Atomic void*" with "_Atomic cannot be applied to incomplete type 'void'". files: configure | 3 ++- configure.ac | 6 ++++-- pyconfig.h.in | 2 +- 3 files changed, 7 insertions(+), 4 deletions(-) diff --git a/configure b/configure --- a/configure +++ b/configure @@ -15711,7 +15711,8 @@ #include - _Atomic int value = ATOMIC_VAR_INIT(1); + atomic_int value = ATOMIC_VAR_INIT(1); + _Atomic void *py_atomic_address = (void*) &value; int main() { int loaded_value = atomic_load(&value); return 0; diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -4890,7 +4890,8 @@ [ AC_LANG_SOURCE([[ #include - _Atomic int value = ATOMIC_VAR_INIT(1); + atomic_int value = ATOMIC_VAR_INIT(1); + _Atomic void *py_atomic_address = (void*) &value; int main() { int loaded_value = atomic_load(&value); return 0; @@ -4901,7 +4902,8 @@ AC_MSG_RESULT($have_stdatomic_h) if test "$have_stdatomic_h" = yes; then - AC_DEFINE(HAVE_STD_ATOMIC, 1, [Has stdatomic.h]) + AC_DEFINE(HAVE_STD_ATOMIC, 1, + [Has stdatomic.h, atomic_int and _Atomic void* types work]) fi # Check for GCC >= 4.7 __atomic builtins diff --git a/pyconfig.h.in b/pyconfig.h.in --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -880,7 +880,7 @@ /* Define to 1 if you have the header file. */ #undef HAVE_STDLIB_H -/* Has stdatomic.h */ +/* Has stdatomic.h, atomic_int and _Atomic void* types work */ #undef HAVE_STD_ATOMIC /* Define to 1 if you have the `strdup' function. 
*/ -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 17:03:00 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 16:03:00 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150114160121.8751.38471@psf.io> https://hg.python.org/cpython/rev/697f727fa91f changeset: 94140:697f727fa91f parent: 94138:dacc944641b1 parent: 94139:c9ad45b15919 user: Victor Stinner date: Wed Jan 14 17:00:29 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/selector_events.py | 2 +- Lib/asyncio/sslproto.py | 5 +- Lib/test/test_asyncio/test_selector_events.py | 20 +++- Lib/test/test_asyncio/test_sslproto.py | 45 ++++++++++ 4 files changed, 65 insertions(+), 7 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -750,7 +750,7 @@ self._loop.remove_reader(self._sock_fd) self._loop.remove_writer(self._sock_fd) self._sock.close() - if self._waiter is not None: + if self._waiter is not None and not self._waiter.cancelled(): self._waiter.set_exception(exc) if isinstance(exc, Exception): return diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py --- a/Lib/asyncio/sslproto.py +++ b/Lib/asyncio/sslproto.py @@ -530,10 +530,11 @@ self._in_handshake = False sslobj = self._sslpipe.ssl_object - peercert = None if handshake_exc else sslobj.getpeercert() try: if handshake_exc is not None: raise handshake_exc + + peercert = sslobj.getpeercert() if not hasattr(self._sslcontext, 'check_hostname'): # Verify hostname if requested, Python 3.4+ uses check_hostname # and checks the hostname in do_handshake() @@ -551,7 +552,7 @@ self, exc_info=True) self._transport.close() if isinstance(exc, Exception): - if self._waiter is not None: + if self._waiter is not None and not self._waiter.cancelled(): self._waiter.set_exception(exc) return else: diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -1148,16 +1148,28 @@ self.assertTrue(self.sslsock.close.called) def test_on_handshake_base_exc(self): + waiter = asyncio.Future(loop=self.loop) transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext) - transport._waiter = asyncio.Future(loop=self.loop) + self.loop, self.sock, self.protocol, self.sslcontext, waiter) exc = BaseException() self.sslsock.do_handshake.side_effect = exc with test_utils.disable_logger(): self.assertRaises(BaseException, transport._on_handshake, 0) self.assertTrue(self.sslsock.close.called) - self.assertTrue(transport._waiter.done()) - self.assertIs(exc, transport._waiter.exception()) + self.assertTrue(waiter.done()) + self.assertIs(exc, waiter.exception()) + + def test_cancel_handshake(self): + # Python issue #23197: cancelling an handshake must not raise an + # exception or log an error, even if the handshake failed + waiter = asyncio.Future(loop=self.loop) + transport = _SelectorSslTransport( + self.loop, self.sock, self.protocol, self.sslcontext, waiter) + waiter.cancel() + exc = ValueError() + self.sslsock.do_handshake.side_effect = exc + with test_utils.disable_logger(): + transport._on_handshake(0) def test_pause_resume_reading(self): tr = self._make_one() diff --git a/Lib/test/test_asyncio/test_sslproto.py 
b/Lib/test/test_asyncio/test_sslproto.py new file mode 100644 --- /dev/null +++ b/Lib/test/test_asyncio/test_sslproto.py @@ -0,0 +1,45 @@ +"""Tests for asyncio/sslproto.py.""" + +import unittest +from unittest import mock + +import asyncio +from asyncio import sslproto +from asyncio import test_utils + + +class SslProtoHandshakeTests(test_utils.TestCase): + + def setUp(self): + self.loop = asyncio.new_event_loop() + self.set_event_loop(self.loop) + + def test_cancel_handshake(self): + # Python issue #23197: cancelling an handshake must not raise an + # exception or log an error, even if the handshake failed + sslcontext = test_utils.dummy_ssl_context() + app_proto = asyncio.Protocol() + waiter = asyncio.Future(loop=self.loop) + ssl_proto = sslproto.SSLProtocol(self.loop, app_proto, sslcontext, + waiter) + handshake_fut = asyncio.Future(loop=self.loop) + + def do_handshake(callback): + exc = Exception() + callback(exc) + handshake_fut.set_result(None) + return [] + + waiter.cancel() + transport = mock.Mock() + sslpipe = mock.Mock() + sslpipe.do_handshake.side_effect = do_handshake + with mock.patch('asyncio.sslproto._SSLPipe', return_value=sslpipe): + ssl_proto.connection_made(transport) + + with test_utils.disable_logger(): + self.loop.run_until_complete(handshake_fut) + + +if __name__ == '__main__': + unittest.main() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 17:03:00 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 16:03:00 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMTk3?= =?utf-8?q?=2C_asyncio=3A_On_SSL_handshake_failure=2C_check_if_the_waiter_?= =?utf-8?q?is?= Message-ID: <20150114160120.125886.1047@psf.io> https://hg.python.org/cpython/rev/c9ad45b15919 changeset: 94139:c9ad45b15919 branch: 3.4 parent: 94133:1eae3b6fbec6 user: Victor Stinner date: Wed Jan 14 16:56:20 2015 +0100 summary: Issue #23197, asyncio: On SSL handshake failure, check if the waiter is cancelled before setting its exception. * Add unit tests for this case. 
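The heart of the fix is a guard that can be stated in isolation: a future that has already been cancelled must not receive set_exception(), otherwise asyncio raises InvalidStateError. A standalone sketch of the guarded pattern (not the sslproto code; ConnectionResetError is just a stand-in exception):

import asyncio

async def main():
    waiter = asyncio.get_running_loop().create_future()
    waiter.cancel()                               # e.g. the connect coroutine was cancelled
    handshake_exc = ConnectionResetError("handshake failed")
    if not waiter.cancelled():                    # the check the patch adds
        waiter.set_exception(handshake_exc)       # skipped here; would raise InvalidStateError otherwise
    print("waiter cancelled:", waiter.cancelled())

asyncio.run(main())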
* Cleanup also sslproto.py files: Lib/asyncio/selector_events.py | 2 +- Lib/asyncio/sslproto.py | 5 +- Lib/test/test_asyncio/test_selector_events.py | 20 +++- Lib/test/test_asyncio/test_sslproto.py | 45 ++++++++++ 4 files changed, 65 insertions(+), 7 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -750,7 +750,7 @@ self._loop.remove_reader(self._sock_fd) self._loop.remove_writer(self._sock_fd) self._sock.close() - if self._waiter is not None: + if self._waiter is not None and not self._waiter.cancelled(): self._waiter.set_exception(exc) if isinstance(exc, Exception): return diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py --- a/Lib/asyncio/sslproto.py +++ b/Lib/asyncio/sslproto.py @@ -530,10 +530,11 @@ self._in_handshake = False sslobj = self._sslpipe.ssl_object - peercert = None if handshake_exc else sslobj.getpeercert() try: if handshake_exc is not None: raise handshake_exc + + peercert = sslobj.getpeercert() if not hasattr(self._sslcontext, 'check_hostname'): # Verify hostname if requested, Python 3.4+ uses check_hostname # and checks the hostname in do_handshake() @@ -551,7 +552,7 @@ self, exc_info=True) self._transport.close() if isinstance(exc, Exception): - if self._waiter is not None: + if self._waiter is not None and not self._waiter.cancelled(): self._waiter.set_exception(exc) return else: diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -1148,16 +1148,28 @@ self.assertTrue(self.sslsock.close.called) def test_on_handshake_base_exc(self): + waiter = asyncio.Future(loop=self.loop) transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext) - transport._waiter = asyncio.Future(loop=self.loop) + self.loop, self.sock, self.protocol, self.sslcontext, waiter) exc = BaseException() self.sslsock.do_handshake.side_effect = exc with test_utils.disable_logger(): self.assertRaises(BaseException, transport._on_handshake, 0) self.assertTrue(self.sslsock.close.called) - self.assertTrue(transport._waiter.done()) - self.assertIs(exc, transport._waiter.exception()) + self.assertTrue(waiter.done()) + self.assertIs(exc, waiter.exception()) + + def test_cancel_handshake(self): + # Python issue #23197: cancelling an handshake must not raise an + # exception or log an error, even if the handshake failed + waiter = asyncio.Future(loop=self.loop) + transport = _SelectorSslTransport( + self.loop, self.sock, self.protocol, self.sslcontext, waiter) + waiter.cancel() + exc = ValueError() + self.sslsock.do_handshake.side_effect = exc + with test_utils.disable_logger(): + transport._on_handshake(0) def test_pause_resume_reading(self): tr = self._make_one() diff --git a/Lib/test/test_asyncio/test_sslproto.py b/Lib/test/test_asyncio/test_sslproto.py new file mode 100644 --- /dev/null +++ b/Lib/test/test_asyncio/test_sslproto.py @@ -0,0 +1,45 @@ +"""Tests for asyncio/sslproto.py.""" + +import unittest +from unittest import mock + +import asyncio +from asyncio import sslproto +from asyncio import test_utils + + +class SslProtoHandshakeTests(test_utils.TestCase): + + def setUp(self): + self.loop = asyncio.new_event_loop() + self.set_event_loop(self.loop) + + def test_cancel_handshake(self): + # Python issue #23197: cancelling an handshake must not raise an + # exception or log an error, even if 
the handshake failed + sslcontext = test_utils.dummy_ssl_context() + app_proto = asyncio.Protocol() + waiter = asyncio.Future(loop=self.loop) + ssl_proto = sslproto.SSLProtocol(self.loop, app_proto, sslcontext, + waiter) + handshake_fut = asyncio.Future(loop=self.loop) + + def do_handshake(callback): + exc = Exception() + callback(exc) + handshake_fut.set_result(None) + return [] + + waiter.cancel() + transport = mock.Mock() + sslpipe = mock.Mock() + sslpipe.do_handshake.side_effect = do_handshake + with mock.patch('asyncio.sslproto._SSLPipe', return_value=sslpipe): + ssl_proto.connection_made(transport) + + with test_utils.disable_logger(): + self.loop.run_until_complete(handshake_fut) + + +if __name__ == '__main__': + unittest.main() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 17:11:06 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 16:11:06 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Closes_=2323234=3A_Refacto?= =?utf-8?q?r_subprocess?= Message-ID: <20150114160850.8739.26565@psf.io> https://hg.python.org/cpython/rev/0c5ae257966f changeset: 94141:0c5ae257966f user: Victor Stinner date: Wed Jan 14 17:07:59 2015 +0100 summary: Closes #23234: Refactor subprocess Use new OSError exceptions, factorize stdin.write() code. files: Lib/subprocess.py | 58 ++++++++++++++-------------------- 1 files changed, 24 insertions(+), 34 deletions(-) diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -920,6 +920,22 @@ self._devnull = os.open(os.devnull, os.O_RDWR) return self._devnull + def _stdin_write(self, input): + if input: + try: + self.stdin.write(input) + except BrokenPipeError: + # communicate() must ignore broken pipe error + pass + except OSError as e: + if e.errno == errno.EINVAL and self.poll() is not None: + # Issue #19612: On Windows, stdin.write() fails with EINVAL + # if the process already exited before the write + pass + else: + raise + self.stdin.close() + def communicate(self, input=None, timeout=None): """Interact with process: Send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for @@ -945,13 +961,7 @@ stdout = None stderr = None if self.stdin: - if input: - try: - self.stdin.write(input) - except OSError as e: - if e.errno != errno.EPIPE and e.errno != errno.EINVAL: - raise - self.stdin.close() + self._stdin_write(input) elif self.stdout: stdout = _eintr_retry_call(self.stdout.read) self.stdout.close() @@ -1200,21 +1210,7 @@ self.stderr_thread.start() if self.stdin: - if input is not None: - try: - self.stdin.write(input) - except OSError as e: - if e.errno == errno.EPIPE: - # communicate() should ignore pipe full error - pass - elif (e.errno == errno.EINVAL - and self.poll() is not None): - # Issue #19612: stdin.write() fails with EINVAL - # if the process already exited before the write - pass - else: - raise - self.stdin.close() + self._stdin_write(input) # Wait for the reader threads, or time out. 
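The errno cleanup above leans on the PEP 3151 exception hierarchy: BrokenPipeError and ChildProcessError are OSError subclasses that CPython selects from errno automatically, so catching them is equivalent to the old errno.EPIPE and errno.ECHILD tests. A short demonstration (not from the patch):

import errno

try:
    raise OSError(errno.EPIPE, "Broken pipe")
except BrokenPipeError as exc:                 # OSError maps EPIPE to this subclass
    print(type(exc).__name__, exc.errno == errno.EPIPE)

try:
    raise OSError(errno.ECHILD, "No child processes")
except ChildProcessError as exc:               # and ECHILD to this one
    print(type(exc).__name__, exc.errno == errno.ECHILD)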
If we time out, the # threads remain reading and the fds left open in case the user @@ -1425,9 +1421,8 @@ if errpipe_data: try: _eintr_retry_call(os.waitpid, self.pid, 0) - except OSError as e: - if e.errno != errno.ECHILD: - raise + except ChildProcessError: + pass try: exception_name, hex_errno, err_msg = ( errpipe_data.split(b':', 2)) @@ -1511,9 +1506,7 @@ """All callers to this function MUST hold self._waitpid_lock.""" try: (pid, sts) = _eintr_retry_call(os.waitpid, self.pid, wait_flags) - except OSError as e: - if e.errno != errno.ECHILD: - raise + except ChildProcessError: # This happens if SIGCLD is set to be ignored or waiting # for child processes has otherwise been disabled for our # process. This child is dead, we can't get the status. @@ -1625,12 +1618,9 @@ self._input_offset + _PIPE_BUF] try: self._input_offset += os.write(key.fd, chunk) - except OSError as e: - if e.errno == errno.EPIPE: - selector.unregister(key.fileobj) - key.fileobj.close() - else: - raise + except BrokenPipeError: + selector.unregister(key.fileobj) + key.fileobj.close() else: if self._input_offset >= len(self._input): selector.unregister(key.fileobj) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 17:42:54 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 16:42:54 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMTk3?= =?utf-8?q?=3A_On_SSL_handshake_failure_on_matching_hostname=2C_check_if_t?= =?utf-8?q?he?= Message-ID: <20150114161446.11575.3775@psf.io> https://hg.python.org/cpython/rev/42f4dfc6c6a9 changeset: 94142:42f4dfc6c6a9 branch: 3.4 parent: 94139:c9ad45b15919 user: Victor Stinner date: Wed Jan 14 17:13:28 2015 +0100 summary: Issue #23197: On SSL handshake failure on matching hostname, check if the waiter is cancelled before setting its exception. 
files: Lib/asyncio/selector_events.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -774,7 +774,8 @@ "on matching the hostname", self, exc_info=True) self._sock.close() - if self._waiter is not None: + if (self._waiter is not None + and not self._waiter.cancelled()): self._waiter.set_exception(exc) return -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 17:42:54 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 16:42:54 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150114161446.72569.93397@psf.io> https://hg.python.org/cpython/rev/c00be209bccf changeset: 94143:c00be209bccf parent: 94141:0c5ae257966f parent: 94142:42f4dfc6c6a9 user: Victor Stinner date: Wed Jan 14 17:13:43 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/selector_events.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -774,7 +774,8 @@ "on matching the hostname", self, exc_info=True) self._sock.close() - if self._waiter is not None: + if (self._waiter is not None + and not self._waiter.cancelled()): self._waiter.set_exception(exc) return -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Wed Jan 14 22:06:04 2015 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 14 Jan 2015 21:06:04 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_add_a_rss_target?= Message-ID: <20150114210545.125890.92216@psf.io> https://hg.python.org/peps/rev/04d75fef530e changeset: 5671:04d75fef530e user: Benjamin Peterson date: Wed Jan 14 16:05:43 2015 -0500 summary: add a rss target files: Makefile | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Makefile b/Makefile --- a/Makefile +++ b/Makefile @@ -21,6 +21,9 @@ pep-0000.txt: $(wildcard pep-????.txt) $(wildcard pep0/*.py) $(PYTHON) genpepindex.py . +rss: + $(PYTHON) pep2rss.py . + install: echo "Installing is not necessary anymore. It will be done in post-commit." -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Thu Jan 15 00:06:37 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 23:06:37 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogc3lu?= =?utf-8?q?c_with_Tulip?= Message-ID: <20150114230630.22407.8324@psf.io> https://hg.python.org/cpython/rev/463bbd862887 changeset: 94144:463bbd862887 branch: 3.4 parent: 94142:42f4dfc6c6a9 user: Victor Stinner date: Thu Jan 15 00:04:21 2015 +0100 summary: asyncio: sync with Tulip * PipeHandle now uses None instead of -1 for a closed handle * Sort imports in windows_utils. * Fix test_events on Python older than 3.5. Skip SSL tests on the ProactorEventLoop if ssl.MemoryIO is missing * Fix BaseEventLoop._create_connection_transport(). Close the transport if the creation of the transport (if the waiter) gets an exception. * _ProactorBasePipeTransport now sets _sock to None when the transport is closed. * Fix BaseSubprocessTransport.close(). Ignore pipes for which the protocol is not set yet (still equal to None). * TestLoop.close() now calls the close() method of the parent class (BaseEventLoop). 
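The first bullet above, replacing the -1 sentinel with None for a closed PipeHandle, boils down to a small convention that also simplifies the repr. A standalone sketch of that convention (Handle is a hypothetical stand-in class, not the real windows_utils.PipeHandle):

class Handle:
    def __init__(self, raw):
        self._raw = raw
    def __repr__(self):
        state = 'handle=%r' % self._raw if self._raw is not None else 'closed'
        return '<%s %s>' % (type(self).__name__, state)
    def close(self):
        if self._raw is not None:      # a second close() is a harmless no-op
            self._raw = None

h = Handle(42)
print(h)           # <Handle handle=42>
h.close()
h.close()
print(h)           # <Handle closed>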
* Cleanup BaseSelectorEventLoop: create the protocol on a separated line for readability and ease debugging. * Fix BaseSubprocessTransport._kill_wait(). Set the _returncode attribute, so close() doesn't try to terminate the process. * Tests: explicitly close event loops and transports * UNIX pipe transports: add closed/closing in repr(). Add "closed" or "closing" state in the __repr__() method of _UnixReadPipeTransport and _UnixWritePipeTransport classes. files: Lib/asyncio/base_events.py | 7 ++- Lib/asyncio/base_subprocess.py | 4 +- Lib/asyncio/proactor_events.py | 1 + Lib/asyncio/selector_events.py | 5 +- Lib/asyncio/test_utils.py | 1 + Lib/asyncio/unix_events.py | 14 +++++- Lib/asyncio/windows_utils.py | 10 ++-- Lib/test/test_asyncio/test_base_events.py | 1 + Lib/test/test_asyncio/test_events.py | 22 ++++++++++ Lib/test/test_asyncio/test_futures.py | 1 + Lib/test/test_asyncio/test_selector_events.py | 1 + Lib/test/test_asyncio/test_unix_events.py | 1 + 12 files changed, 57 insertions(+), 11 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -634,7 +634,12 @@ else: transport = self._make_socket_transport(sock, protocol, waiter) - yield from waiter + try: + yield from waiter + except Exception as exc: + transport.close() + raise + return transport, protocol @coroutine diff --git a/Lib/asyncio/base_subprocess.py b/Lib/asyncio/base_subprocess.py --- a/Lib/asyncio/base_subprocess.py +++ b/Lib/asyncio/base_subprocess.py @@ -71,6 +71,8 @@ def close(self): for proto in self._pipes.values(): + if proto is None: + continue proto.pipe.close() if self._returncode is None: self.terminate() @@ -119,7 +121,7 @@ proc.kill() except ProcessLookupError: pass - proc.wait() + self._returncode = proc.wait() @coroutine def _post_init(self): diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -111,6 +111,7 @@ if hasattr(self._sock, 'shutdown'): self._sock.shutdown(socket.SHUT_RDWR) self._sock.close() + self._sock = None server = self._server if server is not None: server._detach() diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -182,13 +182,14 @@ else: raise # The event loop will catch, log and ignore it. else: + protocol = protocol_factory() if sslcontext: self._make_ssl_transport( - conn, protocol_factory(), sslcontext, + conn, protocol, sslcontext, server_side=True, extra={'peername': addr}, server=server) else: self._make_socket_transport( - conn, protocol_factory(), extra={'peername': addr}, + conn, protocol , extra={'peername': addr}, server=server) # It's now up to the protocol to handle the connection. 
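The base_events.py hunk above follows one rule: if awaiting the transport's waiter raises, close the half-initialized transport instead of leaking it. A reduced, runnable sketch of the same pattern (DummyTransport and finish_creation() are illustrative names, not the real code):

import asyncio

class DummyTransport:
    closed = False
    def close(self):
        self.closed = True

async def finish_creation(transport, waiter):
    try:
        await waiter
    except Exception:
        transport.close()              # don't leak the transport on failure
        raise
    return transport

async def main():
    loop = asyncio.get_running_loop()
    tr = DummyTransport()
    waiter = loop.create_future()
    waiter.set_exception(RuntimeError("handshake failed"))
    try:
        await finish_creation(tr, waiter)
    except RuntimeError:
        pass
    print("transport closed:", tr.closed)   # True

asyncio.run(main())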
diff --git a/Lib/asyncio/test_utils.py b/Lib/asyncio/test_utils.py --- a/Lib/asyncio/test_utils.py +++ b/Lib/asyncio/test_utils.py @@ -307,6 +307,7 @@ self._time += advance def close(self): + super().close() if self._check_on_close: try: self._gen.send(0) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -301,7 +301,12 @@ self._loop.call_soon(waiter._set_result_unless_cancelled, None) def __repr__(self): - info = [self.__class__.__name__, 'fd=%s' % self._fileno] + info = [self.__class__.__name__] + if self._pipe is None: + info.append('closed') + elif self._closing: + info.append('closing') + info.append('fd=%s' % self._fileno) if self._pipe is not None: polling = selector_events._test_selector_event( self._loop._selector, @@ -404,7 +409,12 @@ self._loop.call_soon(waiter._set_result_unless_cancelled, None) def __repr__(self): - info = [self.__class__.__name__, 'fd=%s' % self._fileno] + info = [self.__class__.__name__] + if self._pipe is None: + info.append('closed') + elif self._closing: + info.append('closing') + info.append('fd=%s' % self._fileno) if self._pipe is not None: polling = selector_events._test_selector_event( self._loop._selector, diff --git a/Lib/asyncio/windows_utils.py b/Lib/asyncio/windows_utils.py --- a/Lib/asyncio/windows_utils.py +++ b/Lib/asyncio/windows_utils.py @@ -7,13 +7,13 @@ if sys.platform != 'win32': # pragma: no cover raise ImportError('win32 only') -import socket +import _winapi import itertools import msvcrt import os +import socket import subprocess import tempfile -import _winapi __all__ = ['socketpair', 'pipe', 'Popen', 'PIPE', 'PipeHandle'] @@ -136,7 +136,7 @@ self._handle = handle def __repr__(self): - if self._handle != -1: + if self._handle is not None: handle = 'handle=%r' % self._handle else: handle = 'closed' @@ -150,9 +150,9 @@ return self._handle def close(self, *, CloseHandle=_winapi.CloseHandle): - if self._handle != -1: + if self._handle is not None: CloseHandle(self._handle) - self._handle = -1 + self._handle = None __del__ = close diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -409,6 +409,7 @@ def test_run_until_complete_loop(self): task = asyncio.Future(loop=self.loop) other_loop = self.new_test_loop() + self.addCleanup(other_loop.close) self.assertRaises(ValueError, other_loop.run_until_complete, task) diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -25,6 +25,7 @@ import asyncio from asyncio import proactor_events from asyncio import selector_events +from asyncio import sslproto from asyncio import test_utils try: from test import support @@ -1585,6 +1586,7 @@ self.assertTrue(all(f.done() for f in proto.disconnects.values())) self.assertEqual(proto.data[1].rstrip(b'\r\n'), b'Python') self.assertEqual(proto.data[2], b'') + transp.close() def test_subprocess_exitcode(self): connect = self.loop.subprocess_shell( @@ -1594,6 +1596,7 @@ self.assertIsInstance(proto, MySubprocessProtocol) self.loop.run_until_complete(proto.completed) self.assertEqual(7, proto.returncode) + transp.close() def test_subprocess_close_after_finish(self): connect = self.loop.subprocess_shell( @@ -1621,6 +1624,7 @@ transp.kill() self.loop.run_until_complete(proto.completed) 
self.check_killed(proto.returncode) + transp.close() def test_subprocess_terminate(self): prog = os.path.join(os.path.dirname(__file__), 'echo.py') @@ -1635,6 +1639,7 @@ transp.terminate() self.loop.run_until_complete(proto.completed) self.check_terminated(proto.returncode) + transp.close() @unittest.skipIf(sys.platform == 'win32', "Don't have SIGHUP") def test_subprocess_send_signal(self): @@ -1650,6 +1655,7 @@ transp.send_signal(signal.SIGHUP) self.loop.run_until_complete(proto.completed) self.assertEqual(-signal.SIGHUP, proto.returncode) + transp.close() def test_subprocess_stderr(self): prog = os.path.join(os.path.dirname(__file__), 'echo2.py') @@ -1784,6 +1790,22 @@ def create_event_loop(self): return asyncio.ProactorEventLoop() + if not sslproto._is_sslproto_available(): + def test_create_ssl_connection(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + + def test_create_server_ssl(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + + def test_create_server_ssl_verify_failed(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + + def test_create_server_ssl_match_failed(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + + def test_create_server_ssl_verified(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + def test_legacy_create_ssl_connection(self): raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -29,6 +29,7 @@ def setUp(self): self.loop = self.new_test_loop() + self.addCleanup(self.loop.close) def test_initial_state(self): f = asyncio.Future(loop=self.loop) diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -1744,6 +1744,7 @@ test_utils.MockPattern( 'Fatal error on transport\nprotocol:.*\ntransport:.*'), exc_info=(ConnectionRefusedError, MOCK_ANY, MOCK_ANY)) + transport.close() if __name__ == '__main__': diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -598,6 +598,7 @@ # This is a bit overspecified. 
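A recurring theme in these test changes is resource hygiene: helper loops and transports are always closed, either explicitly or through addCleanup(), so ResourceWarnings do not leak between tests. A tiny sketch of the addCleanup() idiom (not one of the real asyncio tests):

import asyncio
import unittest

class LoopCleanupExample(unittest.TestCase):
    def test_extra_loop_is_always_closed(self):
        other_loop = asyncio.new_event_loop()
        self.addCleanup(other_loop.close)     # runs even if the assertions fail
        self.assertFalse(other_loop.is_closed())

if __name__ == "__main__":
    unittest.main()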
:-( m_log.warning.assert_called_with( 'pipe closed by peer or os.write(pipe, data) raised exception.') + tr.close() @mock.patch('os.write') def test_write_close(self, m_write): -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 00:06:37 2015 From: python-checkins at python.org (victor.stinner) Date: Wed, 14 Jan 2015 23:06:37 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150114230630.11561.72451@psf.io> https://hg.python.org/cpython/rev/61a045ac0006 changeset: 94145:61a045ac0006 parent: 94143:c00be209bccf parent: 94144:463bbd862887 user: Victor Stinner date: Thu Jan 15 00:05:18 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/base_events.py | 7 ++- Lib/asyncio/base_subprocess.py | 4 +- Lib/asyncio/proactor_events.py | 1 + Lib/asyncio/selector_events.py | 5 +- Lib/asyncio/test_utils.py | 1 + Lib/asyncio/unix_events.py | 14 +++++- Lib/asyncio/windows_utils.py | 10 ++-- Lib/test/test_asyncio/test_base_events.py | 1 + Lib/test/test_asyncio/test_events.py | 22 ++++++++++ Lib/test/test_asyncio/test_futures.py | 1 + Lib/test/test_asyncio/test_selector_events.py | 1 + Lib/test/test_asyncio/test_unix_events.py | 1 + 12 files changed, 57 insertions(+), 11 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -634,7 +634,12 @@ else: transport = self._make_socket_transport(sock, protocol, waiter) - yield from waiter + try: + yield from waiter + except Exception as exc: + transport.close() + raise + return transport, protocol @coroutine diff --git a/Lib/asyncio/base_subprocess.py b/Lib/asyncio/base_subprocess.py --- a/Lib/asyncio/base_subprocess.py +++ b/Lib/asyncio/base_subprocess.py @@ -71,6 +71,8 @@ def close(self): for proto in self._pipes.values(): + if proto is None: + continue proto.pipe.close() if self._returncode is None: self.terminate() @@ -119,7 +121,7 @@ proc.kill() except ProcessLookupError: pass - proc.wait() + self._returncode = proc.wait() @coroutine def _post_init(self): diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -111,6 +111,7 @@ if hasattr(self._sock, 'shutdown'): self._sock.shutdown(socket.SHUT_RDWR) self._sock.close() + self._sock = None server = self._server if server is not None: server._detach() diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -182,13 +182,14 @@ else: raise # The event loop will catch, log and ignore it. else: + protocol = protocol_factory() if sslcontext: self._make_ssl_transport( - conn, protocol_factory(), sslcontext, + conn, protocol, sslcontext, server_side=True, extra={'peername': addr}, server=server) else: self._make_socket_transport( - conn, protocol_factory(), extra={'peername': addr}, + conn, protocol , extra={'peername': addr}, server=server) # It's now up to the protocol to handle the connection. 
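The base_events.py hunk in this merge is the heart of the change: if waiting for the new transport to become ready fails, the transport is now closed before the exception propagates, so a failed connection attempt no longer leaks an open socket. A minimal sketch of that close-on-failure pattern, in the pre-3.5 yield-from coroutine style used throughout this code base (asyncio.coroutine has since been removed from newer Python versions; make_transport() and the waiter future are illustrative stand-ins, not asyncio's real private API):

    import asyncio

    @asyncio.coroutine
    def create_transport(make_transport):
        # make_transport() stands in for the loop's transport factory; it
        # returns (transport, waiter), where the waiter future completes
        # once protocol.connection_made() has been called.
        transport, waiter = make_transport()
        try:
            yield from waiter
        except Exception:
            # Without this close(), an error raised while waiting would
            # leave the transport (and its socket) open forever.
            transport.close()
            raise
        return transport

The same close-the-resource-on-error idea recurs in the subprocess and SSL changes later in this batch.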
diff --git a/Lib/asyncio/test_utils.py b/Lib/asyncio/test_utils.py --- a/Lib/asyncio/test_utils.py +++ b/Lib/asyncio/test_utils.py @@ -307,6 +307,7 @@ self._time += advance def close(self): + super().close() if self._check_on_close: try: self._gen.send(0) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -301,7 +301,12 @@ self._loop.call_soon(waiter._set_result_unless_cancelled, None) def __repr__(self): - info = [self.__class__.__name__, 'fd=%s' % self._fileno] + info = [self.__class__.__name__] + if self._pipe is None: + info.append('closed') + elif self._closing: + info.append('closing') + info.append('fd=%s' % self._fileno) if self._pipe is not None: polling = selector_events._test_selector_event( self._loop._selector, @@ -404,7 +409,12 @@ self._loop.call_soon(waiter._set_result_unless_cancelled, None) def __repr__(self): - info = [self.__class__.__name__, 'fd=%s' % self._fileno] + info = [self.__class__.__name__] + if self._pipe is None: + info.append('closed') + elif self._closing: + info.append('closing') + info.append('fd=%s' % self._fileno) if self._pipe is not None: polling = selector_events._test_selector_event( self._loop._selector, diff --git a/Lib/asyncio/windows_utils.py b/Lib/asyncio/windows_utils.py --- a/Lib/asyncio/windows_utils.py +++ b/Lib/asyncio/windows_utils.py @@ -7,13 +7,13 @@ if sys.platform != 'win32': # pragma: no cover raise ImportError('win32 only') -import socket +import _winapi import itertools import msvcrt import os +import socket import subprocess import tempfile -import _winapi __all__ = ['socketpair', 'pipe', 'Popen', 'PIPE', 'PipeHandle'] @@ -136,7 +136,7 @@ self._handle = handle def __repr__(self): - if self._handle != -1: + if self._handle is not None: handle = 'handle=%r' % self._handle else: handle = 'closed' @@ -150,9 +150,9 @@ return self._handle def close(self, *, CloseHandle=_winapi.CloseHandle): - if self._handle != -1: + if self._handle is not None: CloseHandle(self._handle) - self._handle = -1 + self._handle = None __del__ = close diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -409,6 +409,7 @@ def test_run_until_complete_loop(self): task = asyncio.Future(loop=self.loop) other_loop = self.new_test_loop() + self.addCleanup(other_loop.close) self.assertRaises(ValueError, other_loop.run_until_complete, task) diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -25,6 +25,7 @@ import asyncio from asyncio import proactor_events from asyncio import selector_events +from asyncio import sslproto from asyncio import test_utils try: from test import support @@ -1585,6 +1586,7 @@ self.assertTrue(all(f.done() for f in proto.disconnects.values())) self.assertEqual(proto.data[1].rstrip(b'\r\n'), b'Python') self.assertEqual(proto.data[2], b'') + transp.close() def test_subprocess_exitcode(self): connect = self.loop.subprocess_shell( @@ -1594,6 +1596,7 @@ self.assertIsInstance(proto, MySubprocessProtocol) self.loop.run_until_complete(proto.completed) self.assertEqual(7, proto.returncode) + transp.close() def test_subprocess_close_after_finish(self): connect = self.loop.subprocess_shell( @@ -1621,6 +1624,7 @@ transp.kill() self.loop.run_until_complete(proto.completed) 
self.check_killed(proto.returncode) + transp.close() def test_subprocess_terminate(self): prog = os.path.join(os.path.dirname(__file__), 'echo.py') @@ -1635,6 +1639,7 @@ transp.terminate() self.loop.run_until_complete(proto.completed) self.check_terminated(proto.returncode) + transp.close() @unittest.skipIf(sys.platform == 'win32', "Don't have SIGHUP") def test_subprocess_send_signal(self): @@ -1650,6 +1655,7 @@ transp.send_signal(signal.SIGHUP) self.loop.run_until_complete(proto.completed) self.assertEqual(-signal.SIGHUP, proto.returncode) + transp.close() def test_subprocess_stderr(self): prog = os.path.join(os.path.dirname(__file__), 'echo2.py') @@ -1784,6 +1790,22 @@ def create_event_loop(self): return asyncio.ProactorEventLoop() + if not sslproto._is_sslproto_available(): + def test_create_ssl_connection(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + + def test_create_server_ssl(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + + def test_create_server_ssl_verify_failed(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + + def test_create_server_ssl_match_failed(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + + def test_create_server_ssl_verified(self): + raise unittest.SkipTest("need python 3.5 (ssl.MemoryBIO)") + def test_legacy_create_ssl_connection(self): raise unittest.SkipTest("IocpEventLoop incompatible with legacy SSL") diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -29,6 +29,7 @@ def setUp(self): self.loop = self.new_test_loop() + self.addCleanup(self.loop.close) def test_initial_state(self): f = asyncio.Future(loop=self.loop) diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -1744,6 +1744,7 @@ test_utils.MockPattern( 'Fatal error on transport\nprotocol:.*\ntransport:.*'), exc_info=(ConnectionRefusedError, MOCK_ANY, MOCK_ANY)) + transport.close() if __name__ == '__main__': diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -598,6 +598,7 @@ # This is a bit overspecified. :-( m_log.warning.assert_called_with( 'pipe closed by peer or os.write(pipe, data) raised exception.') + tr.close() @mock.patch('os.write') def test_write_close(self, m_write): -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 05:05:10 2015 From: python-checkins at python.org (larry.hastings) Date: Thu, 15 Jan 2015 04:05:10 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_Added_schedule_for_3=2E4=2E3_?= =?utf-8?q?rc1_and_final=2E__Revised_schedule_for_3=2E5=2E0a1=2E?= Message-ID: <20150115040500.8767.49747@psf.io> https://hg.python.org/peps/rev/990461fae87a changeset: 5672:990461fae87a user: Larry Hastings date: Wed Jan 14 23:04:57 2015 -0500 summary: Added schedule for 3.4.3 rc1 and final. Revised schedule for 3.5.0a1. 
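A side note on the asyncio test changes collected above: they all follow one rule, namely that every event loop and every transport a test creates must be closed explicitly, either inline (transp.close(), tr.close()) or through a unittest cleanup hook, so the test suite stops leaking file descriptors. A small self-contained example of the addCleanup() idiom used there (the test case itself is invented for illustration):

    import asyncio
    import unittest

    class LoopCleanupTest(unittest.TestCase):
        def setUp(self):
            self.loop = asyncio.new_event_loop()
            # Runs even if the test fails, so the loop is never leaked.
            self.addCleanup(self.loop.close)

        def test_call_soon(self):
            fut = asyncio.Future(loop=self.loop)
            self.loop.call_soon(fut.set_result, 'done')
            self.assertEqual(self.loop.run_until_complete(fut), 'done')

    if __name__ == '__main__':
        unittest.main()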
files: pep-0429.txt | 6 ++++++ pep-0478.txt | 2 +- 2 files changed, 7 insertions(+), 1 deletions(-) diff --git a/pep-0429.txt b/pep-0429.txt --- a/pep-0429.txt +++ b/pep-0429.txt @@ -62,6 +62,12 @@ - 3.4.2 candidate 1: September 22, 2014 - 3.4.2 final: October 6, 2014 +3.4.3 schedule +-------------- + +- 3.4.3 candidate 1: February 8, 2015 +- 3.4.3 final: February 22, 2015 + Features for 3.4 diff --git a/pep-0478.txt b/pep-0478.txt --- a/pep-0478.txt +++ b/pep-0478.txt @@ -36,7 +36,7 @@ The releases: -- 3.5.0 alpha 1: February 1, 2015 +- 3.5.0 alpha 1: February 8, 2015 - 3.5.0 alpha 2: March 8, 2015 - 3.5.0 alpha 3: March 28, 2015 - 3.5.0 alpha 4: April 19, 2015 -- Repository URL: https://hg.python.org/peps From python-checkins at python.org Thu Jan 15 06:00:54 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 15 Jan 2015 05:00:54 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?b?KTogbWVyZ2UgMy40ICgjMjMwNjMp?= Message-ID: <20150115050026.72567.9075@psf.io> https://hg.python.org/cpython/rev/731a36c13629 changeset: 94148:731a36c13629 parent: 94145:61a045ac0006 parent: 94146:db09d760b965 user: Benjamin Peterson date: Thu Jan 15 00:00:16 2015 -0500 summary: merge 3.4 (#23063) files: Lib/distutils/command/check.py | 8 ++-- Lib/distutils/tests/test_check.py | 31 +++++++++++++++++++ Misc/NEWS | 3 + 3 files changed, 38 insertions(+), 4 deletions(-) diff --git a/Lib/distutils/command/check.py b/Lib/distutils/command/check.py --- a/Lib/distutils/command/check.py +++ b/Lib/distutils/command/check.py @@ -122,7 +122,7 @@ """Returns warnings when the provided data doesn't compile.""" source_path = StringIO() parser = Parser() - settings = frontend.OptionParser().get_default_values() + settings = frontend.OptionParser(components=(Parser,)).get_default_values() settings.tab_width = 4 settings.pep_references = None settings.rfc_references = None @@ -138,8 +138,8 @@ document.note_source(source_path, -1) try: parser.parse(data, document) - except AttributeError: - reporter.messages.append((-1, 'Could not finish the parsing.', - '', {})) + except AttributeError as e: + reporter.messages.append( + (-1, 'Could not finish the parsing: %s.' % e, '', {})) return reporter.messages diff --git a/Lib/distutils/tests/test_check.py b/Lib/distutils/tests/test_check.py --- a/Lib/distutils/tests/test_check.py +++ b/Lib/distutils/tests/test_check.py @@ -1,4 +1,5 @@ """Tests for distutils.command.check.""" +import textwrap import unittest from test.support import run_unittest @@ -92,6 +93,36 @@ cmd = self._run(metadata, strict=1, restructuredtext=1) self.assertEqual(cmd._warnings, 0) + @unittest.skipUnless(HAS_DOCUTILS, "won't test without docutils") + def test_check_restructuredtext_with_syntax_highlight(self): + # Don't fail if there is a `code` or `code-block` directive + + example_rst_docs = [] + example_rst_docs.append(textwrap.dedent("""\ + Here's some code: + + .. code:: python + + def foo(): + pass + """)) + example_rst_docs.append(textwrap.dedent("""\ + Here's some code: + + .. 
code-block:: python + + def foo(): + pass + """)) + + for rest_with_code in example_rst_docs: + pkg_info, dist = self.create_dist(long_description=rest_with_code) + cmd = check(dist) + cmd.check_restructuredtext() + self.assertEqual(cmd._warnings, 0) + msgs = cmd._check_rst_data(rest_with_code) + self.assertEqual(len(msgs), 0) + def test_check_all(self): metadata = {'url': 'xxx', 'author': 'xxx'} diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -203,6 +203,9 @@ Library ------- +- Issue #23063: In the disutils' check command, fix parsing of reST with code or + code-block directives. + - Issue #23209, #23225: selectors.BaseSelector.get_key() now raises a RuntimeError if the selector is closed. And selectors.BaseSelector.close() now clears its internal reference to the selector mapping to break a -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 06:00:54 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 15 Jan 2015 05:00:54 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_fix_parsing_re?= =?utf-8?q?ST_with_code_or_code-block_directives_=28closes_=2323063=29?= Message-ID: <20150115050025.11561.28807@psf.io> https://hg.python.org/cpython/rev/db09d760b965 changeset: 94146:db09d760b965 branch: 3.4 parent: 94144:463bbd862887 user: Benjamin Peterson date: Wed Jan 14 23:56:35 2015 -0500 summary: fix parsing reST with code or code-block directives (closes #23063) Patch by Marc Abramowitz. files: Lib/distutils/command/check.py | 8 ++-- Lib/distutils/tests/test_check.py | 31 +++++++++++++++++++ Misc/NEWS | 3 + 3 files changed, 38 insertions(+), 4 deletions(-) diff --git a/Lib/distutils/command/check.py b/Lib/distutils/command/check.py --- a/Lib/distutils/command/check.py +++ b/Lib/distutils/command/check.py @@ -122,7 +122,7 @@ """Returns warnings when the provided data doesn't compile.""" source_path = StringIO() parser = Parser() - settings = frontend.OptionParser().get_default_values() + settings = frontend.OptionParser(components=(Parser,)).get_default_values() settings.tab_width = 4 settings.pep_references = None settings.rfc_references = None @@ -138,8 +138,8 @@ document.note_source(source_path, -1) try: parser.parse(data, document) - except AttributeError: - reporter.messages.append((-1, 'Could not finish the parsing.', - '', {})) + except AttributeError as e: + reporter.messages.append( + (-1, 'Could not finish the parsing: %s.' % e, '', {})) return reporter.messages diff --git a/Lib/distutils/tests/test_check.py b/Lib/distutils/tests/test_check.py --- a/Lib/distutils/tests/test_check.py +++ b/Lib/distutils/tests/test_check.py @@ -1,4 +1,5 @@ """Tests for distutils.command.check.""" +import textwrap import unittest from test.support import run_unittest @@ -92,6 +93,36 @@ cmd = self._run(metadata, strict=1, restructuredtext=1) self.assertEqual(cmd._warnings, 0) + @unittest.skipUnless(HAS_DOCUTILS, "won't test without docutils") + def test_check_restructuredtext_with_syntax_highlight(self): + # Don't fail if there is a `code` or `code-block` directive + + example_rst_docs = [] + example_rst_docs.append(textwrap.dedent("""\ + Here's some code: + + .. code:: python + + def foo(): + pass + """)) + example_rst_docs.append(textwrap.dedent("""\ + Here's some code: + + .. 
code-block:: python + + def foo(): + pass + """)) + + for rest_with_code in example_rst_docs: + pkg_info, dist = self.create_dist(long_description=rest_with_code) + cmd = check(dist) + cmd.check_restructuredtext() + self.assertEqual(cmd._warnings, 0) + msgs = cmd._check_rst_data(rest_with_code) + self.assertEqual(len(msgs), 0) + def test_check_all(self): metadata = {'url': 'xxx', 'author': 'xxx'} diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -44,6 +44,9 @@ Library ------- +- Issue #23063: In the disutils' check command, fix parsing of reST with code or + code-block directives. + - Issue #23209, #23225: selectors.BaseSelector.close() now clears its internal reference to the selector mapping to break a reference cycle. Initial patch written by Martin Richard. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 06:00:54 2015 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 15 Jan 2015 05:00:54 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_fix_parsing_re?= =?utf-8?q?ST_with_code_or_code-block_directives_=28closes_=2323063=29?= Message-ID: <20150115050026.11591.28294@psf.io> https://hg.python.org/cpython/rev/38826e21f0db changeset: 94147:38826e21f0db branch: 2.7 parent: 94137:e280a04625cc user: Benjamin Peterson date: Wed Jan 14 23:56:35 2015 -0500 summary: fix parsing reST with code or code-block directives (closes #23063) Patch by Marc Abramowitz. files: Lib/distutils/command/check.py | 8 ++-- Lib/distutils/tests/test_check.py | 31 +++++++++++++++++++ Misc/NEWS | 3 + 3 files changed, 38 insertions(+), 4 deletions(-) diff --git a/Lib/distutils/command/check.py b/Lib/distutils/command/check.py --- a/Lib/distutils/command/check.py +++ b/Lib/distutils/command/check.py @@ -126,7 +126,7 @@ """Returns warnings when the provided data doesn't compile.""" source_path = StringIO() parser = Parser() - settings = frontend.OptionParser().get_default_values() + settings = frontend.OptionParser(components=(Parser,)).get_default_values() settings.tab_width = 4 settings.pep_references = None settings.rfc_references = None @@ -142,8 +142,8 @@ document.note_source(source_path, -1) try: parser.parse(data, document) - except AttributeError: - reporter.messages.append((-1, 'Could not finish the parsing.', - '', {})) + except AttributeError as e: + reporter.messages.append( + (-1, 'Could not finish the parsing: %s.' % e, '', {})) return reporter.messages diff --git a/Lib/distutils/tests/test_check.py b/Lib/distutils/tests/test_check.py --- a/Lib/distutils/tests/test_check.py +++ b/Lib/distutils/tests/test_check.py @@ -1,5 +1,6 @@ # -*- encoding: utf8 -*- """Tests for distutils.command.check.""" +import textwrap import unittest from test.test_support import run_unittest @@ -93,6 +94,36 @@ cmd = self._run(metadata, strict=1, restructuredtext=1) self.assertEqual(cmd._warnings, 0) + @unittest.skipUnless(HAS_DOCUTILS, "won't test without docutils") + def test_check_restructuredtext_with_syntax_highlight(self): + # Don't fail if there is a `code` or `code-block` directive + + example_rst_docs = [] + example_rst_docs.append(textwrap.dedent("""\ + Here's some code: + + .. code:: python + + def foo(): + pass + """)) + example_rst_docs.append(textwrap.dedent("""\ + Here's some code: + + .. 
code-block:: python + + def foo(): + pass + """)) + + for rest_with_code in example_rst_docs: + pkg_info, dist = self.create_dist(long_description=rest_with_code) + cmd = check(dist) + cmd.check_restructuredtext() + self.assertEqual(cmd._warnings, 0) + msgs = cmd._check_rst_data(rest_with_code) + self.assertEqual(len(msgs), 0) + def test_check_all(self): metadata = {'url': 'xxx', 'author': 'xxx'} diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,6 +15,9 @@ Library ------- +- Issue #23063: In the disutils' check command, fix parsing of reST with code or + code-block directives. + - Issue #21356: Make ssl.RAND_egd() optional to support LibreSSL. The availability of the function is checked during the compilation. Patch written by Bernard Spil. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 06:58:10 2015 From: python-checkins at python.org (ethan.furman) Date: Thu, 15 Jan 2015 05:58:10 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUyMDQ2Nzog?= =?utf-8?b?Y2xhcmlmeSBfX2luaXRfXydzIHJvbGU=?= Message-ID: <20150115055754.125886.49389@psf.io> https://hg.python.org/cpython/rev/94696e457461 changeset: 94149:94696e457461 branch: 3.3 parent: 94019:d1af6f3a8ce3 user: Ethan Furman date: Wed Jan 14 21:56:10 2015 -0800 summary: Issue20467: clarify __init__'s role files: Doc/reference/datamodel.rst | 18 +++++++++++------- 1 files changed, 11 insertions(+), 7 deletions(-) diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -1081,13 +1081,17 @@ .. index:: pair: class; constructor - Called when the instance is created. The arguments are those passed to the - class constructor expression. If a base class has an :meth:`__init__` method, - the derived class's :meth:`__init__` method, if any, must explicitly call it to - ensure proper initialization of the base class part of the instance; for - example: ``BaseClass.__init__(self, [args...])``. As a special constraint on - constructors, no value may be returned; doing so will cause a :exc:`TypeError` - to be raised at runtime. + Called after the instance has been created (by :meth:`__new__`), but before + it is returned to the caller. The arguments are those passed to the + class constructor expression. If a base class has an :meth:`__init__` + method, the derived class's :meth:`__init__` method, if any, must explicitly + call it to ensure proper initialization of the base class part of the + instance; for example: ``BaseClass.__init__(self, [args...])``. + + Because :meth:`__new__` and :meth:`__init__` work together in constructing + objects (:meth:`__new__` to create it, and :meth:`__init__` to customise it), + no non-``None`` value may be returned by :meth:`__init__`; doing so will + cause a :exc:`TypeError` to be raised at runtime. .. 
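The documentation change summarized here (its diff follows) draws the line between the two halves of object construction: __new__ creates the instance, __init__ merely customises it, and __init__ may only return None. A short, self-contained illustration of the behaviour being documented (the class names are invented for the example):

    class Point:
        def __new__(cls, x, y):
            # __new__ creates the bare instance ...
            return super().__new__(cls)

        def __init__(self, x, y):
            # ... and __init__ customises it; it must return None.
            self.x = x
            self.y = y

    p = Point(1, 2)
    print(p.x, p.y)        # 1 2

    class Broken:
        def __init__(self):
            return 42      # non-None return value

    try:
        Broken()
    except TypeError as exc:
        print(exc)         # raised at call time, as the docs now state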
method:: object.__del__(self) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 06:58:10 2015 From: python-checkins at python.org (ethan.furman) Date: Thu, 15 Jan 2015 05:58:10 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4zIC0+IDMuNCk6?= =?utf-8?q?_Issue20467=3A_clarify_=5F=5Finit=5F=5F=27s_role?= Message-ID: <20150115055754.11589.41599@psf.io> https://hg.python.org/cpython/rev/46c9b34f31b8 changeset: 94150:46c9b34f31b8 branch: 3.4 parent: 94146:db09d760b965 parent: 94149:94696e457461 user: Ethan Furman date: Wed Jan 14 21:56:49 2015 -0800 summary: Issue20467: clarify __init__'s role files: Doc/reference/datamodel.rst | 18 +++++++++++------- 1 files changed, 11 insertions(+), 7 deletions(-) diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -1100,13 +1100,17 @@ .. index:: pair: class; constructor - Called when the instance is created. The arguments are those passed to the - class constructor expression. If a base class has an :meth:`__init__` method, - the derived class's :meth:`__init__` method, if any, must explicitly call it to - ensure proper initialization of the base class part of the instance; for - example: ``BaseClass.__init__(self, [args...])``. As a special constraint on - constructors, no value may be returned; doing so will cause a :exc:`TypeError` - to be raised at runtime. + Called after the instance has been created (by :meth:`__new__`), but before + it is returned to the caller. The arguments are those passed to the + class constructor expression. If a base class has an :meth:`__init__` + method, the derived class's :meth:`__init__` method, if any, must explicitly + call it to ensure proper initialization of the base class part of the + instance; for example: ``BaseClass.__init__(self, [args...])``. + + Because :meth:`__new__` and :meth:`__init__` work together in constructing + objects (:meth:`__new__` to create it, and :meth:`__init__` to customise it), + no non-``None`` value may be returned by :meth:`__init__`; doing so will + cause a :exc:`TypeError` to be raised at runtime. .. method:: object.__del__(self) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 06:58:10 2015 From: python-checkins at python.org (ethan.furman) Date: Thu, 15 Jan 2015 05:58:10 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?b?KTogSXNzdWUyMDQ2NzogY2xhcmlmeSBfX2luaXRfXydzIHJvbGU=?= Message-ID: <20150115055755.125888.66317@psf.io> https://hg.python.org/cpython/rev/d0421de8ee11 changeset: 94151:d0421de8ee11 parent: 94148:731a36c13629 parent: 94150:46c9b34f31b8 user: Ethan Furman date: Wed Jan 14 21:57:15 2015 -0800 summary: Issue20467: clarify __init__'s role files: Doc/reference/datamodel.rst | 18 +++++++++++------- 1 files changed, 11 insertions(+), 7 deletions(-) diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -1100,13 +1100,17 @@ .. index:: pair: class; constructor - Called when the instance is created. The arguments are those passed to the - class constructor expression. If a base class has an :meth:`__init__` method, - the derived class's :meth:`__init__` method, if any, must explicitly call it to - ensure proper initialization of the base class part of the instance; for - example: ``BaseClass.__init__(self, [args...])``. 
As a special constraint on - constructors, no value may be returned; doing so will cause a :exc:`TypeError` - to be raised at runtime. + Called after the instance has been created (by :meth:`__new__`), but before + it is returned to the caller. The arguments are those passed to the + class constructor expression. If a base class has an :meth:`__init__` + method, the derived class's :meth:`__init__` method, if any, must explicitly + call it to ensure proper initialization of the base class part of the + instance; for example: ``BaseClass.__init__(self, [args...])``. + + Because :meth:`__new__` and :meth:`__init__` work together in constructing + objects (:meth:`__new__` to create it, and :meth:`__init__` to customise it), + no non-``None`` value may be returned by :meth:`__init__`; doing so will + cause a :exc:`TypeError` to be raised at runtime. .. method:: object.__del__(self) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 07:02:41 2015 From: python-checkins at python.org (ethan.furman) Date: Thu, 15 Jan 2015 06:02:41 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUyMDQ2Nzog?= =?utf-8?b?Y2xhcmlmeSBfX2luaXRfXydzIHJvbGU=?= Message-ID: <20150115060235.22405.49428@psf.io> https://hg.python.org/cpython/rev/7972b18b6e42 changeset: 94152:7972b18b6e42 branch: 2.7 parent: 94147:38826e21f0db user: Ethan Furman date: Wed Jan 14 22:02:10 2015 -0800 summary: Issue20467: clarify __init__'s role files: Doc/reference/datamodel.rst | 12 ++++++++---- 1 files changed, 8 insertions(+), 4 deletions(-) diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -1225,13 +1225,17 @@ .. index:: pair: class; constructor - Called when the instance is created. The arguments are those passed to the + Called after the instance has been created (by :meth:`__new__`), but before + it is returned to the caller. The arguments are those passed to the class constructor expression. If a base class has an :meth:`__init__` method, the derived class's :meth:`__init__` method, if any, must explicitly call it to ensure proper initialization of the base class part of the instance; for - example: ``BaseClass.__init__(self, [args...])``. As a special constraint on - constructors, no value may be returned; doing so will cause a :exc:`TypeError` - to be raised at runtime. + example: ``BaseClass.__init__(self, [args...])``. + + Because :meth:`__new__` and :meth:`__init__` work together in constructing + objects (:meth:`__new__` to create it, and :meth:`__init__` to customise it), + no non-``None`` value may be returned by :meth:`__init__`; doing so will + cause a :exc:`TypeError` to be raised at runtime. .. 
method:: object.__del__(self) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 07:26:34 2015 From: python-checkins at python.org (ethan.furman) Date: Thu, 15 Jan 2015 06:26:34 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Issue22988=3A_clarify_yield_and_exception_blocks?= Message-ID: <20150115062629.125896.26217@psf.io> https://hg.python.org/cpython/rev/55ba19d084fb changeset: 94154:55ba19d084fb parent: 94151:d0421de8ee11 parent: 94153:0e48b0908651 user: Ethan Furman date: Wed Jan 14 22:26:04 2015 -0800 summary: Issue22988: clarify yield and exception blocks files: Doc/reference/expressions.rst | 9 +++++---- 1 files changed, 5 insertions(+), 4 deletions(-) diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -330,8 +330,9 @@ time, the execution proceeds to the first yield expression, where it is suspended again, returning the value of :token:`expression_list` to the generator's caller. By suspended, we mean that all local state is retained, including the -current bindings of local variables, the instruction pointer, and the internal -evaluation stack. When the execution is resumed by calling one of the +current bindings of local variables, the instruction pointer, the internal +evaluation stack, and the state of any exception handling. When the execution +is resumed by calling one of the generator's methods, the function can proceed exactly as if the yield expression were just another external call. The value of the yield expression after resuming depends on the method which resumed the execution. If @@ -348,8 +349,8 @@ where the execution should continue after it yields; the control is always transferred to the generator's caller. -Yield expressions are allowed in the :keyword:`try` clause of a :keyword:`try` -... :keyword:`finally` construct. If the generator is not resumed before it is +Yield expressions are allowed anywhere in a :keyword:`try` construct. If the +generator is not resumed before it is finalized (by reaching a zero reference count or by being garbage collected), the generator-iterator's :meth:`~generator.close` method will be called, allowing any pending :keyword:`finally` clauses to execute. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 07:26:34 2015 From: python-checkins at python.org (ethan.furman) Date: Thu, 15 Jan 2015 06:26:34 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUyMjk4ODog?= =?utf-8?q?clarify_yield_and_exception_blocks?= Message-ID: <20150115062629.22403.36139@psf.io> https://hg.python.org/cpython/rev/0e48b0908651 changeset: 94153:0e48b0908651 branch: 3.4 parent: 94150:46c9b34f31b8 user: Ethan Furman date: Wed Jan 14 22:25:27 2015 -0800 summary: Issue22988: clarify yield and exception blocks files: Doc/reference/expressions.rst | 9 +++++---- 1 files changed, 5 insertions(+), 4 deletions(-) diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -330,8 +330,9 @@ time, the execution proceeds to the first yield expression, where it is suspended again, returning the value of :token:`expression_list` to the generator's caller. By suspended, we mean that all local state is retained, including the -current bindings of local variables, the instruction pointer, and the internal -evaluation stack. 
When the execution is resumed by calling one of the +current bindings of local variables, the instruction pointer, the internal +evaluation stack, and the state of any exception handling. When the execution +is resumed by calling one of the generator's methods, the function can proceed exactly as if the yield expression were just another external call. The value of the yield expression after resuming depends on the method which resumed the execution. If @@ -348,8 +349,8 @@ where the execution should continue after it yields; the control is always transferred to the generator's caller. -Yield expressions are allowed in the :keyword:`try` clause of a :keyword:`try` -... :keyword:`finally` construct. If the generator is not resumed before it is +Yield expressions are allowed anywhere in a :keyword:`try` construct. If the +generator is not resumed before it is finalized (by reaching a zero reference count or by being garbage collected), the generator-iterator's :meth:`~generator.close` method will be called, allowing any pending :keyword:`finally` clauses to execute. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 07:32:54 2015 From: python-checkins at python.org (ethan.furman) Date: Thu, 15 Jan 2015 06:32:54 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUyMjk5Nzog?= =?utf-8?q?minor_doc_update=3B_thanks_to_Simoen_Visser?= Message-ID: <20150115063254.11571.86166@psf.io> https://hg.python.org/cpython/rev/522d13d8e76e changeset: 94155:522d13d8e76e branch: 3.4 parent: 94153:0e48b0908651 user: Ethan Furman date: Wed Jan 14 22:31:50 2015 -0800 summary: Issue22997: minor doc update; thanks to Simoen Visser files: Doc/library/enum.rst | 10 +++++++--- 1 files changed, 7 insertions(+), 3 deletions(-) diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -404,7 +404,7 @@ new class derived from :class:`Enum` is returned. In other words, the above assignment to :class:`Animal` is equivalent to:: - >>> class Animals(Enum): + >>> class Animal(Enum): ... ant = 1 ... bee = 2 ... cat = 3 @@ -421,7 +421,7 @@ function in separate module, and also may not work on IronPython or Jython). The solution is to specify the module name explicitly as follows:: - >>> Animals = Enum('Animals', 'ant bee cat dog', module=__name__) + >>> Animal = Enum('Animal', 'ant bee cat dog', module=__name__) .. warning:: @@ -434,7 +434,7 @@ to find the class. 
For example, if the class was made available in class SomeData in the global scope:: - >>> Animals = Enum('Animals', 'ant bee cat dog', qualname='SomeData.Animals') + >>> Animal = Enum('Animal', 'ant bee cat dog', qualname='SomeData.Animal') The complete signature is:: @@ -447,6 +447,10 @@ 'red green blue' | 'red,green,blue' | 'red, green, blue' + or an iterator of names:: + + ['red', 'green', 'blue'] + or an iterator of (name, value) pairs:: [('cyan', 4), ('magenta', 5), ('yellow', 6)] -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 07:32:54 2015 From: python-checkins at python.org (ethan.furman) Date: Thu, 15 Jan 2015 06:32:54 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Issue22997=3A_minor_doc_update=3B_thanks_to_Simoen_Visse?= =?utf-8?q?r?= Message-ID: <20150115063254.11579.95109@psf.io> https://hg.python.org/cpython/rev/31ae2dcaaed8 changeset: 94156:31ae2dcaaed8 parent: 94154:55ba19d084fb parent: 94155:522d13d8e76e user: Ethan Furman date: Wed Jan 14 22:32:29 2015 -0800 summary: Issue22997: minor doc update; thanks to Simoen Visser files: Doc/library/enum.rst | 10 +++++++--- 1 files changed, 7 insertions(+), 3 deletions(-) diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -405,7 +405,7 @@ new class derived from :class:`Enum` is returned. In other words, the above assignment to :class:`Animal` is equivalent to:: - >>> class Animals(Enum): + >>> class Animal(Enum): ... ant = 1 ... bee = 2 ... cat = 3 @@ -422,7 +422,7 @@ function in separate module, and also may not work on IronPython or Jython). The solution is to specify the module name explicitly as follows:: - >>> Animals = Enum('Animals', 'ant bee cat dog', module=__name__) + >>> Animal = Enum('Animal', 'ant bee cat dog', module=__name__) .. warning:: @@ -435,7 +435,7 @@ to find the class. For example, if the class was made available in class SomeData in the global scope:: - >>> Animals = Enum('Animals', 'ant bee cat dog', qualname='SomeData.Animals') + >>> Animal = Enum('Animal', 'ant bee cat dog', qualname='SomeData.Animal') The complete signature is:: @@ -448,6 +448,10 @@ 'red green blue' | 'red,green,blue' | 'red, green, blue' + or an iterator of names:: + + ['red', 'green', 'blue'] + or an iterator of (name, value) pairs:: [('cyan', 4), ('magenta', 5), ('yellow', 6)] -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 08:16:48 2015 From: python-checkins at python.org (georg.brandl) Date: Thu, 15 Jan 2015 07:16:48 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogQ2xvc2VzICMyMzI0?= =?utf-8?q?4=3A_fix_typo=2E_Thanks_Mayank_Tripathi_for_the_patch=2E?= Message-ID: <20150115071633.8755.99201@psf.io> https://hg.python.org/cpython/rev/3e4d4c4968bb changeset: 94158:3e4d4c4968bb branch: 3.4 parent: 94155:522d13d8e76e user: Georg Brandl date: Thu Jan 15 08:16:01 2015 +0100 summary: Closes #23244: fix typo. Thanks Mayank Tripathi for the patch. files: Doc/glossary.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -292,7 +292,7 @@ generator A function which returns an iterator. 
It looks like a normal function except that it contains :keyword:`yield` statements for producing a series - a values usable in a for-loop or that can be retrieved one at a time with + of values usable in a for-loop or that can be retrieved one at a time with the :func:`next` function. Each :keyword:`yield` temporarily suspends processing, remembering the location execution state (including local variables and pending try-statements). When the generator resumes, it -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 08:16:50 2015 From: python-checkins at python.org (georg.brandl) Date: Thu, 15 Jan 2015 07:16:50 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_merge_with_3=2E4?= Message-ID: <20150115071634.72577.82979@psf.io> https://hg.python.org/cpython/rev/2a8378e69d75 changeset: 94159:2a8378e69d75 parent: 94156:31ae2dcaaed8 parent: 94158:3e4d4c4968bb user: Georg Brandl date: Thu Jan 15 08:16:25 2015 +0100 summary: merge with 3.4 files: Doc/glossary.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -292,7 +292,7 @@ generator A function which returns an iterator. It looks like a normal function except that it contains :keyword:`yield` statements for producing a series - a values usable in a for-loop or that can be retrieved one at a time with + of values usable in a for-loop or that can be retrieved one at a time with the :func:`next` function. Each :keyword:`yield` temporarily suspends processing, remembering the location execution state (including local variables and pending try-statements). When the generator resumes, it -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 08:16:50 2015 From: python-checkins at python.org (georg.brandl) Date: Thu, 15 Jan 2015 07:16:50 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogQ2xvc2VzICMyMzI0?= =?utf-8?q?4=3A_fix_typo=2E_Thanks_Mayank_Tripathi_for_the_patch=2E?= Message-ID: <20150115071633.9920.54554@psf.io> https://hg.python.org/cpython/rev/25a1ce2a6f9b changeset: 94157:25a1ce2a6f9b branch: 2.7 parent: 94152:7972b18b6e42 user: Georg Brandl date: Thu Jan 15 08:16:01 2015 +0100 summary: Closes #23244: fix typo. Thanks Mayank Tripathi for the patch. files: Doc/glossary.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -284,7 +284,7 @@ generator A function which returns an iterator. It looks like a normal function except that it contains :keyword:`yield` statements for producing a series - a values usable in a for-loop or that can be retrieved one at a time with + of values usable in a for-loop or that can be retrieved one at a time with the :func:`next` function. Each :keyword:`yield` temporarily suspends processing, remembering the location execution state (including local variables and pending try-statements). 
When the generator resumes, it -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Thu Jan 15 08:37:30 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 15 Jan 2015 08:37:30 +0100 Subject: [Python-checkins] Daily reference leaks (61a045ac0006): sum=-6 Message-ID: results for 61a045ac0006 on branch "default" -------------------------------------------- test_collections leaked [-2, -4, 0] references, sum=-6 test_collections leaked [-1, -2, 0] memory blocks, sum=-3 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogDncN0A', '-x'] From python-checkins at python.org Thu Jan 15 09:36:31 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 08:36:31 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_StreamWriter?= =?utf-8?q?=3A_close=28=29_now_clears_the_reference_to_the_transport?= Message-ID: <20150115083613.8765.13586@psf.io> https://hg.python.org/cpython/rev/6ab2575bc12b changeset: 94160:6ab2575bc12b branch: 3.4 parent: 94158:3e4d4c4968bb user: Victor Stinner date: Thu Jan 15 09:33:50 2015 +0100 summary: StreamWriter: close() now clears the reference to the transport StreamWriter now raises an exception if it is closed: write(), writelines(), write_eof(), can_write_eof(), get_extra_info(), drain(). files: Lib/asyncio/streams.py | 25 +++++++++++++++++++++---- 1 files changed, 21 insertions(+), 4 deletions(-) diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -258,8 +258,22 @@ self._reader = reader self._loop = loop + def close(self): + if self._transport is None: + return + self._transport.close() + self._transport = None + + def _check_closed(self): + if self._transport is None: + raise RuntimeError('StreamWriter is closed') + def __repr__(self): - info = [self.__class__.__name__, 'transport=%r' % self._transport] + info = [self.__class__.__name__] + if self._transport is not None: + info.append('transport=%r' % self._transport) + else: + info.append('closed') if self._reader is not None: info.append('reader=%r' % self._reader) return '<%s>' % ' '.join(info) @@ -269,21 +283,23 @@ return self._transport def write(self, data): + self._check_closed() self._transport.write(data) def writelines(self, data): + self._check_closed() self._transport.writelines(data) def write_eof(self): + self._check_closed() return self._transport.write_eof() def can_write_eof(self): + self._check_closed() return self._transport.can_write_eof() - def close(self): - return self._transport.close() - def get_extra_info(self, name, default=None): + self._check_closed() return self._transport.get_extra_info(name, default) @coroutine @@ -295,6 +311,7 @@ w.write(data) yield from w.drain() """ + self._check_closed() if self._reader is not None: exc = self._reader.exception() if exc is not None: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 09:36:31 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 08:36:31 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150115083614.11587.29110@psf.io> https://hg.python.org/cpython/rev/681e54e6d11e changeset: 94161:681e54e6d11e parent: 94159:2a8378e69d75 parent: 94160:6ab2575bc12b user: Victor Stinner date: Thu Jan 15 09:35:29 2015 +0100 summary: Merge 3.4 
(asyncio) files: Lib/asyncio/streams.py | 25 +++++++++++++++++++++---- 1 files changed, 21 insertions(+), 4 deletions(-) diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -258,8 +258,22 @@ self._reader = reader self._loop = loop + def close(self): + if self._transport is None: + return + self._transport.close() + self._transport = None + + def _check_closed(self): + if self._transport is None: + raise RuntimeError('StreamWriter is closed') + def __repr__(self): - info = [self.__class__.__name__, 'transport=%r' % self._transport] + info = [self.__class__.__name__] + if self._transport is not None: + info.append('transport=%r' % self._transport) + else: + info.append('closed') if self._reader is not None: info.append('reader=%r' % self._reader) return '<%s>' % ' '.join(info) @@ -269,21 +283,23 @@ return self._transport def write(self, data): + self._check_closed() self._transport.write(data) def writelines(self, data): + self._check_closed() self._transport.writelines(data) def write_eof(self): + self._check_closed() return self._transport.write_eof() def can_write_eof(self): + self._check_closed() return self._transport.can_write_eof() - def close(self): - return self._transport.close() - def get_extra_info(self, name, default=None): + self._check_closed() return self._transport.get_extra_info(name, default) @coroutine @@ -295,6 +311,7 @@ w.write(data) yield from w.drain() """ + self._check_closed() if self._reader is not None: exc = self._reader.exception() if exc is not None: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 09:42:26 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 08:42:26 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIyNTYw?= =?utf-8?q?=3A_Fix_SSLProtocol=2E=5Fon=5Fhandshake=5Fcomplete=28=29?= Message-ID: <20150115084223.22397.77030@psf.io> https://hg.python.org/cpython/rev/fb3761de0d3c changeset: 94162:fb3761de0d3c branch: 3.4 parent: 94160:6ab2575bc12b user: Victor Stinner date: Thu Jan 15 09:41:48 2015 +0100 summary: Issue #22560: Fix SSLProtocol._on_handshake_complete() Don't call immediatly self._process_write_backlog() but schedule the call using call_soon(). _on_handshake_complete() can be called indirectly from _process_write_backlog(), and _process_write_backlog() is not reentrant. files: Lib/asyncio/sslproto.py | 8 ++++++-- 1 files changed, 6 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py --- a/Lib/asyncio/sslproto.py +++ b/Lib/asyncio/sslproto.py @@ -572,8 +572,12 @@ # wait until protocol.connection_made() has been called self._waiter._set_result_unless_cancelled(None) self._session_established = True - # In case transport.write() was already called - self._process_write_backlog() + # In case transport.write() was already called. Don't call + # immediatly _process_write_backlog(), but schedule it: + # _on_handshake_complete() can be called indirectly from + # _process_write_backlog(), and _process_write_backlog() is not + # reentrant. + self._loop.call(self._process_write_backlog) def _process_write_backlog(self): # Try to make progress on the write backlog. 
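The reasoning in this commit message generalises: when a method is not reentrant but can be reached again from inside itself through a callback chain, defer the nested call with loop.call_soon() so that it runs on the next event-loop iteration instead of re-entering the method (the follow-up commit just below fixes the call() typo to call_soon()). A toy sketch of that pattern, with invented names rather than the real SSLProtocol internals:

    import asyncio

    class Pipeline:
        """_drain() is not reentrant, but a callback that it triggers may
        want to run it again, so the callback schedules it instead."""

        def __init__(self, loop):
            self._loop = loop
            self._queue = []
            self._draining = False

        def feed(self, item):
            self._queue.append(item)
            if not self._draining:
                self._drain()

        def _drain(self):
            assert not self._draining, 'not reentrant'
            self._draining = True
            try:
                while self._queue:
                    self._handle(self._queue.pop(0))
            finally:
                self._draining = False

        def _handle(self, item):
            print('handled', item)
            if item == 'handshake done':
                # Calling self._drain() here directly would re-enter it;
                # call_soon() defers the call to the next loop iteration,
                # after the current _drain() invocation has returned.
                self._loop.call_soon(self._drain)

    loop = asyncio.new_event_loop()
    Pipeline(loop).feed('handshake done')
    loop.call_soon(loop.stop)
    loop.run_forever()
    loop.close()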
-- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 09:45:25 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 08:45:25 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIyNTYw?= =?utf-8?q?=3A_Fix_typo=3A_call_-=3E_call=5Fsoon?= Message-ID: <20150115084523.72571.30858@psf.io> https://hg.python.org/cpython/rev/3c37825d85d3 changeset: 94163:3c37825d85d3 branch: 3.4 user: Victor Stinner date: Thu Jan 15 09:44:13 2015 +0100 summary: Issue #22560: Fix typo: call -> call_soon files: Lib/asyncio/sslproto.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py --- a/Lib/asyncio/sslproto.py +++ b/Lib/asyncio/sslproto.py @@ -577,7 +577,7 @@ # _on_handshake_complete() can be called indirectly from # _process_write_backlog(), and _process_write_backlog() is not # reentrant. - self._loop.call(self._process_write_backlog) + self._loop.call_soon(self._process_write_backlog) def _process_write_backlog(self): # Try to make progress on the write backlog. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 09:45:25 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 08:45:25 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150115084524.8749.36976@psf.io> https://hg.python.org/cpython/rev/69285bf7e865 changeset: 94164:69285bf7e865 parent: 94161:681e54e6d11e parent: 94163:3c37825d85d3 user: Victor Stinner date: Thu Jan 15 09:44:24 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/sslproto.py | 8 ++++++-- 1 files changed, 6 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py --- a/Lib/asyncio/sslproto.py +++ b/Lib/asyncio/sslproto.py @@ -572,8 +572,12 @@ # wait until protocol.connection_made() has been called self._waiter._set_result_unless_cancelled(None) self._session_established = True - # In case transport.write() was already called - self._process_write_backlog() + # In case transport.write() was already called. Don't call + # immediatly _process_write_backlog(), but schedule it: + # _on_handshake_complete() can be called indirectly from + # _process_write_backlog(), and _process_write_backlog() is not + # reentrant. + self._loop.call_soon(self._process_write_backlog) def _process_write_backlog(self): # Try to make progress on the write backlog. 
-- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 13:31:20 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 12:31:20 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogU1NMUHJvdG9jb2w6?= =?utf-8?q?_set_the_=5Ftransport_attribute_in_the_constructor?= Message-ID: <20150115122510.22407.55194@psf.io> https://hg.python.org/cpython/rev/1fca25b4ea1f changeset: 94166:1fca25b4ea1f branch: 3.4 user: Victor Stinner date: Thu Jan 15 13:16:27 2015 +0100 summary: SSLProtocol: set the _transport attribute in the constructor files: Lib/asyncio/sslproto.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py --- a/Lib/asyncio/sslproto.py +++ b/Lib/asyncio/sslproto.py @@ -417,6 +417,7 @@ self._session_established = False self._in_handshake = False self._in_shutdown = False + self._transport = None def connection_made(self, transport): """Called when the low-level connection is made. -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 13:31:20 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 12:31:20 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMjQz?= =?utf-8?q?=3A_Fix_asyncio=2E=5FUnixWritePipeTransport=2Eclose=28=29?= Message-ID: <20150115122510.72559.64359@psf.io> https://hg.python.org/cpython/rev/2f13d53f4680 changeset: 94167:2f13d53f4680 branch: 3.4 user: Victor Stinner date: Thu Jan 15 13:16:50 2015 +0100 summary: Issue #23243: Fix asyncio._UnixWritePipeTransport.close() Do nothing if the transport is already closed. Before it was not possible to close the transport twice. files: Lib/asyncio/unix_events.py | 2 +- Lib/test/test_asyncio/test_unix_events.py | 3 +++ 2 files changed, 4 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -516,7 +516,7 @@ self._loop.call_soon(self._call_connection_lost, None) def close(self): - if not self._closing: + if self._pipe is not None and not self._closing: # write_eof is all what we needed to close the write pipe self.write_eof() diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -766,6 +766,9 @@ tr.close() tr.write_eof.assert_called_with() + # closing the transport twice must not fail + tr.close() + def test_close_closing(self): tr = unix_events._UnixWritePipeTransport( self.loop, self.pipe, self.protocol) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 13:31:20 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 12:31:20 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMjQz?= =?utf-8?q?=3A_Close_explicitly_event_loops_in_asyncio_tests?= Message-ID: <20150115122511.11591.85912@psf.io> https://hg.python.org/cpython/rev/aef0f9b4e729 changeset: 94168:aef0f9b4e729 branch: 3.4 user: Victor Stinner date: Thu Jan 15 13:17:34 2015 +0100 summary: Issue #23243: Close explicitly event loops in asyncio tests files: Lib/test/test_asyncio/test_base_events.py | 1 + Lib/test/test_asyncio/test_proactor_events.py | 4 ++ Lib/test/test_asyncio/test_selector_events.py | 16 +++++++++- 3 files changed, 20 insertions(+), 1 deletions(-) diff --git 
a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -590,6 +590,7 @@ raise ValueError('spam') loop = Loop() + self.addCleanup(loop.close) asyncio.set_event_loop(loop) def run_loop(): diff --git a/Lib/test/test_asyncio/test_proactor_events.py b/Lib/test/test_asyncio/test_proactor_events.py --- a/Lib/test/test_asyncio/test_proactor_events.py +++ b/Lib/test/test_asyncio/test_proactor_events.py @@ -16,6 +16,7 @@ def setUp(self): self.loop = self.new_test_loop() + self.addCleanup(self.loop.close) self.proactor = mock.Mock() self.loop._proactor = self.proactor self.protocol = test_utils.make_test_protocol(asyncio.Protocol) @@ -459,6 +460,9 @@ self.assertIsNone(self.loop._ssock) self.assertIsNone(self.loop._csock) + # Don't call close(): _close_self_pipe() cannot be called twice + self.loop._closed = True + def test_close(self): self.loop._close_self_pipe = mock.Mock() self.loop.close() diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -24,6 +24,11 @@ class TestBaseSelectorEventLoop(BaseSelectorEventLoop): + def close(self): + # Don't call the close() method of the parent class, because the + # selector is mocked + self._closed = True + def _make_self_pipe(self): self._ssock = mock.Mock() self._csock = mock.Mock() @@ -40,7 +45,7 @@ self.selector = mock.Mock() self.selector.select.return_value = [] self.loop = TestBaseSelectorEventLoop(self.selector) - self.set_event_loop(self.loop, cleanup=False) + self.set_event_loop(self.loop) def test_make_socket_transport(self): m = mock.Mock() @@ -76,6 +81,15 @@ self.loop._make_ssl_transport(m, m, m, m) def test_close(self): + class EventLoop(BaseSelectorEventLoop): + def _make_self_pipe(self): + self._ssock = mock.Mock() + self._csock = mock.Mock() + self._internal_fds += 1 + + self.loop = EventLoop(self.selector) + self.set_event_loop(self.loop) + ssock = self.loop._ssock ssock.fileno.return_value = 7 csock = self.loop._csock -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 13:31:20 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 12:31:20 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMjQy?= =?utf-8?q?=3A_asyncio=2ESubprocessStreamProtocol_now_closes_the_subproces?= =?utf-8?q?s?= Message-ID: <20150115122510.11561.88534@psf.io> https://hg.python.org/cpython/rev/df493e9c6821 changeset: 94165:df493e9c6821 branch: 3.4 parent: 94163:3c37825d85d3 user: Victor Stinner date: Thu Jan 15 13:16:02 2015 +0100 summary: Issue #23242: asyncio.SubprocessStreamProtocol now closes the subprocess transport at subprocess exit. Clear also its reference to the transport. 
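The effect of this change is that the high-level subprocess protocol no longer keeps the subprocess transport (and its pipes) alive once the child has exited: the return code is read first, then the transport is closed and the reference dropped. The same shape is easy to reuse in a custom protocol; here is a hedged sketch built on the public SubprocessProtocol API (the class name and the exit_future attribute are inventions for the example, not part of asyncio):

    import asyncio

    class ExitTrackingProtocol(asyncio.SubprocessProtocol):
        def __init__(self, exit_future):
            self.exit_future = exit_future
            self.transport = None

        def connection_made(self, transport):
            self.transport = transport

        def process_exited(self):
            # Read the return code before closing, then drop the reference
            # so the transport and its pipes can be released promptly.
            returncode = self.transport.get_returncode()
            self.transport.close()
            self.transport = None
            self.exit_future.set_result(returncode)

Used with loop.subprocess_exec(lambda: ExitTrackingProtocol(fut), 'true'), the future receives the child's exit status as soon as process_exited() fires.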
files: Lib/asyncio/subprocess.py | 5 ++++- 1 files changed, 4 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/subprocess.py b/Lib/asyncio/subprocess.py --- a/Lib/asyncio/subprocess.py +++ b/Lib/asyncio/subprocess.py @@ -94,8 +94,11 @@ reader.set_exception(exc) def process_exited(self): + returncode = self._transport.get_returncode() + self._transport.close() + self._transport = None + # wake up futures waiting for wait() - returncode = self._transport.get_returncode() while self._waiters: waiter = self._waiters.popleft() if not waiter.cancelled(): -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 13:31:20 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 12:31:20 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIzMjQz?= =?utf-8?q?=3A_Close_explicitly_transports_in_asyncio_tests?= Message-ID: <20150115122511.125890.51050@psf.io> https://hg.python.org/cpython/rev/f9b127188d43 changeset: 94169:f9b127188d43 branch: 3.4 user: Victor Stinner date: Thu Jan 15 13:18:32 2015 +0100 summary: Issue #23243: Close explicitly transports in asyncio tests files: Lib/test/test_asyncio/test_proactor_events.py | 87 +- Lib/test/test_asyncio/test_selector_events.py | 262 ++++----- Lib/test/test_asyncio/test_unix_events.py | 154 ++--- 3 files changed, 226 insertions(+), 277 deletions(-) diff --git a/Lib/test/test_asyncio/test_proactor_events.py b/Lib/test/test_asyncio/test_proactor_events.py --- a/Lib/test/test_asyncio/test_proactor_events.py +++ b/Lib/test/test_asyncio/test_proactor_events.py @@ -12,6 +12,15 @@ from asyncio import test_utils +def close_transport(transport): + # Don't call transport.close() because the event loop and the IOCP proactor + # are mocked + if transport._sock is None: + return + transport._sock.close() + transport._sock = None + + class ProactorSocketTransportTests(test_utils.TestCase): def setUp(self): @@ -22,17 +31,22 @@ self.protocol = test_utils.make_test_protocol(asyncio.Protocol) self.sock = mock.Mock(socket.socket) + def socket_transport(self, waiter=None): + transport = _ProactorSocketTransport(self.loop, self.sock, + self.protocol, waiter=waiter) + self.addCleanup(close_transport, transport) + return transport + def test_ctor(self): fut = asyncio.Future(loop=self.loop) - tr = _ProactorSocketTransport( - self.loop, self.sock, self.protocol, fut) + tr = self.socket_transport(waiter=fut) test_utils.run_briefly(self.loop) self.assertIsNone(fut.result()) self.protocol.connection_made(tr) self.proactor.recv.assert_called_with(self.sock, 4096) def test_loop_reading(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._loop_reading() self.loop._proactor.recv.assert_called_with(self.sock, 4096) self.assertFalse(self.protocol.data_received.called) @@ -42,8 +56,7 @@ res = asyncio.Future(loop=self.loop) res.set_result(b'data') - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) - + tr = self.socket_transport() tr._read_fut = res tr._loop_reading(res) self.loop._proactor.recv.assert_called_with(self.sock, 4096) @@ -53,8 +66,7 @@ res = asyncio.Future(loop=self.loop) res.set_result(b'') - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) - + tr = self.socket_transport() self.assertRaises(AssertionError, tr._loop_reading, res) tr.close = mock.Mock() @@ -67,7 +79,7 @@ def test_loop_reading_aborted(self): err = self.loop._proactor.recv.side_effect = ConnectionAbortedError() - tr = 
_ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._fatal_error = mock.Mock() tr._loop_reading() tr._fatal_error.assert_called_with( @@ -77,7 +89,7 @@ def test_loop_reading_aborted_closing(self): self.loop._proactor.recv.side_effect = ConnectionAbortedError() - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._closing = True tr._fatal_error = mock.Mock() tr._loop_reading() @@ -85,7 +97,7 @@ def test_loop_reading_aborted_is_fatal(self): self.loop._proactor.recv.side_effect = ConnectionAbortedError() - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._closing = False tr._fatal_error = mock.Mock() tr._loop_reading() @@ -94,7 +106,7 @@ def test_loop_reading_conn_reset_lost(self): err = self.loop._proactor.recv.side_effect = ConnectionResetError() - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._closing = False tr._fatal_error = mock.Mock() tr._force_close = mock.Mock() @@ -105,7 +117,7 @@ def test_loop_reading_exception(self): err = self.loop._proactor.recv.side_effect = (OSError()) - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._fatal_error = mock.Mock() tr._loop_reading() tr._fatal_error.assert_called_with( @@ -113,19 +125,19 @@ 'Fatal read error on pipe transport') def test_write(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._loop_writing = mock.Mock() tr.write(b'data') self.assertEqual(tr._buffer, None) tr._loop_writing.assert_called_with(data=b'data') def test_write_no_data(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr.write(b'') self.assertFalse(tr._buffer) def test_write_more(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._write_fut = mock.Mock() tr._loop_writing = mock.Mock() tr.write(b'data') @@ -133,7 +145,7 @@ self.assertFalse(tr._loop_writing.called) def test_loop_writing(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._buffer = bytearray(b'data') tr._loop_writing() self.loop._proactor.send.assert_called_with(self.sock, b'data') @@ -143,7 +155,7 @@ @mock.patch('asyncio.proactor_events.logger') def test_loop_writing_err(self, m_log): err = self.loop._proactor.send.side_effect = OSError() - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._fatal_error = mock.Mock() tr._buffer = [b'da', b'ta'] tr._loop_writing() @@ -164,7 +176,7 @@ fut = asyncio.Future(loop=self.loop) fut.set_result(b'data') - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._write_fut = fut tr._loop_writing(fut) self.assertIsNone(tr._write_fut) @@ -173,7 +185,7 @@ fut = asyncio.Future(loop=self.loop) fut.set_result(1) - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._write_fut = fut tr.close() tr._loop_writing(fut) @@ -182,13 +194,13 @@ self.protocol.connection_lost.assert_called_with(None) def test_abort(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._force_close = mock.Mock() tr.abort() tr._force_close.assert_called_with(None) def test_close(self): - tr = _ProactorSocketTransport(self.loop, 
self.sock, self.protocol) + tr = self.socket_transport() tr.close() test_utils.run_briefly(self.loop) self.protocol.connection_lost.assert_called_with(None) @@ -201,14 +213,14 @@ self.assertFalse(self.protocol.connection_lost.called) def test_close_write_fut(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._write_fut = mock.Mock() tr.close() test_utils.run_briefly(self.loop) self.assertFalse(self.protocol.connection_lost.called) def test_close_buffer(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._buffer = [b'data'] tr.close() test_utils.run_briefly(self.loop) @@ -216,14 +228,14 @@ @mock.patch('asyncio.base_events.logger') def test_fatal_error(self, m_logging): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._force_close = mock.Mock() tr._fatal_error(None) self.assertTrue(tr._force_close.called) self.assertTrue(m_logging.error.called) def test_force_close(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._buffer = [b'data'] read_fut = tr._read_fut = mock.Mock() write_fut = tr._write_fut = mock.Mock() @@ -237,14 +249,14 @@ self.assertEqual(tr._conn_lost, 1) def test_force_close_idempotent(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._closing = True tr._force_close(None) test_utils.run_briefly(self.loop) self.assertFalse(self.protocol.connection_lost.called) def test_fatal_error_2(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._buffer = [b'data'] tr._force_close(None) @@ -253,14 +265,13 @@ self.assertEqual(None, tr._buffer) def test_call_connection_lost(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._call_connection_lost(None) self.assertTrue(self.protocol.connection_lost.called) self.assertTrue(self.sock.close.called) def test_write_eof(self): - tr = _ProactorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.assertTrue(tr.can_write_eof()) tr.write_eof() self.sock.shutdown.assert_called_with(socket.SHUT_WR) @@ -269,7 +280,7 @@ tr.close() def test_write_eof_buffer(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() f = asyncio.Future(loop=self.loop) tr._loop._proactor.send.return_value = f tr.write(b'data') @@ -313,11 +324,10 @@ self.assertFalse(tr.can_write_eof()) with self.assertRaises(NotImplementedError): tr.write_eof() - tr.close() + close_transport(tr) def test_pause_resume_reading(self): - tr = _ProactorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() futures = [] for msg in [b'data1', b'data2', b'data3', b'data4', b'']: f = asyncio.Future(loop=self.loop) @@ -345,10 +355,7 @@ def pause_writing_transport(self, high): - tr = _ProactorSocketTransport( - self.loop, self.sock, self.protocol) - self.addCleanup(tr.close) - + tr = self.socket_transport() tr.set_write_buffer_limits(high=high) self.assertEqual(tr.get_write_buffer_size(), 0) @@ -439,7 +446,7 @@ return (self.ssock, self.csock) self.loop = EventLoop(self.proactor) - self.set_event_loop(self.loop, cleanup=False) + self.set_event_loop(self.loop) @mock.patch.object(BaseProactorEventLoop, 'call_soon') @mock.patch.object(BaseProactorEventLoop, '_socketpair') @@ -451,6 +458,7 @@ 
self.assertIs(loop._csock, csock) self.assertEqual(loop._internal_fds, 1) call_soon.assert_called_with(loop._loop_self_reading) + loop.close() def test_close_self_pipe(self): self.loop._close_self_pipe() @@ -497,6 +505,7 @@ def test_make_socket_transport(self): tr = self.loop._make_socket_transport(self.sock, asyncio.Protocol()) self.assertIsInstance(tr, _ProactorSocketTransport) + close_transport(tr) def test_loop_self_reading(self): self.loop._loop_self_reading() diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -39,6 +39,15 @@ return bytearray().join(l) +def close_transport(transport): + # Don't call transport.close() because the event loop and the selector + # are mocked + if transport._sock is None: + return + transport._sock.close() + transport._sock = None + + class BaseSelectorEventLoopTests(test_utils.TestCase): def setUp(self): @@ -52,6 +61,7 @@ self.loop.add_reader = mock.Mock() transport = self.loop._make_socket_transport(m, asyncio.Protocol()) self.assertIsInstance(transport, _SelectorSocketTransport) + close_transport(transport) @unittest.skipIf(ssl is None, 'No ssl module') def test_make_ssl_transport(self): @@ -64,11 +74,19 @@ with test_utils.disable_logger(): transport = self.loop._make_ssl_transport( m, asyncio.Protocol(), m, waiter) + # execute the handshake while the logger is disabled + # to ignore SSL handshake failure + test_utils.run_briefly(self.loop) + # Sanity check class_name = transport.__class__.__name__ self.assertIn("ssl", class_name.lower()) self.assertIn("transport", class_name.lower()) + transport.close() + # execute pending callbacks to close the socket transport + test_utils.run_briefly(self.loop) + @mock.patch('asyncio.selector_events.ssl', None) @mock.patch('asyncio.sslproto.ssl', None) def test_make_ssl_transport_without_ssl_error(self): @@ -650,21 +668,27 @@ self.sock = mock.Mock(socket.socket) self.sock.fileno.return_value = 7 + def create_transport(self): + transport = _SelectorTransport(self.loop, self.sock, self.protocol, + None) + self.addCleanup(close_transport, transport) + return transport + def test_ctor(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() self.assertIs(tr._loop, self.loop) self.assertIs(tr._sock, self.sock) self.assertIs(tr._sock_fd, 7) def test_abort(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() tr._force_close = mock.Mock() tr.abort() tr._force_close.assert_called_with(None) def test_close(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() tr.close() self.assertTrue(tr._closing) @@ -677,7 +701,7 @@ self.assertEqual(1, self.loop.remove_reader_count[7]) def test_close_write_buffer(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() tr._buffer.extend(b'data') tr.close() @@ -686,7 +710,7 @@ self.assertFalse(self.protocol.connection_lost.called) def test_force_close(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() tr._buffer.extend(b'1') self.loop.add_reader(7, mock.sentinel) self.loop.add_writer(7, mock.sentinel) @@ -705,7 +729,7 @@ @mock.patch('asyncio.log.logger.error') def test_fatal_error(self, m_exc): exc = OSError() - tr = _SelectorTransport(self.loop, self.sock, self.protocol, 
None) + tr = self.create_transport() tr._force_close = mock.Mock() tr._fatal_error(exc) @@ -718,7 +742,7 @@ def test_connection_lost(self): exc = OSError() - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) tr._call_connection_lost(exc) @@ -739,9 +763,14 @@ self.sock = mock.Mock(socket.socket) self.sock_fd = self.sock.fileno.return_value = 7 + def socket_transport(self, waiter=None): + transport = _SelectorSocketTransport(self.loop, self.sock, + self.protocol, waiter=waiter) + self.addCleanup(close_transport, transport) + return transport + def test_ctor(self): - tr = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.loop.assert_reader(7, tr._read_ready) test_utils.run_briefly(self.loop) self.protocol.connection_made.assert_called_with(tr) @@ -749,14 +778,12 @@ def test_ctor_with_waiter(self): fut = asyncio.Future(loop=self.loop) - _SelectorSocketTransport( - self.loop, self.sock, self.protocol, fut) + self.socket_transport(waiter=fut) test_utils.run_briefly(self.loop) self.assertIsNone(fut.result()) def test_pause_resume_reading(self): - tr = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.assertFalse(tr._paused) self.loop.assert_reader(7, tr._read_ready) tr.pause_reading() @@ -769,8 +796,7 @@ tr.resume_reading() def test_read_ready(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() self.sock.recv.return_value = b'data' transport._read_ready() @@ -778,8 +804,7 @@ self.protocol.data_received.assert_called_with(b'data') def test_read_ready_eof(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.close = mock.Mock() self.sock.recv.return_value = b'' @@ -789,8 +814,7 @@ transport.close.assert_called_with() def test_read_ready_eof_keep_open(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.close = mock.Mock() self.sock.recv.return_value = b'' @@ -804,8 +828,7 @@ def test_read_ready_tryagain(self, m_exc): self.sock.recv.side_effect = BlockingIOError - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._fatal_error = mock.Mock() transport._read_ready() @@ -815,8 +838,7 @@ def test_read_ready_tryagain_interrupted(self, m_exc): self.sock.recv.side_effect = InterruptedError - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._fatal_error = mock.Mock() transport._read_ready() @@ -826,8 +848,7 @@ def test_read_ready_conn_reset(self, m_exc): err = self.sock.recv.side_effect = ConnectionResetError() - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._force_close = mock.Mock() with test_utils.disable_logger(): transport._read_ready() @@ -837,8 +858,7 @@ def test_read_ready_err(self, m_exc): err = self.sock.recv.side_effect = OSError() - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._fatal_error = mock.Mock() transport._read_ready() @@ -850,8 +870,7 @@ data = b'data' self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - self.loop, 
self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.sock.send.assert_called_with(data) @@ -859,8 +878,7 @@ data = bytearray(b'data') self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.sock.send.assert_called_with(data) self.assertEqual(data, bytearray(b'data')) # Hasn't been mutated. @@ -869,22 +887,19 @@ data = memoryview(b'data') self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.sock.send.assert_called_with(data) def test_write_no_data(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(b'data') transport.write(b'') self.assertFalse(self.sock.send.called) self.assertEqual(list_to_buffer([b'data']), transport._buffer) def test_write_buffer(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(b'data1') transport.write(b'data2') self.assertFalse(self.sock.send.called) @@ -895,8 +910,7 @@ data = b'data' self.sock.send.return_value = 2 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -906,8 +920,7 @@ data = bytearray(b'data') self.sock.send.return_value = 2 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -918,8 +931,7 @@ data = memoryview(b'data') self.sock.send.return_value = 2 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -930,8 +942,7 @@ self.sock.send.return_value = 0 self.sock.fileno.return_value = 7 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -941,8 +952,7 @@ self.sock.send.side_effect = BlockingIOError data = b'data' - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -953,8 +963,7 @@ err = self.sock.send.side_effect = OSError() data = b'data' - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._fatal_error = mock.Mock() transport.write(data) transport._fatal_error.assert_called_with( @@ -973,13 +982,11 @@ m_log.warning.assert_called_with('socket.send() raised exception.') def test_write_str(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() self.assertRaises(TypeError, transport.write, 'str') def test_write_closing(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.close() self.assertEqual(transport._conn_lost, 1) transport.write(b'data') @@ -989,8 +996,7 @@ data = b'data' self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - 
self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() @@ -1001,8 +1007,7 @@ data = b'data' self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._closing = True transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) @@ -1013,8 +1018,7 @@ self.protocol.connection_lost.assert_called_with(None) def test_write_ready_no_data(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() # This is an internal error. self.assertRaises(AssertionError, transport._write_ready) @@ -1022,8 +1026,7 @@ data = b'data' self.sock.send.return_value = 2 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() @@ -1034,8 +1037,7 @@ data = b'data' self.sock.send.return_value = 0 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() @@ -1045,8 +1047,7 @@ def test_write_ready_tryagain(self): self.sock.send.side_effect = BlockingIOError - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer = list_to_buffer([b'data1', b'data2']) self.loop.add_writer(7, transport._write_ready) transport._write_ready() @@ -1057,8 +1058,7 @@ def test_write_ready_exception(self): err = self.sock.send.side_effect = OSError() - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._fatal_error = mock.Mock() transport._buffer.extend(b'data') transport._write_ready() @@ -1071,16 +1071,14 @@ self.sock.send.side_effect = OSError() remove_writer = self.loop.remove_writer = mock.Mock() - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.close() transport._buffer.extend(b'data') transport._write_ready() remove_writer.assert_called_with(self.sock_fd) def test_write_eof(self): - tr = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.assertTrue(tr.can_write_eof()) tr.write_eof() self.sock.shutdown.assert_called_with(socket.SHUT_WR) @@ -1089,8 +1087,7 @@ tr.close() def test_write_eof_buffer(self): - tr = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.sock.send.side_effect = BlockingIOError tr.write(b'data') tr.write_eof() @@ -1117,9 +1114,15 @@ self.sslcontext = mock.Mock() self.sslcontext.wrap_socket.return_value = self.sslsock + def ssl_transport(self, waiter=None, server_hostname=None): + transport = _SelectorSslTransport(self.loop, self.sock, self.protocol, + self.sslcontext, waiter=waiter, + server_hostname=server_hostname) + self.addCleanup(close_transport, transport) + return transport + def _make_one(self, create_waiter=None): - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext) + transport = self.ssl_transport() self.sock.reset_mock() self.sslsock.reset_mock() self.sslcontext.reset_mock() @@ -1128,9 +1131,7 @@ def 
test_on_handshake(self): waiter = asyncio.Future(loop=self.loop) - tr = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, - waiter=waiter) + tr = self.ssl_transport(waiter=waiter) self.assertTrue(self.sslsock.do_handshake.called) self.loop.assert_reader(1, tr._read_ready) test_utils.run_briefly(self.loop) @@ -1139,15 +1140,13 @@ def test_on_handshake_reader_retry(self): self.loop.set_debug(False) self.sslsock.do_handshake.side_effect = ssl.SSLWantReadError - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext) + transport = self.ssl_transport() self.loop.assert_reader(1, transport._on_handshake, None) def test_on_handshake_writer_retry(self): self.loop.set_debug(False) self.sslsock.do_handshake.side_effect = ssl.SSLWantWriteError - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext) + transport = self.ssl_transport() self.loop.assert_writer(1, transport._on_handshake, None) def test_on_handshake_exc(self): @@ -1155,16 +1154,14 @@ self.sslsock.do_handshake.side_effect = exc with test_utils.disable_logger(): waiter = asyncio.Future(loop=self.loop) - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, waiter) + transport = self.ssl_transport(waiter=waiter) self.assertTrue(waiter.done()) self.assertIs(exc, waiter.exception()) self.assertTrue(self.sslsock.close.called) def test_on_handshake_base_exc(self): waiter = asyncio.Future(loop=self.loop) - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, waiter) + transport = self.ssl_transport(waiter=waiter) exc = BaseException() self.sslsock.do_handshake.side_effect = exc with test_utils.disable_logger(): @@ -1177,8 +1174,7 @@ # Python issue #23197: cancelling an handshake must not raise an # exception or log an error, even if the handshake failed waiter = asyncio.Future(loop=self.loop) - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, waiter) + transport = self.ssl_transport(waiter=waiter) waiter.cancel() exc = ValueError() self.sslsock.do_handshake.side_effect = exc @@ -1437,9 +1433,7 @@ @unittest.skipIf(ssl is None, 'No SSL support') def test_server_hostname(self): - _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, - server_hostname='localhost') + self.ssl_transport(server_hostname='localhost') self.sslcontext.wrap_socket.assert_called_with( self.sock, do_handshake_on_connect=False, server_side=False, server_hostname='localhost') @@ -1462,9 +1456,15 @@ self.sock = mock.Mock(spec_set=socket.socket) self.sock.fileno.return_value = 7 + def datagram_transport(self, address=None): + transport = _SelectorDatagramTransport(self.loop, self.sock, + self.protocol, + address=address) + self.addCleanup(close_transport, transport) + return transport + def test_read_ready(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() self.sock.recvfrom.return_value = (b'data', ('0.0.0.0', 1234)) transport._read_ready() @@ -1473,8 +1473,7 @@ b'data', ('0.0.0.0', 1234)) def test_read_ready_tryagain(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() self.sock.recvfrom.side_effect = BlockingIOError transport._fatal_error = mock.Mock() @@ -1483,8 +1482,7 @@ self.assertFalse(transport._fatal_error.called) def test_read_ready_err(self): - transport = 
_SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() err = self.sock.recvfrom.side_effect = RuntimeError() transport._fatal_error = mock.Mock() @@ -1495,8 +1493,7 @@ 'Fatal read error on datagram transport') def test_read_ready_oserr(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() err = self.sock.recvfrom.side_effect = OSError() transport._fatal_error = mock.Mock() @@ -1507,8 +1504,7 @@ def test_sendto(self): data = b'data' - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport.sendto(data, ('0.0.0.0', 1234)) self.assertTrue(self.sock.sendto.called) self.assertEqual( @@ -1516,8 +1512,7 @@ def test_sendto_bytearray(self): data = bytearray(b'data') - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport.sendto(data, ('0.0.0.0', 1234)) self.assertTrue(self.sock.sendto.called) self.assertEqual( @@ -1525,16 +1520,14 @@ def test_sendto_memoryview(self): data = memoryview(b'data') - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport.sendto(data, ('0.0.0.0', 1234)) self.assertTrue(self.sock.sendto.called) self.assertEqual( self.sock.sendto.call_args[0], (data, ('0.0.0.0', 1234))) def test_sendto_no_data(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((b'data', ('0.0.0.0', 12345))) transport.sendto(b'', ()) self.assertFalse(self.sock.sendto.called) @@ -1542,8 +1535,7 @@ [(b'data', ('0.0.0.0', 12345))], list(transport._buffer)) def test_sendto_buffer(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((b'data1', ('0.0.0.0', 12345))) transport.sendto(b'data2', ('0.0.0.0', 12345)) self.assertFalse(self.sock.sendto.called) @@ -1554,8 +1546,7 @@ def test_sendto_buffer_bytearray(self): data2 = bytearray(b'data2') - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((b'data1', ('0.0.0.0', 12345))) transport.sendto(data2, ('0.0.0.0', 12345)) self.assertFalse(self.sock.sendto.called) @@ -1567,8 +1558,7 @@ def test_sendto_buffer_memoryview(self): data2 = memoryview(b'data2') - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((b'data1', ('0.0.0.0', 12345))) transport.sendto(data2, ('0.0.0.0', 12345)) self.assertFalse(self.sock.sendto.called) @@ -1583,8 +1573,7 @@ self.sock.sendto.side_effect = BlockingIOError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport.sendto(data, ('0.0.0.0', 12345)) self.loop.assert_writer(7, transport._sendto_ready) @@ -1596,8 +1585,7 @@ data = b'data' err = self.sock.sendto.side_effect = RuntimeError() - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._fatal_error = mock.Mock() transport.sendto(data, ()) @@ -1620,8 +1608,7 @@ self.sock.sendto.side_effect = ConnectionRefusedError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = 
self.datagram_transport() transport._fatal_error = mock.Mock() transport.sendto(data, ()) @@ -1633,8 +1620,7 @@ self.sock.send.side_effect = ConnectionRefusedError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol, ('0.0.0.0', 1)) + transport = self.datagram_transport(address=('0.0.0.0', 1)) transport._fatal_error = mock.Mock() transport.sendto(data) @@ -1642,19 +1628,16 @@ self.assertTrue(self.protocol.error_received.called) def test_sendto_str(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() self.assertRaises(TypeError, transport.sendto, 'str', ()) def test_sendto_connected_addr(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol, ('0.0.0.0', 1)) + transport = self.datagram_transport(address=('0.0.0.0', 1)) self.assertRaises( ValueError, transport.sendto, b'str', ('0.0.0.0', 2)) def test_sendto_closing(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol, address=(1,)) + transport = self.datagram_transport(address=(1,)) transport.close() self.assertEqual(transport._conn_lost, 1) transport.sendto(b'data', (1,)) @@ -1664,8 +1647,7 @@ data = b'data' self.sock.sendto.return_value = len(data) - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((data, ('0.0.0.0', 12345))) self.loop.add_writer(7, transport._sendto_ready) transport._sendto_ready() @@ -1678,8 +1660,7 @@ data = b'data' self.sock.send.return_value = len(data) - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._closing = True transport._buffer.append((data, ())) self.loop.add_writer(7, transport._sendto_ready) @@ -1690,8 +1671,7 @@ self.protocol.connection_lost.assert_called_with(None) def test_sendto_ready_no_data(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() self.loop.add_writer(7, transport._sendto_ready) transport._sendto_ready() self.assertFalse(self.sock.sendto.called) @@ -1700,8 +1680,7 @@ def test_sendto_ready_tryagain(self): self.sock.sendto.side_effect = BlockingIOError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.extend([(b'data1', ()), (b'data2', ())]) self.loop.add_writer(7, transport._sendto_ready) transport._sendto_ready() @@ -1714,8 +1693,7 @@ def test_sendto_ready_exception(self): err = self.sock.sendto.side_effect = RuntimeError() - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._fatal_error = mock.Mock() transport._buffer.append((b'data', ())) transport._sendto_ready() @@ -1727,8 +1705,7 @@ def test_sendto_ready_error_received(self): self.sock.sendto.side_effect = ConnectionRefusedError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._fatal_error = mock.Mock() transport._buffer.append((b'data', ())) transport._sendto_ready() @@ -1738,8 +1715,7 @@ def test_sendto_ready_error_received_connection(self): self.sock.send.side_effect = ConnectionRefusedError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol, ('0.0.0.0', 1)) + transport = self.datagram_transport(address=('0.0.0.0', 1)) transport._fatal_error = 
mock.Mock() transport._buffer.append((b'data', ())) transport._sendto_ready() @@ -1749,8 +1725,7 @@ @mock.patch('asyncio.base_events.logger.error') def test_fatal_error_connected(self, m_exc): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol, ('0.0.0.0', 1)) + transport = self.datagram_transport(address=('0.0.0.0', 1)) err = ConnectionRefusedError() transport._fatal_error(err) self.assertFalse(self.protocol.error_received.called) @@ -1758,7 +1733,6 @@ test_utils.MockPattern( 'Fatal error on transport\nprotocol:.*\ntransport:.*'), exc_info=(ConnectionRefusedError, MOCK_ANY, MOCK_ANY)) - transport.close() if __name__ == '__main__': diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -26,6 +26,15 @@ MOCK_ANY = mock.ANY +def close_pipe_transport(transport): + # Don't call transport.close() because the event loop and the selector + # are mocked + if transport._pipe is None: + return + transport._pipe.close() + transport._pipe = None + + @unittest.skipUnless(signal, 'Signals are not supported') class SelectorEventLoopSignalTests(test_utils.TestCase): @@ -333,24 +342,28 @@ m_fstat.return_value = st self.addCleanup(fstat_patcher.stop) + def read_pipe_transport(self, waiter=None): + transport = unix_events._UnixReadPipeTransport(self.loop, self.pipe, + self.protocol, + waiter=waiter) + self.addCleanup(close_pipe_transport, transport) + return transport + def test_ctor(self): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() self.loop.assert_reader(5, tr._read_ready) test_utils.run_briefly(self.loop) self.protocol.connection_made.assert_called_with(tr) def test_ctor_with_waiter(self): fut = asyncio.Future(loop=self.loop) - unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol, fut) + tr = self.read_pipe_transport(waiter=fut) test_utils.run_briefly(self.loop) self.assertIsNone(fut.result()) @mock.patch('os.read') def test__read_ready(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() m_read.return_value = b'data' tr._read_ready() @@ -359,8 +372,7 @@ @mock.patch('os.read') def test__read_ready_eof(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() m_read.return_value = b'' tr._read_ready() @@ -372,8 +384,7 @@ @mock.patch('os.read') def test__read_ready_blocked(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() m_read.side_effect = BlockingIOError tr._read_ready() @@ -384,8 +395,7 @@ @mock.patch('asyncio.log.logger.error') @mock.patch('os.read') def test__read_ready_error(self, m_read, m_logexc): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() err = OSError() m_read.side_effect = err tr._close = mock.Mock() @@ -401,9 +411,7 @@ @mock.patch('os.read') def test_pause_reading(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.read_pipe_transport() m = mock.Mock() self.loop.add_reader(5, m) tr.pause_reading() @@ -411,26 +419,20 @@ @mock.patch('os.read') def test_resume_reading(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = 
self.read_pipe_transport() tr.resume_reading() self.loop.assert_reader(5, tr._read_ready) @mock.patch('os.read') def test_close(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.read_pipe_transport() tr._close = mock.Mock() tr.close() tr._close.assert_called_with(None) @mock.patch('os.read') def test_close_already_closing(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.read_pipe_transport() tr._closing = True tr._close = mock.Mock() tr.close() @@ -438,9 +440,7 @@ @mock.patch('os.read') def test__close(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.read_pipe_transport() err = object() tr._close(err) self.assertTrue(tr._closing) @@ -449,8 +449,7 @@ self.protocol.connection_lost.assert_called_with(err) def test__call_connection_lost(self): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) @@ -463,8 +462,7 @@ self.assertIsNone(tr._loop) def test__call_connection_lost_with_err(self): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) @@ -496,31 +494,33 @@ m_fstat.return_value = st self.addCleanup(fstat_patcher.stop) + def write_pipe_transport(self, waiter=None): + transport = unix_events._UnixWritePipeTransport(self.loop, self.pipe, + self.protocol, + waiter=waiter) + self.addCleanup(close_pipe_transport, transport) + return transport + def test_ctor(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() self.loop.assert_reader(5, tr._read_ready) test_utils.run_briefly(self.loop) self.protocol.connection_made.assert_called_with(tr) def test_ctor_with_waiter(self): fut = asyncio.Future(loop=self.loop) - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol, fut) + tr = self.write_pipe_transport(waiter=fut) self.loop.assert_reader(5, tr._read_ready) test_utils.run_briefly(self.loop) self.assertEqual(None, fut.result()) def test_can_write_eof(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() self.assertTrue(tr.can_write_eof()) @mock.patch('os.write') def test_write(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() m_write.return_value = 4 tr.write(b'data') m_write.assert_called_with(5, b'data') @@ -529,9 +529,7 @@ @mock.patch('os.write') def test_write_no_data(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() tr.write(b'') self.assertFalse(m_write.called) self.assertFalse(self.loop.writers) @@ -539,9 +537,7 @@ @mock.patch('os.write') def test_write_partial(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() m_write.return_value = 2 tr.write(b'data') m_write.assert_called_with(5, b'data') @@ -550,9 +546,7 @@ @mock.patch('os.write') def test_write_buffer(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer 
= [b'previous'] tr.write(b'data') @@ -562,9 +556,7 @@ @mock.patch('os.write') def test_write_again(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() m_write.side_effect = BlockingIOError() tr.write(b'data') m_write.assert_called_with(5, b'data') @@ -574,9 +566,7 @@ @mock.patch('asyncio.unix_events.logger') @mock.patch('os.write') def test_write_err(self, m_write, m_log): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() err = OSError() m_write.side_effect = err tr._fatal_error = mock.Mock() @@ -602,8 +592,7 @@ @mock.patch('os.write') def test_write_close(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() tr._read_ready() # pipe was closed by peer tr.write(b'data') @@ -612,8 +601,7 @@ self.assertEqual(tr._conn_lost, 2) def test__read_ready(self): - tr = unix_events._UnixWritePipeTransport(self.loop, self.pipe, - self.protocol) + tr = self.write_pipe_transport() tr._read_ready() self.assertFalse(self.loop.readers) self.assertFalse(self.loop.writers) @@ -623,8 +611,7 @@ @mock.patch('os.write') def test__write_ready(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.return_value = 4 @@ -635,9 +622,7 @@ @mock.patch('os.write') def test__write_ready_partial(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.return_value = 3 @@ -648,9 +633,7 @@ @mock.patch('os.write') def test__write_ready_again(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.side_effect = BlockingIOError() @@ -661,9 +644,7 @@ @mock.patch('os.write') def test__write_ready_empty(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.return_value = 0 @@ -675,9 +656,7 @@ @mock.patch('asyncio.log.logger.error') @mock.patch('os.write') def test__write_ready_err(self, m_write, m_logexc): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.side_effect = err = OSError() @@ -698,9 +677,7 @@ @mock.patch('os.write') def test__write_ready_closing(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._closing = True tr._buffer = [b'da', b'ta'] @@ -715,9 +692,7 @@ @mock.patch('os.write') def test_abort(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) self.loop.add_reader(5, tr._read_ready) tr._buffer = [b'da', b'ta'] @@ -731,8 +706,7 @@ self.protocol.connection_lost.assert_called_with(None) def test__call_connection_lost(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, 
self.pipe, self.protocol) + tr = self.write_pipe_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) @@ -745,8 +719,7 @@ self.assertIsNone(tr._loop) def test__call_connection_lost_with_err(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) @@ -759,9 +732,7 @@ self.assertIsNone(tr._loop) def test_close(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() tr.write_eof = mock.Mock() tr.close() tr.write_eof.assert_called_with() @@ -770,18 +741,14 @@ tr.close() def test_close_closing(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() tr.write_eof = mock.Mock() tr._closing = True tr.close() self.assertFalse(tr.write_eof.called) def test_write_eof(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() tr.write_eof() self.assertTrue(tr._closing) self.assertFalse(self.loop.readers) @@ -789,8 +756,7 @@ self.protocol.connection_lost.assert_called_with(None) def test_write_eof_pending(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() tr._buffer = [b'data'] tr.write_eof() self.assertTrue(tr._closing) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 13:31:20 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 12:31:20 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150115122511.11628.34373@psf.io> https://hg.python.org/cpython/rev/dc21d72d0f87 changeset: 94170:dc21d72d0f87 parent: 94164:69285bf7e865 parent: 94169:f9b127188d43 user: Victor Stinner date: Thu Jan 15 13:23:36 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/sslproto.py | 1 + Lib/asyncio/subprocess.py | 5 +- Lib/asyncio/unix_events.py | 2 +- Lib/test/test_asyncio/test_base_events.py | 1 + Lib/test/test_asyncio/test_proactor_events.py | 91 +- Lib/test/test_asyncio/test_selector_events.py | 278 ++++----- Lib/test/test_asyncio/test_unix_events.py | 157 ++--- 7 files changed, 255 insertions(+), 280 deletions(-) diff --git a/Lib/asyncio/sslproto.py b/Lib/asyncio/sslproto.py --- a/Lib/asyncio/sslproto.py +++ b/Lib/asyncio/sslproto.py @@ -417,6 +417,7 @@ self._session_established = False self._in_handshake = False self._in_shutdown = False + self._transport = None def connection_made(self, transport): """Called when the low-level connection is made. 
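The sslproto.py hunk above sets _transport to None in the constructor. A minimal sketch of why that helps, with a hypothetical protocol class (not asyncio's SSLProtocol): any code path that runs before connection_made() then sees None instead of raising AttributeError.

class ProtocolSketch:
    """Hypothetical protocol, not asyncio's SSLProtocol."""
    def __init__(self):
        # Set the attribute up front, mirroring the hunk above.
        self._transport = None

    def connection_made(self, transport):
        self._transport = transport

    def connection_lost(self, exc):
        # Safe even if connection_made() was never called.
        if self._transport is not None:
            self._transport.close()
            self._transport = None


if __name__ == "__main__":
    p = ProtocolSketch()
    p.connection_lost(None)   # no AttributeError before connection_made()
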
diff --git a/Lib/asyncio/subprocess.py b/Lib/asyncio/subprocess.py --- a/Lib/asyncio/subprocess.py +++ b/Lib/asyncio/subprocess.py @@ -94,8 +94,11 @@ reader.set_exception(exc) def process_exited(self): + returncode = self._transport.get_returncode() + self._transport.close() + self._transport = None + # wake up futures waiting for wait() - returncode = self._transport.get_returncode() while self._waiters: waiter = self._waiters.popleft() if not waiter.cancelled(): diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -516,7 +516,7 @@ self._loop.call_soon(self._call_connection_lost, None) def close(self): - if not self._closing: + if self._pipe is not None and not self._closing: # write_eof is all what we needed to close the write pipe self.write_eof() diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -590,6 +590,7 @@ raise ValueError('spam') loop = Loop() + self.addCleanup(loop.close) asyncio.set_event_loop(loop) def run_loop(): diff --git a/Lib/test/test_asyncio/test_proactor_events.py b/Lib/test/test_asyncio/test_proactor_events.py --- a/Lib/test/test_asyncio/test_proactor_events.py +++ b/Lib/test/test_asyncio/test_proactor_events.py @@ -12,26 +12,41 @@ from asyncio import test_utils +def close_transport(transport): + # Don't call transport.close() because the event loop and the IOCP proactor + # are mocked + if transport._sock is None: + return + transport._sock.close() + transport._sock = None + + class ProactorSocketTransportTests(test_utils.TestCase): def setUp(self): self.loop = self.new_test_loop() + self.addCleanup(self.loop.close) self.proactor = mock.Mock() self.loop._proactor = self.proactor self.protocol = test_utils.make_test_protocol(asyncio.Protocol) self.sock = mock.Mock(socket.socket) + def socket_transport(self, waiter=None): + transport = _ProactorSocketTransport(self.loop, self.sock, + self.protocol, waiter=waiter) + self.addCleanup(close_transport, transport) + return transport + def test_ctor(self): fut = asyncio.Future(loop=self.loop) - tr = _ProactorSocketTransport( - self.loop, self.sock, self.protocol, fut) + tr = self.socket_transport(waiter=fut) test_utils.run_briefly(self.loop) self.assertIsNone(fut.result()) self.protocol.connection_made(tr) self.proactor.recv.assert_called_with(self.sock, 4096) def test_loop_reading(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._loop_reading() self.loop._proactor.recv.assert_called_with(self.sock, 4096) self.assertFalse(self.protocol.data_received.called) @@ -41,8 +56,7 @@ res = asyncio.Future(loop=self.loop) res.set_result(b'data') - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) - + tr = self.socket_transport() tr._read_fut = res tr._loop_reading(res) self.loop._proactor.recv.assert_called_with(self.sock, 4096) @@ -52,8 +66,7 @@ res = asyncio.Future(loop=self.loop) res.set_result(b'') - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) - + tr = self.socket_transport() self.assertRaises(AssertionError, tr._loop_reading, res) tr.close = mock.Mock() @@ -66,7 +79,7 @@ def test_loop_reading_aborted(self): err = self.loop._proactor.recv.side_effect = ConnectionAbortedError() - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._fatal_error = mock.Mock() 
tr._loop_reading() tr._fatal_error.assert_called_with( @@ -76,7 +89,7 @@ def test_loop_reading_aborted_closing(self): self.loop._proactor.recv.side_effect = ConnectionAbortedError() - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._closing = True tr._fatal_error = mock.Mock() tr._loop_reading() @@ -84,7 +97,7 @@ def test_loop_reading_aborted_is_fatal(self): self.loop._proactor.recv.side_effect = ConnectionAbortedError() - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._closing = False tr._fatal_error = mock.Mock() tr._loop_reading() @@ -93,7 +106,7 @@ def test_loop_reading_conn_reset_lost(self): err = self.loop._proactor.recv.side_effect = ConnectionResetError() - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._closing = False tr._fatal_error = mock.Mock() tr._force_close = mock.Mock() @@ -104,7 +117,7 @@ def test_loop_reading_exception(self): err = self.loop._proactor.recv.side_effect = (OSError()) - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._fatal_error = mock.Mock() tr._loop_reading() tr._fatal_error.assert_called_with( @@ -112,19 +125,19 @@ 'Fatal read error on pipe transport') def test_write(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._loop_writing = mock.Mock() tr.write(b'data') self.assertEqual(tr._buffer, None) tr._loop_writing.assert_called_with(data=b'data') def test_write_no_data(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr.write(b'') self.assertFalse(tr._buffer) def test_write_more(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._write_fut = mock.Mock() tr._loop_writing = mock.Mock() tr.write(b'data') @@ -132,7 +145,7 @@ self.assertFalse(tr._loop_writing.called) def test_loop_writing(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._buffer = bytearray(b'data') tr._loop_writing() self.loop._proactor.send.assert_called_with(self.sock, b'data') @@ -142,7 +155,7 @@ @mock.patch('asyncio.proactor_events.logger') def test_loop_writing_err(self, m_log): err = self.loop._proactor.send.side_effect = OSError() - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._fatal_error = mock.Mock() tr._buffer = [b'da', b'ta'] tr._loop_writing() @@ -163,7 +176,7 @@ fut = asyncio.Future(loop=self.loop) fut.set_result(b'data') - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._write_fut = fut tr._loop_writing(fut) self.assertIsNone(tr._write_fut) @@ -172,7 +185,7 @@ fut = asyncio.Future(loop=self.loop) fut.set_result(1) - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._write_fut = fut tr.close() tr._loop_writing(fut) @@ -181,13 +194,13 @@ self.protocol.connection_lost.assert_called_with(None) def test_abort(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._force_close = mock.Mock() tr.abort() tr._force_close.assert_called_with(None) def test_close(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr.close() test_utils.run_briefly(self.loop) 
self.protocol.connection_lost.assert_called_with(None) @@ -200,14 +213,14 @@ self.assertFalse(self.protocol.connection_lost.called) def test_close_write_fut(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._write_fut = mock.Mock() tr.close() test_utils.run_briefly(self.loop) self.assertFalse(self.protocol.connection_lost.called) def test_close_buffer(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._buffer = [b'data'] tr.close() test_utils.run_briefly(self.loop) @@ -215,14 +228,14 @@ @mock.patch('asyncio.base_events.logger') def test_fatal_error(self, m_logging): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._force_close = mock.Mock() tr._fatal_error(None) self.assertTrue(tr._force_close.called) self.assertTrue(m_logging.error.called) def test_force_close(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._buffer = [b'data'] read_fut = tr._read_fut = mock.Mock() write_fut = tr._write_fut = mock.Mock() @@ -236,14 +249,14 @@ self.assertEqual(tr._conn_lost, 1) def test_force_close_idempotent(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._closing = True tr._force_close(None) test_utils.run_briefly(self.loop) self.assertFalse(self.protocol.connection_lost.called) def test_fatal_error_2(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._buffer = [b'data'] tr._force_close(None) @@ -252,14 +265,13 @@ self.assertEqual(None, tr._buffer) def test_call_connection_lost(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() tr._call_connection_lost(None) self.assertTrue(self.protocol.connection_lost.called) self.assertTrue(self.sock.close.called) def test_write_eof(self): - tr = _ProactorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.assertTrue(tr.can_write_eof()) tr.write_eof() self.sock.shutdown.assert_called_with(socket.SHUT_WR) @@ -268,7 +280,7 @@ tr.close() def test_write_eof_buffer(self): - tr = _ProactorSocketTransport(self.loop, self.sock, self.protocol) + tr = self.socket_transport() f = asyncio.Future(loop=self.loop) tr._loop._proactor.send.return_value = f tr.write(b'data') @@ -312,11 +324,10 @@ self.assertFalse(tr.can_write_eof()) with self.assertRaises(NotImplementedError): tr.write_eof() - tr.close() + close_transport(tr) def test_pause_resume_reading(self): - tr = _ProactorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() futures = [] for msg in [b'data1', b'data2', b'data3', b'data4', b'']: f = asyncio.Future(loop=self.loop) @@ -344,10 +355,7 @@ def pause_writing_transport(self, high): - tr = _ProactorSocketTransport( - self.loop, self.sock, self.protocol) - self.addCleanup(tr.close) - + tr = self.socket_transport() tr.set_write_buffer_limits(high=high) self.assertEqual(tr.get_write_buffer_size(), 0) @@ -438,7 +446,7 @@ return (self.ssock, self.csock) self.loop = EventLoop(self.proactor) - self.set_event_loop(self.loop, cleanup=False) + self.set_event_loop(self.loop) @mock.patch.object(BaseProactorEventLoop, 'call_soon') @mock.patch.object(BaseProactorEventLoop, '_socketpair') @@ -450,6 +458,7 @@ self.assertIs(loop._csock, csock) self.assertEqual(loop._internal_fds, 1) 
call_soon.assert_called_with(loop._loop_self_reading) + loop.close() def test_close_self_pipe(self): self.loop._close_self_pipe() @@ -459,6 +468,9 @@ self.assertIsNone(self.loop._ssock) self.assertIsNone(self.loop._csock) + # Don't call close(): _close_self_pipe() cannot be called twice + self.loop._closed = True + def test_close(self): self.loop._close_self_pipe = mock.Mock() self.loop.close() @@ -493,6 +505,7 @@ def test_make_socket_transport(self): tr = self.loop._make_socket_transport(self.sock, asyncio.Protocol()) self.assertIsInstance(tr, _ProactorSocketTransport) + close_transport(tr) def test_loop_self_reading(self): self.loop._loop_self_reading() diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -24,6 +24,11 @@ class TestBaseSelectorEventLoop(BaseSelectorEventLoop): + def close(self): + # Don't call the close() method of the parent class, because the + # selector is mocked + self._closed = True + def _make_self_pipe(self): self._ssock = mock.Mock() self._csock = mock.Mock() @@ -34,19 +39,29 @@ return bytearray().join(l) +def close_transport(transport): + # Don't call transport.close() because the event loop and the selector + # are mocked + if transport._sock is None: + return + transport._sock.close() + transport._sock = None + + class BaseSelectorEventLoopTests(test_utils.TestCase): def setUp(self): self.selector = mock.Mock() self.selector.select.return_value = [] self.loop = TestBaseSelectorEventLoop(self.selector) - self.set_event_loop(self.loop, cleanup=False) + self.set_event_loop(self.loop) def test_make_socket_transport(self): m = mock.Mock() self.loop.add_reader = mock.Mock() transport = self.loop._make_socket_transport(m, asyncio.Protocol()) self.assertIsInstance(transport, _SelectorSocketTransport) + close_transport(transport) @unittest.skipIf(ssl is None, 'No ssl module') def test_make_ssl_transport(self): @@ -59,11 +74,19 @@ with test_utils.disable_logger(): transport = self.loop._make_ssl_transport( m, asyncio.Protocol(), m, waiter) + # execute the handshake while the logger is disabled + # to ignore SSL handshake failure + test_utils.run_briefly(self.loop) + # Sanity check class_name = transport.__class__.__name__ self.assertIn("ssl", class_name.lower()) self.assertIn("transport", class_name.lower()) + transport.close() + # execute pending callbacks to close the socket transport + test_utils.run_briefly(self.loop) + @mock.patch('asyncio.selector_events.ssl', None) @mock.patch('asyncio.sslproto.ssl', None) def test_make_ssl_transport_without_ssl_error(self): @@ -76,6 +99,15 @@ self.loop._make_ssl_transport(m, m, m, m) def test_close(self): + class EventLoop(BaseSelectorEventLoop): + def _make_self_pipe(self): + self._ssock = mock.Mock() + self._csock = mock.Mock() + self._internal_fds += 1 + + self.loop = EventLoop(self.selector) + self.set_event_loop(self.loop) + ssock = self.loop._ssock ssock.fileno.return_value = 7 csock = self.loop._csock @@ -636,21 +668,27 @@ self.sock = mock.Mock(socket.socket) self.sock.fileno.return_value = 7 + def create_transport(self): + transport = _SelectorTransport(self.loop, self.sock, self.protocol, + None) + self.addCleanup(close_transport, transport) + return transport + def test_ctor(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() self.assertIs(tr._loop, self.loop) self.assertIs(tr._sock, self.sock) 
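The refactoring running through the surrounding hunks replaces direct transport construction with small factory methods that register a close helper via addCleanup(), so transports built on mocked loops and selectors cannot leak their sockets. A self-contained sketch of that pattern (not part of the patch itself); FakeTransport and the test names are invented stand-ins for the real asyncio classes used above:

    import unittest
    from unittest import mock


    class FakeTransport:
        # Hypothetical stand-in for the mocked asyncio transports in these tests.
        def __init__(self, sock):
            self._sock = sock


    def close_transport(transport):
        # Close only the underlying socket: the event loop and the selector
        # are mocks, so transport.close() cannot be used here.
        if transport._sock is None:
            return
        transport._sock.close()
        transport._sock = None


    class ExampleTests(unittest.TestCase):
        def socket_transport(self):
            transport = FakeTransport(mock.Mock())
            # Registered cleanups run even when the test fails.
            self.addCleanup(close_transport, transport)
            return transport

        def test_factory_returns_open_transport(self):
            tr = self.socket_transport()
            self.assertIsNotNone(tr._sock)


    if __name__ == '__main__':
        unittest.main()

Centralizing construction in one factory also means a later change to the transport signature (for example the waiter argument) only has to be made in one place, which is what the hunks above rely on.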
self.assertIs(tr._sock_fd, 7) def test_abort(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() tr._force_close = mock.Mock() tr.abort() tr._force_close.assert_called_with(None) def test_close(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() tr.close() self.assertTrue(tr._closing) @@ -663,7 +701,7 @@ self.assertEqual(1, self.loop.remove_reader_count[7]) def test_close_write_buffer(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() tr._buffer.extend(b'data') tr.close() @@ -672,7 +710,7 @@ self.assertFalse(self.protocol.connection_lost.called) def test_force_close(self): - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() tr._buffer.extend(b'1') self.loop.add_reader(7, mock.sentinel) self.loop.add_writer(7, mock.sentinel) @@ -691,7 +729,7 @@ @mock.patch('asyncio.log.logger.error') def test_fatal_error(self, m_exc): exc = OSError() - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() tr._force_close = mock.Mock() tr._fatal_error(exc) @@ -704,7 +742,7 @@ def test_connection_lost(self): exc = OSError() - tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) + tr = self.create_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) tr._call_connection_lost(exc) @@ -725,9 +763,14 @@ self.sock = mock.Mock(socket.socket) self.sock_fd = self.sock.fileno.return_value = 7 + def socket_transport(self, waiter=None): + transport = _SelectorSocketTransport(self.loop, self.sock, + self.protocol, waiter=waiter) + self.addCleanup(close_transport, transport) + return transport + def test_ctor(self): - tr = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.loop.assert_reader(7, tr._read_ready) test_utils.run_briefly(self.loop) self.protocol.connection_made.assert_called_with(tr) @@ -735,14 +778,12 @@ def test_ctor_with_waiter(self): fut = asyncio.Future(loop=self.loop) - _SelectorSocketTransport( - self.loop, self.sock, self.protocol, fut) + self.socket_transport(waiter=fut) test_utils.run_briefly(self.loop) self.assertIsNone(fut.result()) def test_pause_resume_reading(self): - tr = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.assertFalse(tr._paused) self.loop.assert_reader(7, tr._read_ready) tr.pause_reading() @@ -755,8 +796,7 @@ tr.resume_reading() def test_read_ready(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() self.sock.recv.return_value = b'data' transport._read_ready() @@ -764,8 +804,7 @@ self.protocol.data_received.assert_called_with(b'data') def test_read_ready_eof(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.close = mock.Mock() self.sock.recv.return_value = b'' @@ -775,8 +814,7 @@ transport.close.assert_called_with() def test_read_ready_eof_keep_open(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.close = mock.Mock() self.sock.recv.return_value = b'' @@ -790,8 +828,7 @@ def test_read_ready_tryagain(self, m_exc): self.sock.recv.side_effect = BlockingIOError - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = 
self.socket_transport() transport._fatal_error = mock.Mock() transport._read_ready() @@ -801,8 +838,7 @@ def test_read_ready_tryagain_interrupted(self, m_exc): self.sock.recv.side_effect = InterruptedError - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._fatal_error = mock.Mock() transport._read_ready() @@ -812,8 +848,7 @@ def test_read_ready_conn_reset(self, m_exc): err = self.sock.recv.side_effect = ConnectionResetError() - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._force_close = mock.Mock() with test_utils.disable_logger(): transport._read_ready() @@ -823,8 +858,7 @@ def test_read_ready_err(self, m_exc): err = self.sock.recv.side_effect = OSError() - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._fatal_error = mock.Mock() transport._read_ready() @@ -836,8 +870,7 @@ data = b'data' self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.sock.send.assert_called_with(data) @@ -845,8 +878,7 @@ data = bytearray(b'data') self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.sock.send.assert_called_with(data) self.assertEqual(data, bytearray(b'data')) # Hasn't been mutated. @@ -855,22 +887,19 @@ data = memoryview(b'data') self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.sock.send.assert_called_with(data) def test_write_no_data(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(b'data') transport.write(b'') self.assertFalse(self.sock.send.called) self.assertEqual(list_to_buffer([b'data']), transport._buffer) def test_write_buffer(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(b'data1') transport.write(b'data2') self.assertFalse(self.sock.send.called) @@ -881,8 +910,7 @@ data = b'data' self.sock.send.return_value = 2 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -892,8 +920,7 @@ data = bytearray(b'data') self.sock.send.return_value = 2 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -904,8 +931,7 @@ data = memoryview(b'data') self.sock.send.return_value = 2 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -916,8 +942,7 @@ self.sock.send.return_value = 0 self.sock.fileno.return_value = 7 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -927,8 +952,7 @@ self.sock.send.side_effect = BlockingIOError data = b'data' 
- transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.write(data) self.loop.assert_writer(7, transport._write_ready) @@ -939,8 +963,7 @@ err = self.sock.send.side_effect = OSError() data = b'data' - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._fatal_error = mock.Mock() transport.write(data) transport._fatal_error.assert_called_with( @@ -959,13 +982,11 @@ m_log.warning.assert_called_with('socket.send() raised exception.') def test_write_str(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() self.assertRaises(TypeError, transport.write, 'str') def test_write_closing(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.close() self.assertEqual(transport._conn_lost, 1) transport.write(b'data') @@ -975,8 +996,7 @@ data = b'data' self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() @@ -987,8 +1007,7 @@ data = b'data' self.sock.send.return_value = len(data) - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._closing = True transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) @@ -999,8 +1018,7 @@ self.protocol.connection_lost.assert_called_with(None) def test_write_ready_no_data(self): - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() # This is an internal error. 
self.assertRaises(AssertionError, transport._write_ready) @@ -1008,8 +1026,7 @@ data = b'data' self.sock.send.return_value = 2 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() @@ -1020,8 +1037,7 @@ data = b'data' self.sock.send.return_value = 0 - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() @@ -1031,8 +1047,7 @@ def test_write_ready_tryagain(self): self.sock.send.side_effect = BlockingIOError - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._buffer = list_to_buffer([b'data1', b'data2']) self.loop.add_writer(7, transport._write_ready) transport._write_ready() @@ -1043,8 +1058,7 @@ def test_write_ready_exception(self): err = self.sock.send.side_effect = OSError() - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport._fatal_error = mock.Mock() transport._buffer.extend(b'data') transport._write_ready() @@ -1057,16 +1071,14 @@ self.sock.send.side_effect = OSError() remove_writer = self.loop.remove_writer = mock.Mock() - transport = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + transport = self.socket_transport() transport.close() transport._buffer.extend(b'data') transport._write_ready() remove_writer.assert_called_with(self.sock_fd) def test_write_eof(self): - tr = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.assertTrue(tr.can_write_eof()) tr.write_eof() self.sock.shutdown.assert_called_with(socket.SHUT_WR) @@ -1075,8 +1087,7 @@ tr.close() def test_write_eof_buffer(self): - tr = _SelectorSocketTransport( - self.loop, self.sock, self.protocol) + tr = self.socket_transport() self.sock.send.side_effect = BlockingIOError tr.write(b'data') tr.write_eof() @@ -1103,9 +1114,15 @@ self.sslcontext = mock.Mock() self.sslcontext.wrap_socket.return_value = self.sslsock + def ssl_transport(self, waiter=None, server_hostname=None): + transport = _SelectorSslTransport(self.loop, self.sock, self.protocol, + self.sslcontext, waiter=waiter, + server_hostname=server_hostname) + self.addCleanup(close_transport, transport) + return transport + def _make_one(self, create_waiter=None): - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext) + transport = self.ssl_transport() self.sock.reset_mock() self.sslsock.reset_mock() self.sslcontext.reset_mock() @@ -1114,9 +1131,7 @@ def test_on_handshake(self): waiter = asyncio.Future(loop=self.loop) - tr = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, - waiter=waiter) + tr = self.ssl_transport(waiter=waiter) self.assertTrue(self.sslsock.do_handshake.called) self.loop.assert_reader(1, tr._read_ready) test_utils.run_briefly(self.loop) @@ -1125,15 +1140,13 @@ def test_on_handshake_reader_retry(self): self.loop.set_debug(False) self.sslsock.do_handshake.side_effect = ssl.SSLWantReadError - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext) + transport = self.ssl_transport() self.loop.assert_reader(1, transport._on_handshake, None) def test_on_handshake_writer_retry(self): self.loop.set_debug(False) 
self.sslsock.do_handshake.side_effect = ssl.SSLWantWriteError - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext) + transport = self.ssl_transport() self.loop.assert_writer(1, transport._on_handshake, None) def test_on_handshake_exc(self): @@ -1141,16 +1154,14 @@ self.sslsock.do_handshake.side_effect = exc with test_utils.disable_logger(): waiter = asyncio.Future(loop=self.loop) - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, waiter) + transport = self.ssl_transport(waiter=waiter) self.assertTrue(waiter.done()) self.assertIs(exc, waiter.exception()) self.assertTrue(self.sslsock.close.called) def test_on_handshake_base_exc(self): waiter = asyncio.Future(loop=self.loop) - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, waiter) + transport = self.ssl_transport(waiter=waiter) exc = BaseException() self.sslsock.do_handshake.side_effect = exc with test_utils.disable_logger(): @@ -1163,8 +1174,7 @@ # Python issue #23197: cancelling an handshake must not raise an # exception or log an error, even if the handshake failed waiter = asyncio.Future(loop=self.loop) - transport = _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, waiter) + transport = self.ssl_transport(waiter=waiter) waiter.cancel() exc = ValueError() self.sslsock.do_handshake.side_effect = exc @@ -1423,9 +1433,7 @@ @unittest.skipIf(ssl is None, 'No SSL support') def test_server_hostname(self): - _SelectorSslTransport( - self.loop, self.sock, self.protocol, self.sslcontext, - server_hostname='localhost') + self.ssl_transport(server_hostname='localhost') self.sslcontext.wrap_socket.assert_called_with( self.sock, do_handshake_on_connect=False, server_side=False, server_hostname='localhost') @@ -1448,9 +1456,15 @@ self.sock = mock.Mock(spec_set=socket.socket) self.sock.fileno.return_value = 7 + def datagram_transport(self, address=None): + transport = _SelectorDatagramTransport(self.loop, self.sock, + self.protocol, + address=address) + self.addCleanup(close_transport, transport) + return transport + def test_read_ready(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() self.sock.recvfrom.return_value = (b'data', ('0.0.0.0', 1234)) transport._read_ready() @@ -1459,8 +1473,7 @@ b'data', ('0.0.0.0', 1234)) def test_read_ready_tryagain(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() self.sock.recvfrom.side_effect = BlockingIOError transport._fatal_error = mock.Mock() @@ -1469,8 +1482,7 @@ self.assertFalse(transport._fatal_error.called) def test_read_ready_err(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() err = self.sock.recvfrom.side_effect = RuntimeError() transport._fatal_error = mock.Mock() @@ -1481,8 +1493,7 @@ 'Fatal read error on datagram transport') def test_read_ready_oserr(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() err = self.sock.recvfrom.side_effect = OSError() transport._fatal_error = mock.Mock() @@ -1493,8 +1504,7 @@ def test_sendto(self): data = b'data' - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport.sendto(data, ('0.0.0.0', 1234)) self.assertTrue(self.sock.sendto.called) 
self.assertEqual( @@ -1502,8 +1512,7 @@ def test_sendto_bytearray(self): data = bytearray(b'data') - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport.sendto(data, ('0.0.0.0', 1234)) self.assertTrue(self.sock.sendto.called) self.assertEqual( @@ -1511,16 +1520,14 @@ def test_sendto_memoryview(self): data = memoryview(b'data') - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport.sendto(data, ('0.0.0.0', 1234)) self.assertTrue(self.sock.sendto.called) self.assertEqual( self.sock.sendto.call_args[0], (data, ('0.0.0.0', 1234))) def test_sendto_no_data(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((b'data', ('0.0.0.0', 12345))) transport.sendto(b'', ()) self.assertFalse(self.sock.sendto.called) @@ -1528,8 +1535,7 @@ [(b'data', ('0.0.0.0', 12345))], list(transport._buffer)) def test_sendto_buffer(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((b'data1', ('0.0.0.0', 12345))) transport.sendto(b'data2', ('0.0.0.0', 12345)) self.assertFalse(self.sock.sendto.called) @@ -1540,8 +1546,7 @@ def test_sendto_buffer_bytearray(self): data2 = bytearray(b'data2') - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((b'data1', ('0.0.0.0', 12345))) transport.sendto(data2, ('0.0.0.0', 12345)) self.assertFalse(self.sock.sendto.called) @@ -1553,8 +1558,7 @@ def test_sendto_buffer_memoryview(self): data2 = memoryview(b'data2') - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((b'data1', ('0.0.0.0', 12345))) transport.sendto(data2, ('0.0.0.0', 12345)) self.assertFalse(self.sock.sendto.called) @@ -1569,8 +1573,7 @@ self.sock.sendto.side_effect = BlockingIOError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport.sendto(data, ('0.0.0.0', 12345)) self.loop.assert_writer(7, transport._sendto_ready) @@ -1582,8 +1585,7 @@ data = b'data' err = self.sock.sendto.side_effect = RuntimeError() - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._fatal_error = mock.Mock() transport.sendto(data, ()) @@ -1606,8 +1608,7 @@ self.sock.sendto.side_effect = ConnectionRefusedError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._fatal_error = mock.Mock() transport.sendto(data, ()) @@ -1619,8 +1620,7 @@ self.sock.send.side_effect = ConnectionRefusedError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol, ('0.0.0.0', 1)) + transport = self.datagram_transport(address=('0.0.0.0', 1)) transport._fatal_error = mock.Mock() transport.sendto(data) @@ -1628,19 +1628,16 @@ self.assertTrue(self.protocol.error_received.called) def test_sendto_str(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() self.assertRaises(TypeError, transport.sendto, 'str', ()) def test_sendto_connected_addr(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, 
self.protocol, ('0.0.0.0', 1)) + transport = self.datagram_transport(address=('0.0.0.0', 1)) self.assertRaises( ValueError, transport.sendto, b'str', ('0.0.0.0', 2)) def test_sendto_closing(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol, address=(1,)) + transport = self.datagram_transport(address=(1,)) transport.close() self.assertEqual(transport._conn_lost, 1) transport.sendto(b'data', (1,)) @@ -1650,8 +1647,7 @@ data = b'data' self.sock.sendto.return_value = len(data) - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.append((data, ('0.0.0.0', 12345))) self.loop.add_writer(7, transport._sendto_ready) transport._sendto_ready() @@ -1664,8 +1660,7 @@ data = b'data' self.sock.send.return_value = len(data) - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._closing = True transport._buffer.append((data, ())) self.loop.add_writer(7, transport._sendto_ready) @@ -1676,8 +1671,7 @@ self.protocol.connection_lost.assert_called_with(None) def test_sendto_ready_no_data(self): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() self.loop.add_writer(7, transport._sendto_ready) transport._sendto_ready() self.assertFalse(self.sock.sendto.called) @@ -1686,8 +1680,7 @@ def test_sendto_ready_tryagain(self): self.sock.sendto.side_effect = BlockingIOError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._buffer.extend([(b'data1', ()), (b'data2', ())]) self.loop.add_writer(7, transport._sendto_ready) transport._sendto_ready() @@ -1700,8 +1693,7 @@ def test_sendto_ready_exception(self): err = self.sock.sendto.side_effect = RuntimeError() - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._fatal_error = mock.Mock() transport._buffer.append((b'data', ())) transport._sendto_ready() @@ -1713,8 +1705,7 @@ def test_sendto_ready_error_received(self): self.sock.sendto.side_effect = ConnectionRefusedError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol) + transport = self.datagram_transport() transport._fatal_error = mock.Mock() transport._buffer.append((b'data', ())) transport._sendto_ready() @@ -1724,8 +1715,7 @@ def test_sendto_ready_error_received_connection(self): self.sock.send.side_effect = ConnectionRefusedError - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol, ('0.0.0.0', 1)) + transport = self.datagram_transport(address=('0.0.0.0', 1)) transport._fatal_error = mock.Mock() transport._buffer.append((b'data', ())) transport._sendto_ready() @@ -1735,8 +1725,7 @@ @mock.patch('asyncio.base_events.logger.error') def test_fatal_error_connected(self, m_exc): - transport = _SelectorDatagramTransport( - self.loop, self.sock, self.protocol, ('0.0.0.0', 1)) + transport = self.datagram_transport(address=('0.0.0.0', 1)) err = ConnectionRefusedError() transport._fatal_error(err) self.assertFalse(self.protocol.error_received.called) @@ -1744,7 +1733,6 @@ test_utils.MockPattern( 'Fatal error on transport\nprotocol:.*\ntransport:.*'), exc_info=(ConnectionRefusedError, MOCK_ANY, MOCK_ANY)) - transport.close() if __name__ == '__main__': diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- 
a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -26,6 +26,15 @@ MOCK_ANY = mock.ANY +def close_pipe_transport(transport): + # Don't call transport.close() because the event loop and the selector + # are mocked + if transport._pipe is None: + return + transport._pipe.close() + transport._pipe = None + + @unittest.skipUnless(signal, 'Signals are not supported') class SelectorEventLoopSignalTests(test_utils.TestCase): @@ -333,24 +342,28 @@ m_fstat.return_value = st self.addCleanup(fstat_patcher.stop) + def read_pipe_transport(self, waiter=None): + transport = unix_events._UnixReadPipeTransport(self.loop, self.pipe, + self.protocol, + waiter=waiter) + self.addCleanup(close_pipe_transport, transport) + return transport + def test_ctor(self): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() self.loop.assert_reader(5, tr._read_ready) test_utils.run_briefly(self.loop) self.protocol.connection_made.assert_called_with(tr) def test_ctor_with_waiter(self): fut = asyncio.Future(loop=self.loop) - unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol, fut) + tr = self.read_pipe_transport(waiter=fut) test_utils.run_briefly(self.loop) self.assertIsNone(fut.result()) @mock.patch('os.read') def test__read_ready(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() m_read.return_value = b'data' tr._read_ready() @@ -359,8 +372,7 @@ @mock.patch('os.read') def test__read_ready_eof(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() m_read.return_value = b'' tr._read_ready() @@ -372,8 +384,7 @@ @mock.patch('os.read') def test__read_ready_blocked(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() m_read.side_effect = BlockingIOError tr._read_ready() @@ -384,8 +395,7 @@ @mock.patch('asyncio.log.logger.error') @mock.patch('os.read') def test__read_ready_error(self, m_read, m_logexc): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() err = OSError() m_read.side_effect = err tr._close = mock.Mock() @@ -401,9 +411,7 @@ @mock.patch('os.read') def test_pause_reading(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.read_pipe_transport() m = mock.Mock() self.loop.add_reader(5, m) tr.pause_reading() @@ -411,26 +419,20 @@ @mock.patch('os.read') def test_resume_reading(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.read_pipe_transport() tr.resume_reading() self.loop.assert_reader(5, tr._read_ready) @mock.patch('os.read') def test_close(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.read_pipe_transport() tr._close = mock.Mock() tr.close() tr._close.assert_called_with(None) @mock.patch('os.read') def test_close_already_closing(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.read_pipe_transport() tr._closing = True tr._close = mock.Mock() tr.close() @@ -438,9 +440,7 @@ @mock.patch('os.read') def test__close(self, m_read): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.read_pipe_transport() err = object() 
tr._close(err) self.assertTrue(tr._closing) @@ -449,8 +449,7 @@ self.protocol.connection_lost.assert_called_with(err) def test__call_connection_lost(self): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) @@ -463,8 +462,7 @@ self.assertIsNone(tr._loop) def test__call_connection_lost_with_err(self): - tr = unix_events._UnixReadPipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.read_pipe_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) @@ -496,31 +494,33 @@ m_fstat.return_value = st self.addCleanup(fstat_patcher.stop) + def write_pipe_transport(self, waiter=None): + transport = unix_events._UnixWritePipeTransport(self.loop, self.pipe, + self.protocol, + waiter=waiter) + self.addCleanup(close_pipe_transport, transport) + return transport + def test_ctor(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() self.loop.assert_reader(5, tr._read_ready) test_utils.run_briefly(self.loop) self.protocol.connection_made.assert_called_with(tr) def test_ctor_with_waiter(self): fut = asyncio.Future(loop=self.loop) - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol, fut) + tr = self.write_pipe_transport(waiter=fut) self.loop.assert_reader(5, tr._read_ready) test_utils.run_briefly(self.loop) self.assertEqual(None, fut.result()) def test_can_write_eof(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() self.assertTrue(tr.can_write_eof()) @mock.patch('os.write') def test_write(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() m_write.return_value = 4 tr.write(b'data') m_write.assert_called_with(5, b'data') @@ -529,9 +529,7 @@ @mock.patch('os.write') def test_write_no_data(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() tr.write(b'') self.assertFalse(m_write.called) self.assertFalse(self.loop.writers) @@ -539,9 +537,7 @@ @mock.patch('os.write') def test_write_partial(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() m_write.return_value = 2 tr.write(b'data') m_write.assert_called_with(5, b'data') @@ -550,9 +546,7 @@ @mock.patch('os.write') def test_write_buffer(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'previous'] tr.write(b'data') @@ -562,9 +556,7 @@ @mock.patch('os.write') def test_write_again(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() m_write.side_effect = BlockingIOError() tr.write(b'data') m_write.assert_called_with(5, b'data') @@ -574,9 +566,7 @@ @mock.patch('asyncio.unix_events.logger') @mock.patch('os.write') def test_write_err(self, m_write, m_log): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() err = OSError() m_write.side_effect = err tr._fatal_error = mock.Mock() @@ -602,8 +592,7 @@ @mock.patch('os.write') def test_write_close(self, m_write): - tr = unix_events._UnixWritePipeTransport( - 
self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() tr._read_ready() # pipe was closed by peer tr.write(b'data') @@ -612,8 +601,7 @@ self.assertEqual(tr._conn_lost, 2) def test__read_ready(self): - tr = unix_events._UnixWritePipeTransport(self.loop, self.pipe, - self.protocol) + tr = self.write_pipe_transport() tr._read_ready() self.assertFalse(self.loop.readers) self.assertFalse(self.loop.writers) @@ -623,8 +611,7 @@ @mock.patch('os.write') def test__write_ready(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.return_value = 4 @@ -635,9 +622,7 @@ @mock.patch('os.write') def test__write_ready_partial(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.return_value = 3 @@ -648,9 +633,7 @@ @mock.patch('os.write') def test__write_ready_again(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.side_effect = BlockingIOError() @@ -661,9 +644,7 @@ @mock.patch('os.write') def test__write_ready_empty(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.return_value = 0 @@ -675,9 +656,7 @@ @mock.patch('asyncio.log.logger.error') @mock.patch('os.write') def test__write_ready_err(self, m_write, m_logexc): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._buffer = [b'da', b'ta'] m_write.side_effect = err = OSError() @@ -698,9 +677,7 @@ @mock.patch('os.write') def test__write_ready_closing(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) tr._closing = True tr._buffer = [b'da', b'ta'] @@ -715,9 +692,7 @@ @mock.patch('os.write') def test_abort(self, m_write): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() self.loop.add_writer(5, tr._write_ready) self.loop.add_reader(5, tr._read_ready) tr._buffer = [b'da', b'ta'] @@ -731,8 +706,7 @@ self.protocol.connection_lost.assert_called_with(None) def test__call_connection_lost(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) @@ -745,8 +719,7 @@ self.assertIsNone(tr._loop) def test__call_connection_lost_with_err(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() self.assertIsNotNone(tr._protocol) self.assertIsNotNone(tr._loop) @@ -759,26 +732,23 @@ self.assertIsNone(tr._loop) def test_close(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() tr.write_eof = mock.Mock() tr.close() tr.write_eof.assert_called_with() + # closing the transport twice must not fail + tr.close() + def test_close_closing(self): - tr = 
unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() tr.write_eof = mock.Mock() tr._closing = True tr.close() self.assertFalse(tr.write_eof.called) def test_write_eof(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) - + tr = self.write_pipe_transport() tr.write_eof() self.assertTrue(tr._closing) self.assertFalse(self.loop.readers) @@ -786,8 +756,7 @@ self.protocol.connection_lost.assert_called_with(None) def test_write_eof_pending(self): - tr = unix_events._UnixWritePipeTransport( - self.loop, self.pipe, self.protocol) + tr = self.write_pipe_transport() tr._buffer = [b'data'] tr.write_eof() self.assertTrue(tr._closing) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 13:42:38 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 12:42:38 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogRml4?= =?utf-8?b?IF9Qcm9hY3RvckJhc2VQaXBlVHJhbnNwb3J0Ll9fcmVwcl9fKCk=?= Message-ID: <20150115124236.11579.95428@psf.io> https://hg.python.org/cpython/rev/063a34c7bf77 changeset: 94171:063a34c7bf77 branch: 3.4 parent: 94169:f9b127188d43 user: Victor Stinner date: Thu Jan 15 13:32:28 2015 +0100 summary: asyncio: Fix _ProactorBasePipeTransport.__repr__() Check if the _sock attribute is None to check if the transport is closed. files: Lib/asyncio/proactor_events.py | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -43,12 +43,12 @@ def __repr__(self): info = [self.__class__.__name__] - fd = self._sock.fileno() - if fd < 0: + if self._sock is None: info.append('closed') elif self._closing: info.append('closing') - info.append('fd=%s' % fd) + if self._sock is not None: + info.append('fd=%s' % self._sock.fileno()) if self._read_fut is not None: info.append('read=%s' % self._read_fut) if self._write_fut is not None: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 13:42:38 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 12:42:38 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150115124236.22423.35150@psf.io> https://hg.python.org/cpython/rev/831d71f2811d changeset: 94173:831d71f2811d parent: 94170:dc21d72d0f87 parent: 94172:a35b790a2db3 user: Victor Stinner date: Thu Jan 15 13:41:01 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/proactor_events.py | 10 ++++++---- 1 files changed, 6 insertions(+), 4 deletions(-) diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -43,12 +43,12 @@ def __repr__(self): info = [self.__class__.__name__] - fd = self._sock.fileno() - if fd < 0: + if self._sock is None: info.append('closed') elif self._closing: info.append('closing') - info.append('fd=%s' % fd) + if self._sock is not None: + info.append('fd=%s' % self._sock.fileno()) if self._read_fut is not None: info.append('read=%s' % self._read_fut) if self._write_fut is not None: @@ -72,6 +72,7 @@ self._loop.call_soon(self._call_connection_lost, None) if self._read_fut is not None: self._read_fut.cancel() + self._read_fut = None def _fatal_error(self, exc, message='Fatal error on pipe transport'): if 
isinstance(exc, (BrokenPipeError, ConnectionResetError)): @@ -93,9 +94,10 @@ self._conn_lost += 1 if self._write_fut: self._write_fut.cancel() + self._write_fut = None if self._read_fut: self._read_fut.cancel() - self._write_fut = self._read_fut = None + self._read_fut = None self._pending_write = 0 self._buffer = None self._loop.call_soon(self._call_connection_lost, exc) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 13:42:38 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 12:42:38 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogRml4?= =?utf-8?q?_=5FProactorBasePipeTransport=2Eclose=28=29?= Message-ID: <20150115124236.22421.3089@psf.io> https://hg.python.org/cpython/rev/a35b790a2db3 changeset: 94172:a35b790a2db3 branch: 3.4 user: Victor Stinner date: Thu Jan 15 13:40:27 2015 +0100 summary: asyncio: Fix _ProactorBasePipeTransport.close() Set the _read_fut attribute to None after cancelling it. This change should fix a race condition with _ProactorWritePipeTransport._pipe_closed(). files: Lib/asyncio/proactor_events.py | 4 +++- 1 files changed, 3 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -72,6 +72,7 @@ self._loop.call_soon(self._call_connection_lost, None) if self._read_fut is not None: self._read_fut.cancel() + self._read_fut = None def _fatal_error(self, exc, message='Fatal error on pipe transport'): if isinstance(exc, (BrokenPipeError, ConnectionResetError)): @@ -93,9 +94,10 @@ self._conn_lost += 1 if self._write_fut: self._write_fut.cancel() + self._write_fut = None if self._read_fut: self._read_fut.cancel() - self._write_fut = self._read_fut = None + self._read_fut = None self._pending_write = 0 self._buffer = None self._loop.call_soon(self._call_connection_lost, exc) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 14:26:29 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 13:26:29 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogQ2xv?= =?utf-8?q?se_the_transport_on_subprocess_creation_failure?= Message-ID: <20150115132626.125904.51748@psf.io> https://hg.python.org/cpython/rev/6bace35f55e3 changeset: 94174:6bace35f55e3 branch: 3.4 parent: 94172:a35b790a2db3 user: Victor Stinner date: Thu Jan 15 14:24:22 2015 +0100 summary: asyncio: Close the transport on subprocess creation failure files: Lib/asyncio/unix_events.py | 6 +++++- Lib/asyncio/windows_events.py | 7 ++++++- 2 files changed, 11 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -177,7 +177,11 @@ transp = _UnixSubprocessTransport(self, protocol, args, shell, stdin, stdout, stderr, bufsize, extra=extra, **kwargs) - yield from transp._post_init() + try: + yield from transp._post_init() + except: + transp.close() + raise watcher.add_child_handler(transp.get_pid(), self._child_watcher_callback, transp) diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -272,7 +272,12 @@ transp = _WindowsSubprocessTransport(self, protocol, args, shell, stdin, stdout, stderr, bufsize, extra=extra, **kwargs) - yield from transp._post_init() + try: + yield from transp._post_init() + except: + 
transp.close() + raise + return transp -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 14:26:29 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 13:26:29 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogYXN5bmNpbzogQ2xv?= =?utf-8?q?se_transports_in_tests?= Message-ID: <20150115132626.22407.40933@psf.io> https://hg.python.org/cpython/rev/9dfd33f3657f changeset: 94175:9dfd33f3657f branch: 3.4 user: Victor Stinner date: Thu Jan 15 14:24:55 2015 +0100 summary: asyncio: Close transports in tests * Use test_utils.run_briefly() to execute pending calls to really close transports * sslproto: mock also _SSLPipe.shutdown(), it's need to close the transport * pipe test: the test doesn't close explicitly the PipeHandle, so ignore the warning instead * test_popen: use the context manager ("with p:") to explicitly close pipes files: Lib/test/test_asyncio/test_selector_events.py | 2 + Lib/test/test_asyncio/test_sslproto.py | 4 +++ Lib/test/test_asyncio/test_subprocess.py | 1 + Lib/test/test_asyncio/test_windows_utils.py | 11 +++++++-- 4 files changed, 15 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -1180,6 +1180,8 @@ self.sslsock.do_handshake.side_effect = exc with test_utils.disable_logger(): transport._on_handshake(0) + transport.close() + test_utils.run_briefly(self.loop) def test_pause_resume_reading(self): tr = self._make_one() diff --git a/Lib/test/test_asyncio/test_sslproto.py b/Lib/test/test_asyncio/test_sslproto.py --- a/Lib/test/test_asyncio/test_sslproto.py +++ b/Lib/test/test_asyncio/test_sslproto.py @@ -33,6 +33,7 @@ waiter.cancel() transport = mock.Mock() sslpipe = mock.Mock() + sslpipe.shutdown.return_value = b'' sslpipe.do_handshake.side_effect = do_handshake with mock.patch('asyncio.sslproto._SSLPipe', return_value=sslpipe): ssl_proto.connection_made(transport) @@ -40,6 +41,9 @@ with test_utils.disable_logger(): self.loop.run_until_complete(handshake_fut) + # Close the transport + ssl_proto._app_transport.close() + if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -286,6 +286,7 @@ # "Exception during subprocess creation, kill the subprocess" with test_utils.disable_logger(): self.loop.run_until_complete(cancel_make_transport()) + test_utils.run_briefly(self.loop) if sys.platform != 'win32': diff --git a/Lib/test/test_asyncio/test_windows_utils.py b/Lib/test/test_asyncio/test_windows_utils.py --- a/Lib/test/test_asyncio/test_windows_utils.py +++ b/Lib/test/test_asyncio/test_windows_utils.py @@ -3,6 +3,7 @@ import socket import sys import unittest +import warnings from unittest import mock if sys.platform != 'win32': @@ -115,8 +116,10 @@ self.assertEqual(p.handle, h) # check garbage collection of p closes handle - del p - support.gc_collect() + with warnings.catch_warnings(): + warnings.filterwarnings("ignore", "", ResourceWarning) + del p + support.gc_collect() try: _winapi.CloseHandle(h) except OSError as e: @@ -170,7 +173,9 @@ self.assertTrue(msg.upper().rstrip().startswith(out)) self.assertTrue(b"stderr".startswith(err)) - p.wait() + # The context manager calls wait() and closes resources + with p: + pass if 
__name__ == '__main__': -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 14:26:29 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 13:26:29 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150115132626.125888.6031@psf.io> https://hg.python.org/cpython/rev/78fb5c4e3129 changeset: 94176:78fb5c4e3129 parent: 94173:831d71f2811d parent: 94175:9dfd33f3657f user: Victor Stinner date: Thu Jan 15 14:25:08 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/unix_events.py | 6 ++++- Lib/asyncio/windows_events.py | 7 +++++- Lib/test/test_asyncio/test_selector_events.py | 2 + Lib/test/test_asyncio/test_sslproto.py | 4 +++ Lib/test/test_asyncio/test_subprocess.py | 1 + Lib/test/test_asyncio/test_windows_utils.py | 11 +++++++-- 6 files changed, 26 insertions(+), 5 deletions(-) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -177,7 +177,11 @@ transp = _UnixSubprocessTransport(self, protocol, args, shell, stdin, stdout, stderr, bufsize, extra=extra, **kwargs) - yield from transp._post_init() + try: + yield from transp._post_init() + except: + transp.close() + raise watcher.add_child_handler(transp.get_pid(), self._child_watcher_callback, transp) diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -272,7 +272,12 @@ transp = _WindowsSubprocessTransport(self, protocol, args, shell, stdin, stdout, stderr, bufsize, extra=extra, **kwargs) - yield from transp._post_init() + try: + yield from transp._post_init() + except: + transp.close() + raise + return transp diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -1180,6 +1180,8 @@ self.sslsock.do_handshake.side_effect = exc with test_utils.disable_logger(): transport._on_handshake(0) + transport.close() + test_utils.run_briefly(self.loop) def test_pause_resume_reading(self): tr = self._make_one() diff --git a/Lib/test/test_asyncio/test_sslproto.py b/Lib/test/test_asyncio/test_sslproto.py --- a/Lib/test/test_asyncio/test_sslproto.py +++ b/Lib/test/test_asyncio/test_sslproto.py @@ -33,6 +33,7 @@ waiter.cancel() transport = mock.Mock() sslpipe = mock.Mock() + sslpipe.shutdown.return_value = b'' sslpipe.do_handshake.side_effect = do_handshake with mock.patch('asyncio.sslproto._SSLPipe', return_value=sslpipe): ssl_proto.connection_made(transport) @@ -40,6 +41,9 @@ with test_utils.disable_logger(): self.loop.run_until_complete(handshake_fut) + # Close the transport + ssl_proto._app_transport.close() + if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -286,6 +286,7 @@ # "Exception during subprocess creation, kill the subprocess" with test_utils.disable_logger(): self.loop.run_until_complete(cancel_make_transport()) + test_utils.run_briefly(self.loop) if sys.platform != 'win32': diff --git a/Lib/test/test_asyncio/test_windows_utils.py b/Lib/test/test_asyncio/test_windows_utils.py --- a/Lib/test/test_asyncio/test_windows_utils.py +++ 
b/Lib/test/test_asyncio/test_windows_utils.py @@ -3,6 +3,7 @@ import socket import sys import unittest +import warnings from unittest import mock if sys.platform != 'win32': @@ -115,8 +116,10 @@ self.assertEqual(p.handle, h) # check garbage collection of p closes handle - del p - support.gc_collect() + with warnings.catch_warnings(): + warnings.filterwarnings("ignore", "", ResourceWarning) + del p + support.gc_collect() try: _winapi.CloseHandle(h) except OSError as e: @@ -170,7 +173,9 @@ self.assertTrue(msg.upper().rstrip().startswith(out)) self.assertTrue(b"stderr".startswith(err)) - p.wait() + # The context manager calls wait() and closes resources + with p: + pass if __name__ == '__main__': -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 16:31:09 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 15:31:09 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150115153026.125906.56311@psf.io> https://hg.python.org/cpython/rev/e05646ea3c40 changeset: 94178:e05646ea3c40 parent: 94176:78fb5c4e3129 parent: 94177:8adf1896712d user: Victor Stinner date: Thu Jan 15 16:29:23 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/tasks.py | 12 +++++-- Lib/test/test_asyncio/test_tasks.py | 27 +++++++++++++++++ 2 files changed, 35 insertions(+), 4 deletions(-) diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -347,10 +347,9 @@ it cancels the task and raises TimeoutError. To avoid the task cancellation, wrap it in shield(). - Usage: + If the wait is cancelled, the task is also cancelled. - result = yield from asyncio.wait_for(fut, 10.0) - + This function is a coroutine. 
""" if loop is None: loop = events.get_event_loop() @@ -367,7 +366,12 @@ try: # wait until the future completes or the timeout - yield from waiter + try: + yield from waiter + except futures.CancelledError: + fut.remove_done_callback(cb) + fut.cancel() + raise if fut.done(): return fut.result() diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -1705,6 +1705,33 @@ 'test_task_source_traceback')) self.loop.run_until_complete(task) + def _test_cancel_wait_for(self, timeout): + loop = asyncio.new_event_loop() + self.addCleanup(loop.close) + + @asyncio.coroutine + def blocking_coroutine(): + fut = asyncio.Future(loop=loop) + # Block: fut result is never set + yield from fut + + task = loop.create_task(blocking_coroutine()) + + wait = loop.create_task(asyncio.wait_for(task, timeout, loop=loop)) + loop.call_soon(wait.cancel) + + self.assertRaises(asyncio.CancelledError, + loop.run_until_complete, wait) + + # Python issue #23219: cancelling the wait must also cancel the task + self.assertTrue(task.cancelled()) + + def test_cancel_blocking_wait_for(self): + self._test_cancel_wait_for(None) + + def test_cancel_wait_for(self): + self._test_cancel_wait_for(60.0) + class GatherTestsBase: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 16:31:09 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 15:31:09 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogQ2xvc2VzICMyMzIx?= =?utf-8?q?9=3A_cancelling_asyncio=2Ewait=5Ffor=28=29_now_cancels_the_task?= Message-ID: <20150115153026.72565.22664@psf.io> https://hg.python.org/cpython/rev/8adf1896712d changeset: 94177:8adf1896712d branch: 3.4 parent: 94175:9dfd33f3657f user: Victor Stinner date: Thu Jan 15 16:29:10 2015 +0100 summary: Closes #23219: cancelling asyncio.wait_for() now cancels the task files: Lib/asyncio/tasks.py | 12 +++++-- Lib/test/test_asyncio/test_tasks.py | 27 +++++++++++++++++ 2 files changed, 35 insertions(+), 4 deletions(-) diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -347,10 +347,9 @@ it cancels the task and raises TimeoutError. To avoid the task cancellation, wrap it in shield(). - Usage: + If the wait is cancelled, the task is also cancelled. - result = yield from asyncio.wait_for(fut, 10.0) - + This function is a coroutine. 
""" if loop is None: loop = events.get_event_loop() @@ -367,7 +366,12 @@ try: # wait until the future completes or the timeout - yield from waiter + try: + yield from waiter + except futures.CancelledError: + fut.remove_done_callback(cb) + fut.cancel() + raise if fut.done(): return fut.result() diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -1705,6 +1705,33 @@ 'test_task_source_traceback')) self.loop.run_until_complete(task) + def _test_cancel_wait_for(self, timeout): + loop = asyncio.new_event_loop() + self.addCleanup(loop.close) + + @asyncio.coroutine + def blocking_coroutine(): + fut = asyncio.Future(loop=loop) + # Block: fut result is never set + yield from fut + + task = loop.create_task(blocking_coroutine()) + + wait = loop.create_task(asyncio.wait_for(task, timeout, loop=loop)) + loop.call_soon(wait.cancel) + + self.assertRaises(asyncio.CancelledError, + loop.run_until_complete, wait) + + # Python issue #23219: cancelling the wait must also cancel the task + self.assertTrue(task.cancelled()) + + def test_cancel_blocking_wait_for(self): + self._test_cancel_wait_for(None) + + def test_cancel_wait_for(self): + self._test_cancel_wait_for(60.0) + class GatherTestsBase: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 18:12:45 2015 From: python-checkins at python.org (steve.dower) Date: Thu, 15 Jan 2015 17:12:45 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fixes_sys=2Ewinver_generat?= =?utf-8?q?ion_and_removes_dependency_on_user32=2Edll?= Message-ID: <20150115171218.3960.25809@psf.io> https://hg.python.org/cpython/rev/36b0a5c1bc9e changeset: 94179:36b0a5c1bc9e user: Steve Dower date: Thu Jan 15 09:10:16 2015 -0800 summary: Fixes sys.winver generation and removes dependency on user32.dll files: PC/dl_nt.c | 8 ++++++++ PCbuild/pyproject.props | 2 +- PCbuild/python.props | 8 ++++++-- PCbuild/pythoncore.vcxproj | 8 +++++--- 4 files changed, 20 insertions(+), 6 deletions(-) diff --git a/PC/dl_nt.c b/PC/dl_nt.c --- a/PC/dl_nt.c +++ b/PC/dl_nt.c @@ -12,7 +12,12 @@ #include "windows.h" #ifdef Py_ENABLE_SHARED +#ifdef MS_DLL_ID +// The string is available at build, so fill the buffer immediately +char dllVersionBuffer[16] = MS_DLL_ID; +#else char dllVersionBuffer[16] = ""; // a private buffer +#endif // Python Globals HMODULE PyWin_DLLhModule = NULL; @@ -88,8 +93,11 @@ { case DLL_PROCESS_ATTACH: PyWin_DLLhModule = hInst; +#ifndef MS_DLL_ID + // If we have MS_DLL_ID, we don't need to load the string. // 1000 is a magic number I picked out of the air. Could do with a #define, I spose... LoadString(hInst, 1000, dllVersionBuffer, sizeof(dllVersionBuffer)); +#endif #if HAVE_SXS // and capture our activation context for use when loading extensions. 
diff --git a/PCbuild/pyproject.props b/PCbuild/pyproject.props --- a/PCbuild/pyproject.props +++ b/PCbuild/pyproject.props @@ -87,7 +87,7 @@ diff --git a/PCbuild/python.props b/PCbuild/python.props --- a/PCbuild/python.props +++ b/PCbuild/python.props @@ -1,7 +1,7 @@ - + - Win32 + Win32 Release .cp$(MajorVersionNumber)$(MinorVersionNumber)-win32 .cp$(MajorVersionNumber)$(MinorVersionNumber)-win_amd64 + + + $(MajorVersionNumber).$(MinorVersionNumber) + $(SysWinVer)-32 diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -67,7 +67,7 @@ /Zm200 %(AdditionalOptions) $(PySourcePath)Python;$(PySourcePath)Modules\zlib;%(AdditionalIncludeDirectories) - _USRDLL;Py_BUILD_CORE;Py_ENABLE_SHARED;%(PreprocessorDefinitions) + _USRDLL;Py_BUILD_CORE;Py_ENABLE_SHARED;MS_DLL_ID="$(SysWinVer)";%(PreprocessorDefinitions) ws2_32.lib;%(AdditionalDependencies) @@ -335,7 +335,6 @@ - @@ -387,13 +386,16 @@ + + + - + -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 18:12:45 2015 From: python-checkins at python.org (steve.dower) Date: Thu, 15 Jan 2015 17:12:45 +0000 Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_23018=3A_Add_version?= =?utf-8?q?_info_to_python=5Bw=5D=2Eexe?= Message-ID: <20150115171218.72555.4931@psf.io> https://hg.python.org/cpython/rev/3f7e483cebef changeset: 94180:3f7e483cebef user: Steve Dower date: Thu Jan 15 09:10:43 2015 -0800 summary: Issue 23018: Add version info to python[w].exe files: PC/python.manifest | 12 ++++++++ PC/python_exe.rc | 50 +++++++++++++++++++++++++++++++++- PC/python_nt.rc | 42 ++++++--------------------- PC/python_ver_rc.h | 35 +++++++++++++++++++++++ 4 files changed, 106 insertions(+), 33 deletions(-) diff --git a/PC/python.manifest b/PC/python.manifest new file mode 100644 --- /dev/null +++ b/PC/python.manifest @@ -0,0 +1,12 @@ + + + + + + + + + + + + \ No newline at end of file diff --git a/PC/python_exe.rc b/PC/python_exe.rc --- a/PC/python_exe.rc +++ b/PC/python_exe.rc @@ -1,1 +1,49 @@ -1 ICON DISCARDABLE "pycon.ico" +// Resource script for Python console EXEs. + +#include "python_ver_rc.h" + +// Include the manifest file that indicates we support all +// current versions of Windows. +#include +1 RT_MANIFEST "python.manifest" + +1 ICON DISCARDABLE "pycon.ico" + + +///////////////////////////////////////////////////////////////////////////// +// +// Version +// + +VS_VERSION_INFO VERSIONINFO + FILEVERSION PYVERSION64 + PRODUCTVERSION PYVERSION64 + FILEFLAGSMASK 0x3fL +#ifdef _DEBUG + FILEFLAGS VS_FF_DEBUG +#else + FILEFLAGS 0x0L +#endif + FILEOS VOS__WINDOWS32 + FILETYPE VFT_APP + FILESUBTYPE 0x0L +BEGIN + BLOCK "StringFileInfo" + BEGIN + BLOCK "000004b0" + BEGIN + VALUE "CompanyName", PYTHON_COMPANY "\0" + VALUE "FileDescription", "Python\0" + VALUE "FileVersion", PYTHON_VERSION + VALUE "InternalName", "Python Console\0" + VALUE "LegalCopyright", PYTHON_COPYRIGHT "\0" + VALUE "OriginalFilename", "python" PYTHON_DEBUG_EXT ".exe\0" + VALUE "ProductName", "Python\0" + VALUE "ProductVersion", PYTHON_VERSION + END + END + BLOCK "VarFileInfo" + BEGIN + VALUE "Translation", 0x0, 1200 + END +END diff --git a/PC/python_nt.rc b/PC/python_nt.rc --- a/PC/python_nt.rc +++ b/PC/python_nt.rc @@ -1,33 +1,11 @@ // Resource script for Python core DLL. -// Currently only holds version information. 
-// -#include "winver.h" -#define MS_WINDOWS -#include "modsupport.h" -#include "patchlevel.h" -#ifdef _DEBUG -# include "pythonnt_rc_d.h" -#else -# include "pythonnt_rc.h" -#endif +#include "python_ver_rc.h" -/* e.g., 3.3.0a1 - * PY_VERSION comes from patchlevel.h - */ -#define PYTHON_VERSION PY_VERSION "\0" - -/* 64-bit version number as comma-separated list of 4 16-bit ints */ -#if PY_MICRO_VERSION > 64 -# error "PY_MICRO_VERSION > 64" -#endif -#if PY_RELEASE_LEVEL > 99 -# error "PY_RELEASE_LEVEL > 99" -#endif -#if PY_RELEASE_SERIAL > 9 -# error "PY_RELEASE_SERIAL > 9" -#endif -#define PYVERSION64 PY_MAJOR_VERSION, PY_MINOR_VERSION, FIELD3, PYTHON_API_VERSION +// Include the manifest file that indicates we support all +// current versions of Windows. +#include +2 RT_MANIFEST "python.manifest" // String Tables STRINGTABLE DISCARDABLE @@ -45,23 +23,23 @@ PRODUCTVERSION PYVERSION64 FILEFLAGSMASK 0x3fL #ifdef _DEBUG - FILEFLAGS 0x1L + FILEFLAGS VS_FF_DEBUG #else FILEFLAGS 0x0L #endif - FILEOS 0x40004L - FILETYPE 0x1L + FILEOS VOS__WINDOWS32 + FILETYPE VFT_DLL FILESUBTYPE 0x0L BEGIN BLOCK "StringFileInfo" BEGIN BLOCK "000004b0" BEGIN - VALUE "CompanyName", "Python Software Foundation\0" + VALUE "CompanyName", PYTHON_COMPANY "\0" VALUE "FileDescription", "Python Core\0" VALUE "FileVersion", PYTHON_VERSION VALUE "InternalName", "Python DLL\0" - VALUE "LegalCopyright", "Copyright ? 2001-2015 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC.\0" + VALUE "LegalCopyright", PYTHON_COPYRIGHT "\0" VALUE "OriginalFilename", PYTHON_DLL_NAME "\0" VALUE "ProductName", "Python\0" VALUE "ProductVersion", PYTHON_VERSION diff --git a/PC/python_ver_rc.h b/PC/python_ver_rc.h new file mode 100644 --- /dev/null +++ b/PC/python_ver_rc.h @@ -0,0 +1,35 @@ +// Resource script for Python core DLL. +// Currently only holds version information. +// +#include "winver.h" + +#define PYTHON_COMPANY "Python Software Foundation" +#define PYTHON_COPYRIGHT "Copyright ? 2001-2014 Python Software Foundation. Copyright ? 2000 BeOpen.com. Copyright ? 1995-2001 CNRI. Copyright ? 1991-1995 SMC." 
+ +#define MS_WINDOWS +#include "modsupport.h" +#include "patchlevel.h" +#ifdef _DEBUG +# include "pythonnt_rc_d.h" +# define PYTHON_DEBUG_EXT "_d" +#else +# include "pythonnt_rc.h" +# define PYTHON_DEBUG_EXT +#endif + +/* e.g., 3.3.0a1 + * PY_VERSION comes from patchlevel.h + */ +#define PYTHON_VERSION PY_VERSION "\0" + +/* 64-bit version number as comma-separated list of 4 16-bit ints */ +#if PY_MICRO_VERSION > 64 +# error "PY_MICRO_VERSION > 64" +#endif +#if PY_RELEASE_LEVEL > 99 +# error "PY_RELEASE_LEVEL > 99" +#endif +#if PY_RELEASE_SERIAL > 9 +# error "PY_RELEASE_SERIAL > 9" +#endif +#define PYVERSION64 PY_MAJOR_VERSION, PY_MINOR_VERSION, FIELD3, PYTHON_API_VERSION -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 18:18:33 2015 From: python-checkins at python.org (steve.dower) Date: Thu, 15 Jan 2015 17:18:33 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogQ2xvc2VzICMyMzE2?= =?utf-8?q?0=3A_Respect_the_environment_variable_SVNROOT_in_external-commo?= =?utf-8?q?n=2Ebat?= Message-ID: <20150115171819.11571.5114@psf.io> https://hg.python.org/cpython/rev/294501835890 changeset: 94181:294501835890 branch: 2.7 parent: 94157:25a1ce2a6f9b user: Steve Dower date: Thu Jan 15 09:15:45 2015 -0800 summary: Closes #23160: Respect the environment variable SVNROOT in external-common.bat (patch by anselm.kruis) files: Tools/buildbot/external-common.bat | 18 ++++++++++-------- 1 files changed, 10 insertions(+), 8 deletions(-) diff --git a/Tools/buildbot/external-common.bat b/Tools/buildbot/external-common.bat --- a/Tools/buildbot/external-common.bat +++ b/Tools/buildbot/external-common.bat @@ -1,6 +1,8 @@ @rem Common file shared between external.bat and external-amd64.bat. Responsible for @rem fetching external components into the root\externals directory. 
+if "%SVNROOT%"=="" set SVNROOT=http://svn.python.org/projects/external/ + if not exist externals mkdir externals cd externals @rem XXX: If you need to force the buildbots to start from a fresh environment, uncomment @@ -31,31 +33,31 @@ @rem bzip if not exist bzip2-1.0.6 ( rd /s/q bzip2-1.0.5 - svn export http://svn.python.org/projects/external/bzip2-1.0.6 + svn export %SVNROOT%bzip2-1.0.6 ) @rem Berkeley DB if exist db-4.4.20 rd /s/q db-4.4.20 -if not exist db-4.7.25.0 svn export http://svn.python.org/projects/external/db-4.7.25.0 +if not exist db-4.7.25.0 svn export %SVNROOT%db-4.7.25.0 @rem NASM, for OpenSSL build @rem if exist nasm-2.11.06 rd /s/q nasm-2.11.06 -if not exist nasm-2.11.06 svn export http://svn.python.org/projects/external/nasm-2.11.06 +if not exist nasm-2.11.06 svn export %SVNROOT%nasm-2.11.06 @rem OpenSSL if exist openssl-1.0.1i rd /s/q openssl-1.0.1i -if not exist openssl-1.0.1j svn export http://svn.python.org/projects/external/openssl-1.0.1j +if not exist openssl-1.0.1j svn export %SVNROOT%openssl-1.0.1j @rem tcl/tk if not exist tcl-8.5.15.0 ( rd /s/q tcltk tcltk64 tcl-8.5.2.1 tk-8.5.2.0 - svn export http://svn.python.org/projects/external/tcl-8.5.15.0 + svn export %SVNROOT%tcl-8.5.15.0 ) -if not exist tk-8.5.15.0 svn export http://svn.python.org/projects/external/tk-8.5.15.0 -if not exist tix-8.4.3.5 svn export http://svn.python.org/projects/external/tix-8.4.3.5 +if not exist tk-8.5.15.0 svn export %SVNROOT%tk-8.5.15.0 +if not exist tix-8.4.3.5 svn export %SVNROOT%tix-8.4.3.5 @rem sqlite3 if not exist sqlite-3.6.21 ( rd /s/q sqlite-source-3.3.4 - svn export http://svn.python.org/projects/external/sqlite-3.6.21 + svn export %SVNROOT%sqlite-3.6.21 ) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 18:18:45 2015 From: python-checkins at python.org (steve.dower) Date: Thu, 15 Jan 2015 17:18:45 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Null_merge_from_3=2E4?= Message-ID: <20150115171837.22425.5221@psf.io> https://hg.python.org/cpython/rev/c52f76c84eec changeset: 94183:c52f76c84eec parent: 94180:3f7e483cebef parent: 94182:45e2c95bb802 user: Steve Dower date: Thu Jan 15 09:17:18 2015 -0800 summary: Null merge from 3.4 files: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 18:18:45 2015 From: python-checkins at python.org (steve.dower) Date: Thu, 15 Jan 2015 17:18:45 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogQ2xvc2VzICMyMzE2?= =?utf-8?q?0=3A_Respect_the_environment_variable_SVNROOT_in_external-commo?= =?utf-8?q?n=2Ebat?= Message-ID: <20150115171837.11561.7624@psf.io> https://hg.python.org/cpython/rev/45e2c95bb802 changeset: 94182:45e2c95bb802 branch: 3.4 parent: 94177:8adf1896712d user: Steve Dower date: Thu Jan 15 09:16:38 2015 -0800 summary: Closes #23160: Respect the environment variable SVNROOT in external-common.bat (patch by anselm.kruis) files: Tools/buildbot/external-common.bat | 18 ++++++++++-------- 1 files changed, 10 insertions(+), 8 deletions(-) diff --git a/Tools/buildbot/external-common.bat b/Tools/buildbot/external-common.bat --- a/Tools/buildbot/external-common.bat +++ b/Tools/buildbot/external-common.bat @@ -1,6 +1,8 @@ @rem Common file shared between external.bat and external-amd64.bat. Responsible for @rem fetching external components into the root\.. buildbot directories. 
+if "%SVNROOT%"=="" set SVNROOT=http://svn.python.org/projects/external/ + if not exist externals mkdir externals cd externals @rem XXX: If you need to force the buildbots to start from a fresh environment, uncomment @@ -18,35 +20,35 @@ @rem bzip if not exist bzip2-1.0.6 ( rd /s/q bzip2-1.0.5 - svn export http://svn.python.org/projects/external/bzip2-1.0.6 + svn export %SVNROOT%bzip2-1.0.6 ) @rem NASM, for OpenSSL build @rem if exist nasm-2.11.06 rd /s/q nasm-2.11.06 -if not exist nasm-2.11.06 svn export http://svn.python.org/projects/external/nasm-2.11.06 +if not exist nasm-2.11.06 svn export %SVNROOT%nasm-2.11.06 @rem OpenSSL if not exist openssl-1.0.1j ( rd /s/q openssl-1.0.1i - svn export http://svn.python.org/projects/external/openssl-1.0.1j + svn export %SVNROOT%openssl-1.0.1j ) @rem tcl/tk if not exist tcl-8.6.1.0 ( rd /s/q tcltk tcltk64 tcl-8.5.11.0 tk-8.5.11.0 - svn export http://svn.python.org/projects/external/tcl-8.6.1.0 + svn export %SVNROOT%tcl-8.6.1.0 ) -if not exist tk-8.6.1.0 svn export http://svn.python.org/projects/external/tk-8.6.1.0 -if not exist tix-8.4.3.4 svn export http://svn.python.org/projects/external/tix-8.4.3.4 +if not exist tk-8.6.1.0 svn export %SVNROOT%tk-8.6.1.0 +if not exist tix-8.4.3.4 svn export %SVNROOT%tix-8.4.3.4 @rem sqlite3 if not exist sqlite-3.8.3.1 ( rd /s/q sqlite-source-3.8.1 - svn export http://svn.python.org/projects/external/sqlite-3.8.3.1 + svn export %SVNROOT%sqlite-3.8.3.1 ) @rem lzma if not exist xz-5.0.5 ( rd /s/q xz-5.0.3 - svn export http://svn.python.org/projects/external/xz-5.0.5 + svn export %SVNROOT%xz-5.0.5 ) -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 21:52:24 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 20:52:24 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E4=29=3A_Backout_change?= =?utf-8?q?set_6ab2575bc12b?= Message-ID: <20150115205216.104134.94410@psf.io> https://hg.python.org/cpython/rev/ecbde0b31f6f changeset: 94184:ecbde0b31f6f branch: 3.4 parent: 94182:45e2c95bb802 user: Victor Stinner date: Thu Jan 15 21:50:19 2015 +0100 summary: Backout changeset 6ab2575bc12b StreamWriter: close() now clears the reference to the transport StreamWriter now raises an exception if it is closed: write(), writelines(), write_eof(), can_write_eof(), get_extra_info(), drain(). 
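Illustrative aside, not taken from the changeset itself: a minimal, hedged sketch of the guard pattern that the backed-out change had introduced, shown only to make clear what behaviour the diff below removes again. The class and method names here are placeholders for this sketch, not asyncio API.

    class _SketchStreamWriter:
        """Rough stand-in for the reverted StreamWriter behaviour."""

        def __init__(self, transport):
            self._transport = transport

        def close(self):
            # The backed-out change cleared the transport reference on close()...
            if self._transport is None:
                return
            self._transport.close()
            self._transport = None

        def _check_closed(self):
            # ...and later calls raised instead of touching a dead transport.
            if self._transport is None:
                raise RuntimeError('StreamWriter is closed')

        def write(self, data):
            self._check_closed()
            self._transport.write(data)
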
files: Lib/asyncio/streams.py | 25 ++++--------------------- 1 files changed, 4 insertions(+), 21 deletions(-) diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -258,22 +258,8 @@ self._reader = reader self._loop = loop - def close(self): - if self._transport is None: - return - self._transport.close() - self._transport = None - - def _check_closed(self): - if self._transport is None: - raise RuntimeError('StreamWriter is closed') - def __repr__(self): - info = [self.__class__.__name__] - if self._transport is not None: - info.append('transport=%r' % self._transport) - else: - info.append('closed') + info = [self.__class__.__name__, 'transport=%r' % self._transport] if self._reader is not None: info.append('reader=%r' % self._reader) return '<%s>' % ' '.join(info) @@ -283,23 +269,21 @@ return self._transport def write(self, data): - self._check_closed() self._transport.write(data) def writelines(self, data): - self._check_closed() self._transport.writelines(data) def write_eof(self): - self._check_closed() return self._transport.write_eof() def can_write_eof(self): - self._check_closed() return self._transport.can_write_eof() + def close(self): + return self._transport.close() + def get_extra_info(self, name, default=None): - self._check_closed() return self._transport.get_extra_info(name, default) @coroutine @@ -311,7 +295,6 @@ w.write(data) yield from w.drain() """ - self._check_closed() if self._reader is not None: exc = self._reader.exception() if exc is not None: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 21:52:24 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 20:52:24 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150115205216.93201.649@psf.io> https://hg.python.org/cpython/rev/973173ad0fe7 changeset: 94185:973173ad0fe7 parent: 94183:c52f76c84eec parent: 94184:ecbde0b31f6f user: Victor Stinner date: Thu Jan 15 21:50:48 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/asyncio/streams.py | 25 ++++--------------------- 1 files changed, 4 insertions(+), 21 deletions(-) diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -258,22 +258,8 @@ self._reader = reader self._loop = loop - def close(self): - if self._transport is None: - return - self._transport.close() - self._transport = None - - def _check_closed(self): - if self._transport is None: - raise RuntimeError('StreamWriter is closed') - def __repr__(self): - info = [self.__class__.__name__] - if self._transport is not None: - info.append('transport=%r' % self._transport) - else: - info.append('closed') + info = [self.__class__.__name__, 'transport=%r' % self._transport] if self._reader is not None: info.append('reader=%r' % self._reader) return '<%s>' % ' '.join(info) @@ -283,23 +269,21 @@ return self._transport def write(self, data): - self._check_closed() self._transport.write(data) def writelines(self, data): - self._check_closed() self._transport.writelines(data) def write_eof(self): - self._check_closed() return self._transport.write_eof() def can_write_eof(self): - self._check_closed() return self._transport.can_write_eof() + def close(self): + return self._transport.close() + def get_extra_info(self, name, default=None): - self._check_closed() return self._transport.get_extra_info(name, default) @coroutine @@ 
-311,7 +295,6 @@ w.write(data) yield from w.drain() """ - self._check_closed() if self._reader is not None: exc = self._reader.exception() if exc is not None: -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 23:01:28 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 22:01:28 +0000 Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E4_-=3E_default?= =?utf-8?q?=29=3A_Merge_3=2E4_=28asyncio=29?= Message-ID: <20150115215845.93203.91300@psf.io> https://hg.python.org/cpython/rev/031fc0231f3d changeset: 94187:031fc0231f3d parent: 94185:973173ad0fe7 parent: 94186:992ce0dcfb29 user: Victor Stinner date: Thu Jan 15 22:53:21 2015 +0100 summary: Merge 3.4 (asyncio) files: Lib/test/test_asyncio/test_subprocess.py | 14 ++++++++++- 1 files changed, 12 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -179,6 +179,18 @@ 'sys.stdout.write("x" * %s)' % size, 'sys.stdout.flush()', )) + + connect_read_pipe = self.loop.connect_read_pipe + + @asyncio.coroutine + def connect_read_pipe_mock(*args, **kw): + transport, protocol = yield from connect_read_pipe(*args, **kw) + transport.pause_reading = mock.Mock() + transport.resume_reading = mock.Mock() + return (transport, protocol) + + self.loop.connect_read_pipe = connect_read_pipe_mock + proc = yield from asyncio.create_subprocess_exec( sys.executable, '-c', code, stdin=asyncio.subprocess.PIPE, @@ -186,8 +198,6 @@ limit=limit, loop=self.loop) stdout_transport = proc._transport.get_pipe_transport(1) - stdout_transport.pause_reading = mock.Mock() - stdout_transport.resume_reading = mock.Mock() stdout, stderr = yield from proc.communicate() -- Repository URL: https://hg.python.org/cpython From python-checkins at python.org Thu Jan 15 23:01:28 2015 From: python-checkins at python.org (victor.stinner) Date: Thu, 15 Jan 2015 22:01:28 +0000 Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy40KTogSXNzdWUgIzIyNjg1?= =?utf-8?q?=3A_Fix_test=5Fpause=5Freading=28=29_of_asyncio/test=5Fsubproce?= =?utf-8?q?ss?= Message-ID: <20150115215845.103313.99238@psf.io> https://hg.python.org/cpython/rev/992ce0dcfb29 changeset: 94186:992ce0dcfb29 branch: 3.4 parent: 94184:ecbde0b31f6f user: Victor Stinner date: Thu Jan 15 22:52:59 2015 +0100 summary: Issue #22685: Fix test_pause_reading() of asyncio/test_subprocess Override the connect_read_pipe() method of the loop to mock immediatly pause_reading() and resume_reading() methods. The test failed randomly on FreeBSD 9 buildbot and on Windows using trollius. 
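Illustrative aside, hedged: a self-contained sketch of the mocking approach described above, wrapping the loop's connect_read_pipe() so that every transport it hands back already carries mock pause_reading()/resume_reading() methods, rather than patching the transport after the subprocess has started reading. The helper name install_read_pipe_mock is invented for this sketch; the actual fix is in the diff below.

    import asyncio
    from unittest import mock

    def install_read_pipe_mock(loop):
        # Keep a reference to the real method so the wrapper can delegate to it.
        original = loop.connect_read_pipe

        @asyncio.coroutine
        def connect_read_pipe_mock(*args, **kw):
            transport, protocol = yield from original(*args, **kw)
            # Replace the flow-control methods with mocks before anyone uses them.
            transport.pause_reading = mock.Mock()
            transport.resume_reading = mock.Mock()
            return (transport, protocol)

        loop.connect_read_pipe = connect_read_pipe_mock
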
files: Lib/test/test_asyncio/test_subprocess.py | 14 ++++++++++- 1 files changed, 12 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -179,6 +179,18 @@ 'sys.stdout.write("x" * %s)' % size, 'sys.stdout.flush()', )) + + connect_read_pipe = self.loop.connect_read_pipe + + @asyncio.coroutine + def connect_read_pipe_mock(*args, **kw): + transport, protocol = yield from connect_read_pipe(*args, **kw) + transport.pause_reading = mock.Mock() + transport.resume_reading = mock.Mock() + return (transport, protocol) + + self.loop.connect_read_pipe = connect_read_pipe_mock + proc = yield from asyncio.create_subprocess_exec( sys.executable, '-c', code, stdin=asyncio.subprocess.PIPE, @@ -186,8 +198,6 @@ limit=limit, loop=self.loop) stdout_transport = proc._transport.get_pipe_transport(1) - stdout_transport.pause_reading = mock.Mock() - stdout_transport.resume_reading = mock.Mock() stdout, stderr = yield from proc.communicate() -- Repository URL: https://hg.python.org/cpython From solipsis at pitrou.net Fri Jan 16 09:05:59 2015 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 16 Jan 2015 09:05:59 +0100 Subject: [Python-checkins] Daily reference leaks (031fc0231f3d): sum=6 Message-ID: results for 031fc0231f3d on branch "default" -------------------------------------------- test_asyncio leaked [3, 0, 0] memory blocks, sum=3 test_functools leaked [0, 0, 3] memory blocks, sum=3 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogeikkXn', '-x'] From python-checkins at python.org Fri Jan 16 18:05:23 2015 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 16 Jan 2015 17:05:23 +0000 Subject: [Python-checkins] =?utf-8?q?peps=3A_First_public_draft_of_PEP_484?= Message-ID: <20150116170520.93203.12248@psf.io> https://hg.python.org/peps/rev/0dd72b08606a changeset: 5673:0dd72b08606a user: Guido van Rossum date: Fri Jan 16 09:05:19 2015 -0800 summary: First public draft of PEP 484 files: pep-0484.txt | 484 ++++++++++++++++++++++++++++++++++++++- 1 files changed, 479 insertions(+), 5 deletions(-) diff --git a/pep-0484.txt b/pep-0484.txt --- a/pep-0484.txt +++ b/pep-0484.txt @@ -3,19 +3,493 @@ Version: $Revision$ Last-Modified: $Date$ Author: Guido van Rossum , Jukka Lehtosalo , ?ukasz Langa -Discussions-To: Python-Ideas +Discussions-To: Python-Dev Status: Draft Type: Standards Track Content-Type: text/x-rst -Created: 08-Jan-2015 -Post-History: +Created: 29-Sep-2014 +Post-History: 16-Jan-2015 Resolution: + Abstract ======== -This PEP is currently a stub. The content should be copied from -https://github.com/ambv/typehinting (but omitting the literature overview, which is PEP 482) and reformatted. +This PEP introduces a standard syntax for type hints using annotations +on function definitions. + +The proposal is strongly inspired by mypy [mypy]_. + + +Rationale and Goals +=================== + +PEP 3107 added support for arbitrary annotations on parts of a function +definition. Although no meaning was assigned to annotations then, there +has always been an implicit goal to use them for type hinting, which is +listed as the first possible use case in said PEP. 
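Illustrative aside, not part of the quoted PEP text: PEP 3107 itself assigns no meaning to annotations; they are arbitrary expressions collected into the function's __annotations__ mapping, which is what leaves room for the typing semantics proposed below. A minimal, hedged example (the function here is invented for illustration)::

    def scale(value: "a number", factor: float = 2.0) -> "a number":
        # Annotations are stored by the interpreter, not enforced at runtime.
        return value * factor

    assert scale.__annotations__ == {'value': 'a number',
                                     'factor': float,
                                     'return': 'a number'}
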
+ +This PEP aims to provide a standard syntax for type annotations, opening +up Python code to easier static analysis and refactoring, potential +runtime type checking, and performance optimizations utilizing type +information. + + +Type Definition Syntax +====================== + +The syntax leverages PEP 3107-style annotations with a number of +extensions described in sections below. In its basic form, type hinting +is used by filling function annotations with classes:: + + def greeting(name: str) -> str: + return 'Hello ' + name + +This denotes that the expected type of the ``name`` argument is ``str``. +Analogically, the expected return type is ``str``. Subclasses of +a specified argument type are also accepted as valid types for that +argument. + +Abstract base classes, types available in the ``types`` module, and +user-defined classes may be used as type hints as well. Annotations +must be valid expressions that evaluate without raising exceptions at +the time the function is defined. In addition, the needs of static +analysis require that annotations must be simple enough to be +interpreted by static analysis tools. (This is an intentionally +somewhat vague requirement.) + +.. FIXME: Define rigorously what is/isn't supported. + +When used as an annotation, the expression ``None`` is considered +equivalent to ``NoneType`` (i.e., ``type(None)`` for type hinting +purposes. + +Type aliases are also valid type hints:: + + integer = int + + def retry(url: str, retry_count: integer): ... + +New names that are added to support features described in following +sections are available in the ``typing`` package. + + +Callbacks +--------- + +Frameworks expecting callback functions of specific signatures might be +type hinted using ``Callable[[Arg1Type, Arg2Type], ReturnType]``. +Examples:: + + from typing import Any, AnyArgs, Callable + + def feeder(get_next_item: Callable[[], Item]): ... + + def async_query(on_success: Callable[[int], None], on_error: Callable[[int, Exception], None]): ... + + def partial(func: Callable[AnyArgs, Any], *args): ... + +Since using callbacks with keyword arguments is not perceived as +a common use case, there is currently no support for specifying keyword +arguments with ``Callable``. + + +Generics +-------- + +Since type information about objects kept in containers cannot be +statically inferred in a generic way, abstract base classes have been +extended to support subscription to denote expected types for container +elements. Example:: + + from typing import Mapping, Set + + def notify_by_email(employees: Set[Employee], overrides: Mapping[str, str]): ... + +Generics can be parametrized by using a new factory available in +``typing`` called ``TypeVar``. Example:: + + from typing import Sequence, TypeVar + + T = TypeVar('T') # Declare type variable + + def first(l: Sequence[T]) -> T: # Generic function + return l[0] + +In this case the contract is that the returning value is consistent with +the elements held by the collection. + +``TypeVar`` supports constraining parametric types to classes with any of +the specified bases. Example:: + + from typing import Iterable + + X = TypeVar('X') + Y = TypeVar('Y', Iterable[X]) + + def filter(rule: Callable[[X], bool], input: Y) -> Y: + ... + +.. FIXME: Add an example with multiple bases defined. + +In the example above we specify that ``Y`` can be any subclass of +Iterable with elements of type ``X``, as long as the return type of +``filter()`` will be the same as the type of the ``input`` +argument. + +.. 
FIXME: Explain more about how this works. + + +Forward references +------------------ + +When a type hint contains names that have not been defined yet, that +definition may be expressed as a string, to be resolved later. For +example, instead of writing:: + + def notify_by_email(employees: Set[Employee]): ... + +one might write:: + + def notify_by_email(employees: 'Set[Employee]'): ... + +.. FIXME: Rigorously define this. Defend it, or find an alternative. + + +Union types +----------- + +Since accepting a small, limited set of expected types for a single +argument is common, there is a new special factory called ``Union``. +Example:: + + from typing import Union + + def handle_employees(e: Union[Employee, Sequence[Employee]]): + if isinstance(e, Employee): + e = [e] + ... + +A type factored by ``Union[T1, T2, ...]`` responds ``True`` to +``issubclass`` checks for ``T1`` and any of its subclasses, ``T2`` and +any of its subclasses, and so on. + +One common case of union types are *optional* types. By default, +``None`` is an invalid value for any type, unless a default value of +``None`` has been provided in the function definition. Examples:: + + def handle_employee(e: Union[Employee, None]): ... + +As a shorthand for ``Union[T1, None]`` you can write ``Optional[T1]``; +for example, the above is equivalent to:: + + from typing import Optional + + def handle_employee(e: Optional[Employee]): ... + +An optional type is also automatically assumed when the default value is +``None``, for example:: + + def handle_employee(e: Employee = None): ... + +This is equivalent to:: + + def handle_employee(e: Optional[Employee] = None): ... + +.. FIXME: Is this really a good idea? + +A special kind of union type is ``Any``, a class that responds +``True`` to ``issubclass`` of any class. This lets the user +explicitly state that there are no constraints on the type of a +specific argument or return value. + + +Platform-specific type checking +------------------------------- + +In some cases the typing information will depend on the platform that +the program is being executed on. To enable specifying those +differences, simple conditionals can be used:: + + from typing import PY2, WINDOWS + + if PY2: + text = unicode + else: + text = str + + def f() -> text: ... + + if WINDOWS: + loop = ProactorEventLoop + else: + loop = UnixSelectorEventLoop + +Arbitrary literals defined in the form of ``NAME = True`` will also be +accepted by the type checker to differentiate type resolution:: + + DEBUG = False + ... + if DEBUG: + class Tracer: + + else: + class Tracer: + + +For the purposes of